Canonical correlation
Sigmoid function
Computer science
Artificial intelligence
Nonparametric statistics
Nonlinear system
Correlation
Kernel (algebra)
Parametric statistics
Pattern recognition (psychology)
Artificial neural network
Algorithm
Machine learning
Mathematics
Statistics
Discrete mathematics
Physics
Geometry
Quantum mechanics
Authors
Galen Andrew, Raman Arora, Jeff Bilmes, Karen Livescu
Source
Venue: International Conference on Machine Learning
Date: 2013-06-16
Pages: 1247-1255
Citations: 846
Abstract
We introduce Deep Canonical Correlation Analysis (DCCA), a method to learn complex nonlinear transformations of two views of data such that the resulting representations are highly linearly correlated. Parameters of both transformations are jointly learned to maximize the (regularized) total correlation. It can be viewed as a nonlinear extension of the linear method canonical correlation analysis (CCA). It is an alternative to the nonparametric method kernel canonical correlation analysis (KCCA) for learning correlated nonlinear transformations. Unlike KCCA, DCCA does not require an inner product, and has the advantages of a parametric method: training time scales well with data size and the training data need not be referenced when computing the representations of unseen instances. In experiments on two real-world datasets, we find that DCCA learns representations with significantly higher correlation than those learned by CCA and KCCA. We also introduce a novel non-saturating sigmoid function based on the cube root that may be useful more generally in feedforward neural networks.
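The abstract's closing claim about a non-saturating sigmoid based on the cube root can be sketched numerically. The snippet below is a minimal illustration, assuming the activation is the inverse of g(y) = y³/3 + y (a reading of the paper's construction; the exact form is defined there, not in this abstract) and computing that inverse by Newton's method. Unlike the logistic sigmoid, this function is unbounded (it grows like the cube root of 3x), so its outputs do not saturate.

```python
import numpy as np

def cube_root_sigmoid(x, iters=30):
    """Hypothetical sketch of a non-saturating, cube-root-based sigmoid:
    the inverse of g(y) = y**3 / 3 + y, found by Newton's method.
    g is strictly increasing (g'(y) = y**2 + 1 >= 1), so the inverse
    is well defined for all real x."""
    x = np.asarray(x, dtype=float)
    y = np.cbrt(x)  # reasonable start: for large |x|, the inverse ~ cbrt(3x)
    for _ in range(iters):
        residual = y**3 / 3.0 + y - x      # g(y) - x
        slope = y**2 + 1.0                 # g'(y), never zero
        y = y - residual / slope           # Newton update
    return y

# Sanity check: g(1) = 1/3 + 1 = 4/3, so the inverse at 4/3 is 1.
print(cube_root_sigmoid([4.0 / 3.0, 0.0, -4.0 / 3.0]))
```

Since g'(y) = y² + 1 is bounded below by 1, the inverse has derivative at most 1 everywhere, yet its range is all of ℝ, which is the non-saturating property the abstract highlights for use in feedforward networks.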