Keywords
Maximum entropy, Independent component analysis, Negentropy, Algorithm, Stability (learning theory), Parameterized complexity, Simplicity (philosophy), Models of neural computation, Artificial intelligence, Pattern recognition (psychology), Blind signal separation, Artificial neural network, Computer science, Mathematics, Channel (broadcasting), Machine learning, Epistemology, Computer network, Philosophy
Authors
Te-Won Lee, Mark Girolami, Terrence J. Sejnowski
Identifiers
DOI: 10.1162/089976699300016719
Abstract
An extension of the infomax algorithm of Bell and Sejnowski (1995) is presented that is able blindly to separate mixed signals with sub- and supergaussian source distributions. This was achieved by using a simple type of learning rule first derived by Girolami (1997) by choosing negentropy as a projection pursuit index. Parameterized probability distributions that have sub- and supergaussian regimes were used to derive a general learning rule that preserves the simple architecture proposed by Bell and Sejnowski (1995), is optimized using the natural gradient by Amari (1998), and uses the stability analysis of Cardoso and Laheld (1996) to switch between sub- and supergaussian regimes. We demonstrate that the extended infomax algorithm is able to separate 20 sources with a variety of source distributions easily. Applied to high-dimensional data from electroencephalographic recordings, it is effective at separating artifacts such as eye blinks and line noise from weaker electrical signals that arise from sources in the brain.
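The learning rule summarized in the abstract can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the sources, mixing matrix, learning rate, batch size, and whitening step are all assumptions chosen for a small demo. The update is the natural-gradient extended infomax rule, ΔW ∝ [I − K tanh(u)uᵀ − uuᵀ]W, where K is a diagonal sign matrix that switches each unit between supergaussian (+1) and subgaussian (−1) regimes based on the stability criterion.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 3, 5000

# Synthetic sources: one subgaussian (uniform), two supergaussian (Laplacian).
s = np.vstack([
    rng.uniform(-1.0, 1.0, T),
    rng.laplace(0.0, 1.0, T),
    rng.laplace(0.0, 1.0, T),
])
A = rng.normal(size=(n, n))      # unknown mixing matrix
x = A @ s                        # observed mixtures

# Center and whiten the mixtures (common preprocessing, assumed here).
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
x = E @ np.diag(d ** -0.5) @ E.T @ x

W = np.eye(n)
lr, batch = 0.01, 100            # hypothetical hyperparameters
for epoch in range(100):
    for t0 in range(0, T, batch):
        u = W @ x[:, t0:t0 + batch]
        m = u.shape[1]
        # Switching moments from the stability analysis:
        # k_i = sign(E[sech^2(u_i)] E[u_i^2] - E[u_i tanh(u_i)])
        k = np.sign(np.mean(1.0 / np.cosh(u) ** 2, axis=1) * np.mean(u ** 2, axis=1)
                    - np.mean(np.tanh(u) * u, axis=1))
        # Natural-gradient extended infomax update:
        # dW = [I - K tanh(u) u^T - u u^T] W
        W += lr * (np.eye(n)
                   - (k[:, None] * np.tanh(u)) @ u.T / m
                   - u @ u.T / m) @ W

y = W @ x    # recovered sources, up to permutation and scale
```

After convergence each row of `y` should correlate strongly with exactly one true source, with the sign switch letting a single rule handle the uniform (subgaussian) and Laplacian (supergaussian) sources together.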