Mutual information
Machine learning
Computer science
Artificial intelligence
Pairwise comparison
Entropy (arrow of time)
Information theory
Statistical learning theory
Shannon's source coding theorem
Estimator
Covariance
Mathematics
Support vector machine
Principle of maximum entropy
Statistics
Binary entropy function
Physics
Quantum mechanics
Maximum entropy thermodynamics
Abstract
This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms for adapting linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework in which the conventional concepts of second-order statistics (covariance, L2 distances, correlation functions) are replaced by scalars and functions with information-theoretic underpinnings: entropy, mutual information, and correntropy, respectively. ITL quantifies the stochastic structure of the data beyond second-order statistics for improved performance, without resorting to full-blown Bayesian approaches that carry a much larger computational cost. This is possible because of a non-parametric estimator of Renyi's quadratic entropy that is a function only of pairwise differences between samples. The book compares the performance of ITL algorithms with their second-order counterparts in many engineering and machine learning applications. Students, practitioners, and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that provide fertile ground for future research.
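As a minimal sketch of the idea referenced in the abstract, the following Python snippet estimates Renyi's quadratic entropy with a Parzen-window (Gaussian kernel) estimator, which indeed depends only on pairwise differences between samples. The function name `renyi_quadratic_entropy` and the kernel width `sigma` are illustrative assumptions, not the book's exact code.

```python
import numpy as np

def renyi_quadratic_entropy(samples, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    A sketch of the pairwise-difference estimator: the kernel width
    `sigma` is a free smoothing parameter chosen here for illustration.
    """
    x = np.asarray(samples, dtype=float).reshape(len(samples), -1)
    n, d = x.shape
    # Pairwise differences between all samples.
    diffs = x[:, None, :] - x[None, :, :]
    sq_dists = np.sum(diffs ** 2, axis=-1)
    # Convolving two Gaussian kernels of width sigma gives width sigma*sqrt(2).
    s2 = 2.0 * sigma ** 2
    kernel = np.exp(-sq_dists / (2.0 * s2)) / ((2.0 * np.pi * s2) ** (d / 2))
    # Information potential: mean of all pairwise kernel evaluations.
    information_potential = kernel.mean()
    # Quadratic entropy is the negative log of the information potential.
    return -np.log(information_potential)

# Usage example: entropy estimate for a 1-D Gaussian sample.
rng = np.random.default_rng(0)
print(renyi_quadratic_entropy(rng.normal(size=500), sigma=0.5))
```

Because the estimate is a double sum over sample pairs, it can be computed directly from data with no distributional assumptions, which is what allows ITL cost functions to go beyond second-order statistics at moderate computational cost.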