Generalization
Margin (machine learning)
Convergence (economics)
Generalization error
Computer science
Mixing (physics)
Metric (data warehouse)
Kernel (algebra)
Sample complexity
Mathematics
Computational complexity theory
Class (philosophy)
Sample (material)
Algorithm
Mathematical optimization
Discrete mathematics
Artificial intelligence
Machine learning
Data mining
Artificial neural network
Mathematical analysis
Economics
Physics
Chemistry
Quantum mechanics
Economic growth
Chromatography
Authors
Mehryar Mohri, Afshin Rostamizadeh
Source
Journal: Neural Information Processing Systems
Date: 2008-12-08
Volume/Issue: 21: 1097-1104
Citations: 66
Abstract
This paper presents the first Rademacher complexity-based error bounds for non-i.i.d. settings, a generalization of similar existing bounds derived for the i.i.d. case. Our bounds hold in the scenario of dependent samples generated by a stationary β-mixing process, which is commonly adopted in many previous studies of non-i.i.d. settings. They benefit from the crucial advantages of Rademacher complexity over other measures of the complexity of hypothesis classes. In particular, they are data-dependent and measure the complexity of a class of hypotheses based on the training sample. The empirical Rademacher complexity can be estimated from such finite samples and lead to tighter generalization bounds. We also present the first margin bounds for kernel-based classification in this non-i.i.d. setting and briefly study their convergence.
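For reference, the data-dependent measure discussed in the abstract is the empirical Rademacher complexity; the following is the standard textbook definition, not a formula quoted from this paper. Given a fixed training sample S = (z_1, ..., z_m) and a class G of real-valued functions,

\hat{\mathfrak{R}}_S(G) = \mathbb{E}_{\boldsymbol{\sigma}}\left[\sup_{g \in G} \frac{1}{m} \sum_{i=1}^{m} \sigma_i \, g(z_i)\right],

where the \sigma_i are independent Rademacher variables, each uniformly distributed over \{-1, +1\}. Since the expectation is taken only over \boldsymbol{\sigma} with the sample held fixed, this quantity can be estimated directly from the finite training sample (for example by Monte Carlo draws of \boldsymbol{\sigma}), which is what makes the resulting generalization bounds data-dependent.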