Conjugate gradient method
Support vector machine
Least squares support vector machine
Computer science
Quadratic programming
Artificial neural network
Statistical learning theory
Relevance vector machine
Least squares function approximation
Benchmark (surveying)
Artificial intelligence
Sequential minimal optimization
Algorithm
Proportion (ratio)
Pattern recognition (psychology)
Mathematics
Mathematical optimization
Quantum mechanics
Geodesy
Statistics
Physics
Estimator
Geography
Authors
Johan A. K. Suykens, Luděk Lukáš, Paul Van Dooren, Joos Vandewalle
Abstract
Support vector machines (SVMs) have been introduced in the literature as a method for pattern recognition and function estimation within the framework of statistical learning theory and structural risk minimization. A least squares version (LS-SVM) has recently been reported which expresses the training in terms of solving a set of linear equations instead of the quadratic programming required in the standard SVM case. In this paper we present an iterative training algorithm for LS-SVMs which is based on a conjugate gradient method. This enables solving large-scale classification problems, which is illustrated on a multi two-spiral benchmark problem.
Keywords: support vector machines, classification, neural networks, RBF kernels, conjugate gradient method.
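The abstract's central point is that LS-SVM training reduces to a set of linear equations, which an iterative solver such as conjugate gradient (CG) can handle at scale. The sketch below illustrates this idea, not the paper's exact scheme: the indefinite LS-SVM KKT system is handled here via one common reduction to two symmetric positive definite solves with A = K + I/γ (CG then recovers the bias b and the coefficients α). The RBF kernel, the hyperparameters `gamma` and `sigma`, and the toy XOR data are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, rhs, tol=1e-10, max_iter=1000):
    """Plain CG for a symmetric positive definite matrix A (illustrative)."""
    x = np.zeros_like(rhs)
    r = rhs - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM classifier by solving its linear system with CG.

    The block system with bias b is reduced to two SPD solves:
    A s = 1 and A v = y with A = K + I/gamma, then
    b = (1^T v) / (1^T s) and alpha = v - b * s.
    """
    n = len(y)
    A = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    ones = np.ones(n)
    s = conjugate_gradient(A, ones)
    v = conjugate_gradient(A, y.astype(float))
    b = (ones @ v) / (ones @ s)
    alpha = v - b * s
    return alpha, b

def lssvm_predict(X_train, alpha, b, X, sigma=1.0):
    """Classify points by the sign of the kernel expansion plus bias."""
    return np.sign(rbf_kernel(X, X_train, sigma) @ alpha + b)

# Toy usage on XOR-style data (not linearly separable, so the RBF kernel matters).
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([1, 1, -1, -1])
alpha, b = lssvm_train(X, y, gamma=10.0, sigma=1.0)
preds = lssvm_predict(X, alpha, b, X, sigma=1.0)
```

Because the matrix A only enters CG through matrix-vector products, the same loop works with an implicit (matrix-free) kernel operator, which is what makes the approach attractive for the large-scale problems the abstract mentions.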