Structural risk minimization
Support vector machine
Statistical learning theory
Relevance vector machine
Margin classifier
Structured support vector machine
Artificial intelligence
Kernel method
Computer science
Quadratic programming
Feature vector
Machine learning
Quadratic classifier
Empirical risk minimization
Margin (machine learning)
Polynomial kernel
Binary classification
Radial basis function kernel
Linear classifier
Pattern recognition (psychology)
Mathematical optimization
Mathematics
Identifier
DOI: 10.1109/icaica54878.2022.9844516
Abstract
The support vector method was proposed by V. Vapnik in 1965, when he was trying to solve problems in pattern recognition. In 1971, Kimeldorf proposed a method of constructing a kernel space based on support vectors. In the 1990s, V. Vapnik formally introduced the Support Vector Machine (SVM) method within statistical learning theory. Since then, SVM has been widely applied in pattern recognition, natural language processing, and other fields. Informally, SVM is a binary classifier. The model is the linear classifier with the optimal margin in the feature space, so the learning strategy is to maximize the margin, which can be transformed into a convex quadratic programming problem. It uses the principle of structural risk minimization instead of empirical risk minimization, which allows it to fit small data samples well. The kernel trick is used to transform a non-linearly separable sample space into a linearly separable one, reducing the complexity of the algorithm. Even so, SVM still has broad prospects for development.
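For concreteness, the margin-maximization problem the abstract refers to can be stated in its standard textbook soft-margin form (a well-known formulation, not an equation reproduced from the paper itself). Given training pairs $(x_i, y_i)$ with labels $y_i \in \{-1, +1\}$, the convex quadratic program is

\[
\min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_i
\qquad \text{s.t.}\qquad y_i\left(w^{\top}x_i + b\right) \ge 1 - \xi_i,\quad \xi_i \ge 0,
\]

and the kernel trick enters through its dual,

\[
\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i - \frac{1}{2}\sum_{i,j=1}^{n}\alpha_i\alpha_j\, y_i y_j\, K(x_i, x_j)
\qquad \text{s.t.}\qquad \sum_{i=1}^{n}\alpha_i y_i = 0,\quad 0 \le \alpha_i \le C,
\]

where $K$ is a kernel function such as the polynomial or radial basis function kernel listed among the concepts above. Replacing inner products with $K$ yields a non-linear decision boundary in the input space while the optimization remains that of a linear classifier in the induced feature space.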