Support vector machine
Computer science
Hinge loss
Computational complexity theory
Classifier (UML)
Outlier
Computation
Convergence (economics)
Artificial intelligence
Scale (ratio)
Construct (python library)
Machine learning
Algorithm
Pattern recognition (psychology)
Data mining
Mathematical optimization
Mathematics
Physics
Quantum mechanics
Economics
Programming language
Economic growth
Authors
Huajun Wang, Genghui Li, Zhenkun Wang
Identifier
DOI:10.1016/j.ins.2023.119136
Abstract
Support vector machines (SVM), as one of the most effective and popular classification tools, have been widely applied in various fields. However, they may incur prohibitive computational costs when solving large-scale classification problems. To address this problem, we construct a new fast SVM with a truncated squared hinge loss (dubbed Lts-SVM). We begin by developing an optimality theory for the nonconvex and nonsmooth Lts-SVM, which makes it convenient to investigate its support vectors and working set. Based on this, we propose a new and effective globally convergent algorithm for solving the Lts-SVM. This method enjoys a remarkably low computational complexity, which makes it possible to substantially reduce the demands of extremely large-scale computation. Numerical comparisons with eight other solvers show that the proposed algorithm achieves excellent performance on large-scale classification problems, with shorter computational times, higher accuracy, fewer support vectors, and greater robustness to outliers.
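The abstract does not spell out the exact form of the truncated squared hinge loss or the Lts-SVM objective. A minimal sketch, assuming the common formulation in which the squared hinge max(0, 1 - y·f(x))^2 is capped at a truncation level tau (the parameter names tau and C below are hypothetical, not taken from the paper), might look like this:

```python
import numpy as np

def truncated_squared_hinge(margins, tau=1.0):
    """Element-wise loss for margins m = y * f(x).

    Assumed form: min(max(0, 1 - m)^2, tau). Capping at tau bounds the
    contribution of badly misclassified points, which is what gives
    robustness to outliers.
    """
    sq_hinge = np.maximum(0.0, 1.0 - margins) ** 2
    return np.minimum(sq_hinge, tau)

def lts_svm_objective(w, X, y, C=1.0, tau=1.0):
    """Regularized empirical risk: 0.5*||w||^2 + C * sum of truncated losses."""
    margins = y * (X @ w)
    return 0.5 * np.dot(w, w) + C * np.sum(truncated_squared_hinge(margins, tau))
```

Under this assumed form, samples whose squared hinge loss exceeds tau contribute only a constant to the objective, so far-off outliers stop influencing the solution and only the remaining points need to be treated as support vectors or kept in the working set.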