Pairwise comparison
Computer science
Classifier (UML)
Leverage (statistics)
Machine learning
Hinge loss
Artificial intelligence
Pattern recognition (psychology)
Data mining
Support vector machine
Identifier
DOI:10.1109/tnnls.2023.3290540
Abstract
Multiview learning (MVL) addresses problems in which each instance is represented by multiple different feature sets. Efficiently exploring and exploiting the common and complementary information among different views remains challenging in MVL. However, many existing algorithms handle multiview problems via pairwise strategies, which limit the exploration of relationships among different views and dramatically increase the computational cost. In this article, we propose a multiview structural large margin classifier (MvSLMC) that simultaneously satisfies the consensus and complementarity principles in all views. Specifically, on the one hand, MvSLMC employs a structural regularization term to promote within-class cohesion and between-class separability in each view. On the other hand, different views provide extra structural information to each other, which encourages classifier diversity. Moreover, the hinge loss in MvSLMC induces sample sparsity, which we leverage to construct a safe screening rule (SSR) to accelerate MvSLMC. To the best of our knowledge, this is the first attempt at safe screening in MVL. Numerical experimental results demonstrate the effectiveness of MvSLMC and its safe acceleration method.
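The abstract only sketches the model, so the snippet below is a minimal, self-contained illustration (not the paper's MvSLMC) of how the hinge loss max(0, 1 - u) yields sample sparsity in a multiview setting: one linear classifier is trained per view with a hinge loss on their averaged score, and samples whose margin already exceeds 1 incur zero loss, which is the kind of sample a safe screening rule can discard. The toy data, the per-view weights w1 and w2, and the score-averaging consensus scheme are all illustrative assumptions.

```python
import numpy as np

# Toy two-view data: 100 samples with labels in {-1, +1}.
# X1 and X2 are two different feature representations ("views") of the same samples.
rng = np.random.default_rng(0)
n = 100
y = np.where(rng.random(n) > 0.5, 1.0, -1.0)
X1 = y[:, None] * 1.0 + rng.normal(scale=1.0, size=(n, 5))   # view 1: 5 features
X2 = y[:, None] * 0.5 + rng.normal(scale=1.0, size=(n, 8))   # view 2: 8 features

# One linear classifier per view; the combined score is their average
# (a simple consensus heuristic, NOT the paper's formulation).
w1 = np.zeros(X1.shape[1])
w2 = np.zeros(X2.shape[1])
lr, lam = 0.1, 0.01   # step size and L2 regularization strength

# Subgradient descent on: mean_i max(0, 1 - y_i * f_i) + (lam/2)(||w1||^2 + ||w2||^2),
# with f_i = 0.5 * (x1_i . w1 + x2_i . w2).
for _ in range(200):
    margin = y * 0.5 * (X1 @ w1 + X2 @ w2)
    active = (margin < 1.0).astype(float)   # only samples with nonzero hinge loss contribute
    g1 = -0.5 * (X1 * (active * y)[:, None]).mean(axis=0) + lam * w1
    g2 = -0.5 * (X2 * (active * y)[:, None]).mean(axis=0) + lam * w2
    w1 -= lr * g1
    w2 -= lr * g2

margin = y * 0.5 * (X1 @ w1 + X2 @ w2)
print("training accuracy:", (margin > 0).mean())
# Samples with margin >= 1 have exactly zero hinge loss; a safe screening rule
# aims to identify such samples cheaply and exclude them from the optimization.
print("zero-loss samples (screening candidates):", int((margin >= 1.0).sum()))
```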