Linear subspace
Probabilistic logic
Computer science
Embedding
Cluster analysis
Solver
Similarity (geometry)
Feature (linguistics)
Artificial intelligence
Subspace topology
Convergence (economics)
Algorithm
Pattern recognition (psychology)
Mathematics
Image (mathematics)
Linguistics
Philosophy
Geometry
Economics
Programming language
Economic growth
Authors
Danyang Wu, Xia Dong, Jianfu Cao, Rong Wang, Feiping Nie, Xuelong Li
Identifier
DOI: 10.1109/tnnls.2022.3217032
Abstract
Existing multiview clustering models learn a consistent low-dimensional embedding either from multiple feature matrices or from multiple similarity matrices, which ignores the interaction between the two procedures and limits the improvement of clustering performance on multiview data. To address this issue, a bidirectional probabilistic subspaces approximation (BPSA) model is developed in this article to learn a consistently orthogonal embedding from multiple feature matrices and multiple similarity matrices simultaneously via disturbed probabilistic subspace modeling and approximation. A skillful bidirectional fusion strategy is designed to guarantee the parameter-free property of the BPSA model, and two adaptively weighted learning mechanisms are introduced to account for the inconsistencies among multiple views and between the bidirectional learning processes. To solve the optimization problem involved in the BPSA model, an iterative solver is derived, and a rigorous convergence guarantee is provided. Extensive experimental results on both toy and real-world datasets demonstrate that the BPSA model achieves state-of-the-art performance even though it is parameter-free.
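The abstract does not spell out the optimization details of BPSA, but the general recipe it sketches (fusing graphs derived from feature matrices with given similarity matrices, extracting a consistent orthogonal embedding, and adaptively reweighting views) can be illustrated with a simplified spectral-clustering-style example. The snippet below is a hypothetical sketch under those assumptions, not the authors' BPSA algorithm: the function name `fused_spectral_embedding`, the RBF graph construction, and the trace-based reweighting heuristic are all choices made purely for demonstration.

```python
# Hypothetical, simplified illustration -- NOT the authors' BPSA algorithm.
# It fuses similarity graphs built from multiple feature views with precomputed
# similarity matrices, using adaptive view weights, and extracts a consistent
# orthogonal embedding via spectral decomposition of the fused graph Laplacian.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel


def _normalized_laplacian(S):
    """Symmetrically normalized graph Laplacian of a similarity matrix S."""
    d = S.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    return np.eye(S.shape[0]) - D_inv_sqrt @ S @ D_inv_sqrt


def fused_spectral_embedding(feature_views, similarity_views, n_clusters, n_iter=10):
    """feature_views: list of (n, d_v) arrays; similarity_views: list of (n, n) arrays."""
    n = feature_views[0].shape[0]
    # Both "directions" contribute graphs: ones built from feature matrices
    # and the precomputed similarity matrices themselves.
    graphs = [rbf_kernel(X) for X in feature_views] + list(similarity_views)
    weights = np.ones(len(graphs)) / len(graphs)  # uniform initial view weights

    for _ in range(n_iter):
        # Weighted fusion of all graphs into a single similarity matrix.
        S_fused = sum(w * G for w, G in zip(weights, graphs))
        L = _normalized_laplacian(S_fused)
        # Orthogonal embedding: eigenvectors for the smallest eigenvalues of L.
        _, eigvecs = np.linalg.eigh(L)
        F = eigvecs[:, :n_clusters]  # columns are orthonormal
        # Adaptive reweighting heuristic: views whose graph agrees better with
        # the current embedding (smaller trace of F^T L_v F) get larger weights.
        errors = np.array([np.trace(F.T @ _normalized_laplacian(G) @ F) for G in graphs])
        inv = 1.0 / (errors + 1e-12)
        weights = inv / inv.sum()

    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(F)
    return F, labels, weights
```

As a usage sketch, calling `fused_spectral_embedding([X1, X2], [S1, S2], n_clusters=3)` on two feature views and two similarity views would return the orthogonal embedding, cluster labels, and the learned view weights; the actual BPSA model differs in its probabilistic subspace modeling, bidirectional fusion, and convergence analysis described in the paper.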