Computer science
Cluster analysis
Artificial intelligence
Pattern recognition (psychology)
Authors
Wei Wei,Jianguo Wu,Xinyao Guo,Jing Yan,Jiye Liang
Identifier
DOI:10.1109/tpami.2025.3600256
Abstract
Existing clustering ensemble methods typically fuse all base clusterings in one shot under unsupervised settings, making it difficult to distinguish the quality of individual base clusterings and to exploit latent prior knowledge; consequently, their adaptability to data distributions and overall performance are limited. To address these issues, this paper proposes the Self-Constrained Clustering Ensemble (SCCE) algorithm. SCCE treats the pseudo-labels automatically generated from the current clustering results as self-supervised signals and performs metric learning to obtain a linear transformation that enlarges inter-class distances while compressing intra-class distances. The base clusterings are then regenerated by reclustering in the new metric space, enhancing their separability and consistency. Afterward, ensemble updating is applied iteratively, forming a self-driven closed loop that continuously improves model performance. Theoretical analysis shows that the model converges efficiently via alternating optimization, with computational complexity on the same order as mainstream methods. Experiments on public datasets demonstrate that the proposed algorithm significantly outperforms representative clustering ensemble approaches, validating its effectiveness and robustness in scenarios lacking external supervision.
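To make the self-driven loop described in the abstract concrete, the following is a minimal sketch of the idea, not the paper's implementation: it assumes KMeans base clusterings, a co-association matrix for fusing them into pseudo-labels, and LDA as a stand-in for the paper's metric-learning step; the function names scce_sketch and consensus_pseudo_labels are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def consensus_pseudo_labels(base_labels, n_clusters):
    """Fuse base clusterings into pseudo-labels via a co-association matrix."""
    n_samples = base_labels.shape[1]
    co = np.zeros((n_samples, n_samples))
    for labels in base_labels:
        co += (labels[:, None] == labels[None, :]).astype(float)
    co /= base_labels.shape[0]
    # Cluster the co-association rows to obtain consensus pseudo-labels.
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(co)


def scce_sketch(X, n_clusters, n_base=10, n_iters=5, seed=0):
    """Self-driven loop: base clusterings -> pseudo-labels -> linear transform -> recluster."""
    rng = np.random.RandomState(seed)
    Z = X.copy()
    pseudo = None
    for _ in range(n_iters):
        # 1) Generate base clusterings in the current (possibly transformed) space.
        base = np.stack([
            KMeans(n_clusters=n_clusters, n_init=5,
                   random_state=rng.randint(0, 10**6)).fit_predict(Z)
            for _ in range(n_base)
        ])
        # 2) Fuse them into pseudo-labels that act as the self-supervised signal.
        pseudo = consensus_pseudo_labels(base, n_clusters)
        # 3) Learn a linear transformation that enlarges inter-class distances and
        #    compresses intra-class distances (LDA here stands in for the paper's
        #    metric-learning step).
        n_components = min(len(np.unique(pseudo)) - 1, X.shape[1])
        if n_components < 1:
            break  # pseudo-labels collapsed to a single cluster; stop iterating
        lda = LinearDiscriminantAnalysis(n_components=n_components)
        Z = lda.fit_transform(X, pseudo)
    return pseudo


if __name__ == "__main__":
    # Toy usage: three Gaussian blobs in four dimensions.
    from sklearn.datasets import make_blobs
    X, _ = make_blobs(n_samples=300, centers=3, n_features=4, random_state=0)
    labels = scce_sketch(X, n_clusters=3)
    print(labels[:10])
```

The alternating structure mirrors the abstract: the ensemble result constrains the metric, and the new metric in turn improves the ensemble, with no external supervision entering the loop.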