Autoencoder
Cluster analysis
Computer science
Artificial intelligence
Deep learning
Encoder
Feature (linguistics)
Feature learning
Machine learning
Data mining
Pattern recognition (psychology)
Linguistics
Operating system
Philosophy
Authors
Xifeng Guo, Xinwang Liu, En Zhu, Xiaoyan Zhu, Miaomiao Li, Xin Xu, Jianping Yin
Source
Journal: IEEE Transactions on Knowledge and Data Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2019-01-01
Volume/Issue: 1-1
Citations: 41
Identifier
DOI: 10.1109/tkde.2019.2911833
Abstract
Deep clustering achieves superior performance over conventional clustering by jointly performing feature learning and cluster assignment. Although numerous deep clustering algorithms have emerged in various applications, most of them fail to learn robust cluster-oriented features, which in turn hurts the final clustering performance. To solve this problem, we propose a two-stage deep clustering algorithm that incorporates data augmentation and self-paced learning. Specifically, in the first stage, we learn robust features by training an autoencoder on examples generated by randomly shifting and rotating the given clean examples. Then, in the second stage, we encourage the learned features to be cluster-oriented by alternately fine-tuning the encoder with the augmented examples and updating the cluster assignments of the clean examples. While fine-tuning the encoder, the target of each augmented example in the loss function is the center of the cluster to which the corresponding clean example is assigned. These targets may be computed incorrectly, and examples with incorrect targets could mislead the encoder network. To stabilize network training, we select the most confident examples in each iteration via adaptive self-paced learning. Extensive experiments validate that our algorithm outperforms state-of-the-art methods on four image datasets.
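The second stage described in the abstract combines two steps: assigning each clean example to its nearest cluster center (which defines the target for its augmented copies) and keeping only the most confident examples via self-paced learning. The following is a minimal NumPy sketch of those two steps on toy 2-D features; the function names, the toy data, and the fixed selection fraction `frac` are illustrative assumptions, not the paper's actual implementation (which operates on learned encoder features and adapts the threshold over iterations).

```python
import numpy as np

def assign_clusters(features, centers):
    """Assign each feature vector to the nearest center.

    Returns the assignment indices and the per-example distance
    (used here as a proxy for the per-example loss)."""
    dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
    return dists.argmin(axis=1), dists.min(axis=1)

def self_paced_select(losses, frac):
    """Keep the `frac` most confident (lowest-loss) examples.

    The paper adapts this threshold over iterations; a fixed
    fraction is used here purely for illustration."""
    k = max(1, int(len(losses) * frac))
    return np.argsort(losses)[:k]

# Toy demo: two well-separated Gaussian blobs standing in for
# encoder features, with one center per blob.
rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal(0.0, 0.3, size=(20, 2)),
    rng.normal(3.0, 0.3, size=(20, 2)),
])
centers = np.array([[0.0, 0.0], [3.0, 3.0]])

labels, losses = assign_clusters(features, centers)
confident = self_paced_select(losses, frac=0.5)  # indices of the 20 most confident examples
```

In the actual algorithm, the selected confident examples would then drive the encoder fine-tuning, with each augmented example regressed toward `centers[labels[i]]` of its clean counterpart.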