Computer science
Artificial intelligence
Natural language processing
Semantic similarity
Deep learning
Residual
Cluster analysis
Similarity (geometry)
Representation
Machine learning
Discriminative model
Feature learning
Pattern recognition
Algorithm
Image (mathematics)
Authors
Zhiwen Xiao, Huanlai Xing, Bowen Zhao, Rong Qu, Shouxi Luo, Penglin Dai, Ke Li, Zonghai Zhu
Source
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
[Institute of Electrical and Electronics Engineers]
Date: 2024-02-01
Volume/Issue: 8 (1): 3-15
Citations: 17
Identifier
DOI: 10.1109/tetci.2023.3304948
Abstract
Recently, contrastive learning (CL) has emerged as a promising way of learning discriminative representations from time series data. In the representation hierarchy, the semantic information extracted at lower levels is the basis of that captured at higher levels. Low-level semantic information is essential and should be considered in the CL process. However, existing CL algorithms mainly focus on the similarity of high-level semantic information; considering the similarity of low-level semantic information as well may improve the performance of CL. To this end, we present deep contrastive representation learning with self-distillation (DCRLS) for the time series domain. DCRLS gracefully combines data augmentation, deep contrastive learning, and self-distillation. Our data augmentation provides different views of the same sample as the input of DCRLS. Unlike most CL algorithms, which concentrate on high-level semantic information only, our deep contrastive learning also considers the contrast similarity of low-level semantic information between peer residual blocks. Our self-distillation promotes knowledge flow from high-level to low-level blocks to help regularize DCRLS during knowledge transfer. The experimental results demonstrate that DCRLS-based structures achieve excellent classification and clustering performance on 36 UCR2018 datasets.
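The abstract describes three cooperating pieces: augmented views of each series, contrastive terms computed between peer residual blocks of the two views (not only at the top of the network), and a self-distillation term that pushes high-level knowledge down to lower blocks. The sketch below illustrates how such a combination could be wired up in PyTorch, assuming an NT-Xent contrastive loss, mean-pooled per-block projection heads, an MSE distillation term, and jitter as a stand-in augmentation; all names, architectural details, and loss weights are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of the DCRLS idea; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """Standard NT-Xent contrastive loss between two batches of views
    z1, z2 of shape (B, D); an assumed choice of contrastive objective."""
    B = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, D)
    sim = z @ z.t() / temperature                        # pairwise similarities
    sim.fill_diagonal_(float('-inf'))                    # mask self-similarity
    targets = torch.cat([torch.arange(B, 2 * B),         # positive of i is i+B
                         torch.arange(0, B)]).to(z.device)
    return F.cross_entropy(sim, targets)

class ResidualBlock(nn.Module):
    """Simple 1-D residual block for time series; a stand-in for the
    paper's blocks, whose exact design the abstract does not give."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.conv1 = nn.Conv1d(c_in, c_out, 3, padding=1)
        self.conv2 = nn.Conv1d(c_out, c_out, 3, padding=1)
        self.skip = nn.Conv1d(c_in, c_out, 1)
    def forward(self, x):
        h = F.relu(self.conv1(x))
        h = self.conv2(h)
        return F.relu(h + self.skip(x))

class DCRLSSketch(nn.Module):
    """Encoder with a projection head per block, so contrastive losses can
    be computed at every level of the representation hierarchy."""
    def __init__(self, channels=(1, 32, 64, 128), proj_dim=64):
        super().__init__()
        n = len(channels) - 1
        self.blocks = nn.ModuleList(
            ResidualBlock(channels[i], channels[i + 1]) for i in range(n))
        self.heads = nn.ModuleList(
            nn.Linear(channels[i + 1], proj_dim) for i in range(n))
    def forward(self, x):
        zs = []
        for block, head in zip(self.blocks, self.heads):
            x = block(x)
            zs.append(head(x.mean(dim=-1)))  # global average pool, then project
        return zs                            # one embedding per block level

def dcrls_loss(model, view1, view2, alpha=1.0, beta=0.1):
    """Contrast peer blocks across the two views, then distill the top
    block's embedding into the lower blocks (stop-gradient on the teacher).
    The MSE distillation term and the alpha/beta weights are assumptions."""
    zs1, zs2 = model(view1), model(view2)
    # (1) peer-block contrast: level i of view1 vs. level i of view2
    contrast = sum(nt_xent(z1, z2) for z1, z2 in zip(zs1, zs2))
    # (2) self-distillation: high-level knowledge flows to low-level blocks
    teacher = zs1[-1].detach()
    distill = sum(F.mse_loss(z, teacher) for z in zs1[:-1])
    return alpha * contrast + beta * distill

if __name__ == "__main__":
    # Hypothetical usage: two jittered views of the same batch of series.
    x = torch.randn(16, 1, 128)              # (batch, channels, length)
    view1 = x + 0.1 * torch.randn_like(x)
    view2 = x + 0.1 * torch.randn_like(x)
    model = DCRLSSketch()
    loss = dcrls_loss(model, view1, view2)
    loss.backward()
    print(float(loss))
```

The key design point the abstract emphasizes is that the contrastive term is summed over peer blocks rather than applied only to the final embedding, while the distillation term runs in the opposite direction, from the top block down; the specific augmentations, losses, and weights used in the paper are not stated in the abstract, so the choices above should be read as placeholders.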