Computer science
Timestamp
Series (stratigraphy)
Artificial intelligence
Coding (set theory)
Transfer learning
Contrast (vision)
Natural language processing
Machine learning
Pattern recognition (psychology)
Computer security
Set (abstract data type)
Biology
Paleontology
Programming language
Authors
Seunghan Lee, Taeyoung Park, Kibok Lee
Source
Journal: Cornell University - arXiv
Date: 2023-12-27
Citations: 3
Identifier
DOI: 10.48550/arxiv.2312.16424
Abstract
Contrastive learning has been shown to be effective for learning representations from time series in a self-supervised way. However, contrasting similar time series instances, or values from adjacent timestamps within a time series, ignores their inherent correlations, which deteriorates the quality of the learned representations. To address this issue, we propose SoftCLT, a simple yet effective soft contrastive learning strategy for time series. This is achieved by introducing instance-wise and temporal contrastive losses with soft assignments ranging from zero to one. Specifically, we define soft assignments for 1) the instance-wise contrastive loss by the distance between time series in the data space, and 2) the temporal contrastive loss by the difference of timestamps. SoftCLT is a plug-and-play method for time series contrastive learning that improves the quality of learned representations without bells and whistles. In experiments, we demonstrate that SoftCLT consistently improves performance in various downstream tasks including classification, semi-supervised learning, transfer learning, and anomaly detection, showing state-of-the-art performance. Code is available at this repository: https://github.com/seunghan96/softclt.
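The abstract describes soft assignments in [0, 1] derived from data-space distances (instance-wise) and timestamp gaps (temporal), which replace the usual hard 0/1 contrastive labels. The sketch below is a minimal illustration of that idea, assuming a scaled sigmoid over the distance or gap and a cross-entropy-style loss with soft targets; the function names, the `tau` scale parameters, and the exact loss form are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import numpy as np

def instance_soft_assignment(x_i, x_j, tau_inst=0.5):
    # Soft assignment in (0, 1] for a pair of time series instances.
    # Assumption: the data-space distance is passed through a scaled sigmoid,
    # so identical series map to 1 and distant series decay toward 0.
    dist = np.linalg.norm(x_i - x_j)                  # distance in the data space
    return 2.0 / (1.0 + np.exp(tau_inst * dist))      # 2 * sigmoid(-tau_inst * dist)

def temporal_soft_assignment(t, t_prime, tau_temp=0.5):
    # Soft assignment in (0, 1] for two timestamps of the same series:
    # adjacent timestamps receive assignments close to 1.
    gap = abs(t - t_prime)                            # difference of timestamps
    return 2.0 / (1.0 + np.exp(tau_temp * gap))       # 2 * sigmoid(-tau_temp * gap)

def soft_contrastive_loss(similarities, soft_targets):
    # Cross-entropy over softmax-normalized similarities, with hard 0/1
    # contrastive labels replaced by (normalized) soft assignments.
    # This is a toy stand-in for the paper's loss, not its exact form.
    similarities = np.asarray(similarities, dtype=float)
    soft_targets = np.asarray(soft_targets, dtype=float)
    log_prob = similarities - np.log(np.exp(similarities).sum())
    return -(soft_targets / soft_targets.sum() * log_prob).sum()

# Toy usage: two short univariate series and two timestamps.
x_a = np.array([0.1, 0.2, 0.3, 0.4])
x_b = np.array([0.1, 0.25, 0.35, 0.4])
print(instance_soft_assignment(x_a, x_b))   # similar series -> near 1
print(temporal_soft_assignment(3, 10))      # distant timestamps -> near 0
print(soft_contrastive_loss([2.0, 0.5, -1.0], [0.9, 0.4, 0.1]))
```

The point of the sketch is only that the "label" for each pair is a continuous value reflecting how related the pair is, rather than a binary positive/negative flag, which is what makes the contrastive objective "soft".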