Computer science
Multi-label classification
Artificial intelligence
Correlation
Hierarchy
Class
Benchmark (computing)
Dependency
Pattern recognition
Transformer (machine learning)
Machine learning
Task
Natural language processing
Authors
Kefan Ma, Zheng Huang, Xinrui Deng, Jie Guo, Weidong Qiu
Identifier
DOI:10.1109/icassp49357.2023.10096210
Abstract
Multi-label text classification, which aims to predict the relevant labels for each given document, is one of the fundamental tasks of natural language processing. Recent studies have utilized Transformer, which embeds texts and class labels into a joint space to capture the label correlation. However, existing methods tend to take up extra input length and ignore the significance of taxonomic hierarchy. For this reason, we introduce a label correlation enhanced decoder (LED) for multi-label text classification. LED predicts the presence or absence of class labels in parallel with label representation and captures label correlation through multi-task learning. In addition, we propose a hierarchy-aware mask to capture the hierarchical dependency between labels. Comprehensive experiments on four benchmark datasets show that LED outperforms the state-of-the-art baselines. Detailed analysis validates the effectiveness of our proposed method.
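The paper itself does not include code here, but the hierarchy-aware mask it describes can be illustrated in spirit: each label is restricted to attending only to itself and its ancestors in the label taxonomy. The following minimal Python sketch is an assumption-laden illustration (the toy taxonomy, function name, and masking rule are hypothetical, not the authors' implementation):

```python
def build_hierarchy_mask(parent):
    """Build a boolean attention mask from a label taxonomy.

    parent[i] is the parent label index of label i, or None for a root.
    Returns an n x n matrix M where M[i][j] is True when label i is
    allowed to attend to label j (itself or any of its ancestors).
    """
    n = len(parent)
    mask = [[False] * n for _ in range(n)]
    for i in range(n):
        j = i
        # Walk up the taxonomy from label i to the root, unmasking
        # every ancestor along the way.
        while j is not None:
            mask[i][j] = True
            j = parent[j]
    return mask

# Toy taxonomy (hypothetical): 0 is the root; 1 and 2 are its children;
# 3 is a child of 1.
parent = [None, 0, 0, 1]
mask = build_hierarchy_mask(parent)
```

In an actual decoder, a mask like this would typically be converted to large negative values and added to the attention logits before the softmax, so that each label's representation is conditioned only on its ancestors in the hierarchy.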