Computer science
Benchmark (surveying)
Artificial intelligence
Domain (mathematical analysis)
Task (project management)
Class (philosophy)
Relation (database)
Relation extraction
Domain adaptation
Theory (learning stability)
Machine learning
Information extraction
Natural language processing
Data mining
Mathematical analysis
Mathematics
Management
Geodesy
Classifier (UML)
Economics
Geography
Authors
Yijun Liu, Feifei Dai, Xiaoyan Gu, Haihui Fan, Dong Liu, Bo Li, Weiping Wang
Identifier
DOI:10.1007/978-3-031-30678-5_10
Abstract
Relation extraction (RE) is an important task in information extraction that has drawn much attention. Although many RE models have achieved impressive performance, their performance drops dramatically when they are adapted to a new domain under few-shot scenarios. One reason is that the large gap between the semantic spaces of different domains leads the model to suboptimal representations in the new domain. The other is the inability to learn class-sensitive information from only a few samples, which makes instances with confusing factors hard to distinguish. To address these issues, we propose a Contrastive learning-based Fine-Tuning approach with Knowledge Enhancement (CFTKE) for the Domain Adaptation Few-Shot RE task (DAFSRE). Specifically, we fine-tune the model in a contrastive learning manner to refine the semantic space of the new domain, which bridges the gap between domains and yields better representations. To improve the stability and learning ability of contrastive learning-based fine-tuning, we design a data augmentation mechanism and type-aware networks to enrich the instances and highlight class-sensitive features. Extensive experiments on the DAFSRE benchmark dataset demonstrate that our approach significantly outperforms state-of-the-art models (by 2.73% on average).
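The abstract does not spell out the contrastive fine-tuning objective. As a rough, illustrative sketch only (not the authors' CFTKE implementation), a standard supervised contrastive loss over relation-instance embeddings pulls instances of the same relation together and pushes other instances apart; the function name, tensor shapes, and temperature value below are assumptions for the sake of the example.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss over relation-instance embeddings.

    embeddings: (N, d) tensor of instance representations (e.g. encoder outputs).
    labels:     (N,)   tensor of relation class ids; instances sharing a label
                       are treated as positives, all other instances as negatives.
    """
    z = F.normalize(embeddings, dim=1)                   # unit-norm embeddings
    sim = z @ z.t() / temperature                        # pairwise scaled cosine similarities

    # Exclude self-similarity on the diagonal from both numerator and denominator.
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))

    # Log-softmax over each row: log p(j | anchor i).
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Positive pairs: same relation label, anchor itself excluded.
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)        # avoid division by zero

    # Negative mean log-likelihood of the positives for each anchor.
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts
    return loss.mean()
```

In a fine-tuning loop of this kind, each batch of support/query instances from the new domain would be encoded, and this loss (possibly combined with a classification objective) would be minimized so that the refined semantic space separates relation classes; the data augmentation and type-aware components described in the abstract would supply additional positives and class-sensitive features, but are not shown here.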