Automatic summarization
Computer science
Overfitting
Meta-learning (computer science)
Artificial intelligence
Transfer learning
Machine learning
Natural language processing
Artificial neural network
Authors
Taehun Huh, Youngjoong Ko
Identifier
DOI:10.1145/3477495.3531908
Abstract
Recently, supervised abstractive summarization using high-resource datasets, such as CNN/DailyMail and XSum, has achieved significant performance improvements. However, most existing high-resource datasets are biased toward a specific domain such as news, and annotating document-summary pairs for low-resource datasets is too expensive. Furthermore, the need for low-resource abstractive summarization is growing, but existing methods for the task, such as transfer learning, still suffer from domain shift and overfitting. To address these problems, we propose a new framework for low-resource abstractive summarization that uses a meta-learning algorithm to adapt quickly to a new domain with small amounts of data. For adaptive meta-learning, we introduce a lightweight module inserted into the attention mechanism of a pre-trained language model; the module is first meta-learned on high-resource task-related datasets and then fine-tuned on the low-resource target dataset. We evaluate our model on 11 different datasets. Experimental results show that the proposed method achieves state-of-the-art performance on 9 of the datasets in low-resource abstractive summarization.
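The meta-learn-then-fine-tune recipe described in the abstract can be illustrated with a minimal first-order meta-learning (Reptile-style) sketch. This is not the authors' implementation: toy 1-D regression tasks stand in for summarization domains, a single scalar weight stands in for the lightweight adapter module, and all function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(slope):
    """Each 'domain' is a noisy linear task y = slope * x (a toy stand-in
    for one summarization dataset)."""
    x = rng.uniform(-1, 1, size=20)
    y = slope * x + rng.normal(0, 0.01, size=20)
    return x, y

def sgd_steps(w, x, y, lr=0.1, steps=10):
    """Plain gradient descent on squared error for a scalar weight."""
    for _ in range(steps):
        grad = np.mean(2 * (w * x - y) * x)
        w = w - lr * grad
    return w

# Meta-learning phase: adapt to each high-resource source task, then move
# the meta-parameter toward the adapted solution (Reptile update).
w_meta = 0.0
source_slopes = [1.0, 2.0, 3.0]
for epoch in range(100):
    for s in source_slopes:
        x, y = make_task(s)
        w_adapted = sgd_steps(w_meta, x, y)
        w_meta += 0.1 * (w_adapted - w_meta)

# Fine-tuning phase: a few steps on a small low-resource target task.
x_t, y_t = make_task(2.5)
w_final = sgd_steps(w_meta, x_t[:5], y_t[:5], steps=5)
print(round(w_final, 2))
```

The meta-learned initialization sits near the source tasks, so a handful of fine-tuning steps on five target examples suffices; starting from scratch would need far more target data, which mirrors the low-resource motivation in the abstract.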