Authors
Keyu Chen, Di Zhuang, Mingchen Li, J. Morris Chang
Source
Journal: IEEE Transactions on Artificial Intelligence
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Pages: 1-14
Citations: 1
Identifiers
DOI: 10.1109/tai.2024.3396125
Abstract
Neural Machine Translation (NMT) models have achieved results comparable to human translation when large parallel corpora are available. However, their performance remains poor when translating in new domains with limited data. Recent studies demonstrate either a model's robustness to domain shift or its superiority in adapting to new domains with limited data; a solution that addresses both robustness and adaptability remains underexplored. In this paper, we present Epi-Curriculum, a novel approach to low-resource domain adaptation (DA) that combines a new episodic training framework with denoised curriculum learning. The episodic training framework enhances the model's robustness to domain shift by episodically exposing the encoder/decoder to an inexperienced decoder/encoder. The denoised curriculum learning filters out noisy data and further improves the model's adaptability by gradually guiding the learning process from easy to more difficult tasks. Extensive experiments were conducted on English-German (En-De), English-Romanian (En-Ro), and English-French (En-Fr) translation tasks. Our results show that: (i) Epi-Curriculum outperforms the baseline on unseen and seen domains by 2.28 and 3.64 BLEU on the En-De task, and by 3.32 and 2.23 on the En-Ro task; (ii) our episodic training framework outperforms the recent popular meta-learning framework in robustness to domain shift while achieving comparable adaptability to new domains.
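The "denoised curriculum" idea described in the abstract — drop noisy samples, then feed the remainder to the trainer in easy-to-hard order — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `score_fn` stands in for whatever difficulty/quality proxy the method uses (here a toy target-length heuristic), and `noise_threshold` and `num_stages` are hypothetical parameters.

```python
def denoised_curriculum(samples, score_fn, noise_threshold, num_stages=3):
    """Filter noisy samples, then order the rest from easy to hard.

    samples: list of (source, target) sentence pairs.
    score_fn: hypothetical difficulty proxy (higher score = harder sample).
    noise_threshold: samples scoring above this are treated as noise and dropped.
    Returns a list of curriculum stages; training would consume them in order.
    """
    # Denoising step: discard samples whose score marks them as likely noise.
    clean = [s for s in samples if score_fn(s) <= noise_threshold]
    # Curriculum step: sort remaining samples from easy to hard.
    clean.sort(key=score_fn)
    # Split into stages so training progresses to harder data gradually.
    stage_size = max(1, len(clean) // num_stages)
    return [clean[i:i + stage_size] for i in range(0, len(clean), stage_size)]

# Toy usage with target-sentence length as the difficulty proxy.
pairs = [
    ("hi", "hallo"),
    ("a long source sentence", "ein langer Quellsatz"),
    ("noise!!!", "x" * 200),          # implausibly long target -> noise
    ("good morning", "guten Morgen"),
]
stages = denoised_curriculum(pairs, score_fn=lambda p: len(p[1]),
                             noise_threshold=100)
```

In practice a per-sample model loss or a learned quality score would replace the length heuristic, but the control flow — filter, sort, stage — is the same.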