Authors
Timo Schick, Hinrich Schütze
Identifier
DOI: 10.18653/v1/2021.eacl-main.20
Abstract
Some NLP tasks can be solved in a fully unsupervised fashion by providing a pretrained language model with "task descriptions" in natural language (e.g., Radford et al., 2019). While this approach underperforms its supervised counterpart, we show in this work that the two ideas can be combined: We introduce Pattern-Exploiting Training (PET), a semi-supervised training procedure that reformulates input examples as cloze-style phrases to help language models understand a given task. These phrases are then used to assign soft labels to a large set of unlabeled examples. Finally, standard supervised training is performed on the resulting training set. For several tasks and languages, PET outperforms supervised training and strong semi-supervised approaches in low-resource settings by a large margin.
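The procedure the abstract outlines — reformulate an input as a cloze phrase, map labels to tokens via a verbalizer, average the resulting distributions across patterns to soft-label unlabeled data, then train a standard classifier on it — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `pattern`, `VERBALIZER`, and `toy_mask_scores` are hypothetical names, and the toy keyword scorer stands in for a fine-tuned masked language model's `[MASK]` probabilities.

```python
def pattern(review):
    """Reformulate a sentiment example as a cloze-style phrase (a PET 'pattern')."""
    return f"{review} It was [MASK]."

# Verbalizer: maps each task label to a single token the LM can predict.
VERBALIZER = {"positive": "great", "negative": "terrible"}

def toy_mask_scores(cloze):
    """Stand-in for a masked LM: returns a probability for each verbalizer
    token at the [MASK] position. A trivial keyword heuristic replaces real
    model scores here, purely for illustration."""
    text = cloze.lower()
    great = 1.0 if ("love" in text or "good" in text) else 0.1
    terrible = 1.0 if ("awful" in text or "bad" in text) else 0.1
    total = great + terrible
    return {"great": great / total, "terrible": terrible / total}

def soft_label(example, patterns):
    """Average the label distributions across patterns (PET's soft labels)."""
    probs = {label: 0.0 for label in VERBALIZER}
    for p in patterns:
        scores = toy_mask_scores(p(example))
        for label, token in VERBALIZER.items():
            probs[label] += scores[token] / len(patterns)
    return probs

# Soft-label a pool of unlabeled examples; the result then serves as the
# training set for ordinary supervised training of a final classifier.
unlabeled = ["I love this film.", "An awful, bad movie."]
soft_labeled = [(x, soft_label(x, [pattern])) for x in unlabeled]
```

In the actual method, each pattern gets its own fine-tuned language model on the small labeled set, and the averaged (ensembled) predictions distill into the final model; the sketch above only shows the data flow.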