Computer science
Natural language processing
Leverage
Sentence
Artificial intelligence
Entity linking
Token
Generalization
Named entity recognition
Context
Knowledge base
Task
Mathematical analysis
Economics
Biology
Paleontology
Computer security
Mathematics
Management
Authors
Kai He, Rui Mao, Yucheng Huang, Tieliang Gong, Chen Li, Erik Cambria
Identifier
DOI:10.1109/tnnls.2023.3314807
Abstract
Prompt tuning has achieved great success in various sentence-level classification tasks by using elaborated label word mappings and prompt templates. However, for solving token-level classification tasks, e.g., named entity recognition (NER), previous research, which utilizes N-gram traversal for prompting all spans with all possible entity types, is time-consuming. To this end, we propose a novel prompt-based contrastive learning method for few-shot NER without template construction and label word mappings. First, we leverage external knowledge to initialize semantic anchors for each entity type. These anchors are simply appended with input sentence embeddings as template-free prompts (TFPs). Then, the prompts and sentence embeddings are in-context optimized with our proposed semantic-enhanced contrastive loss. Our proposed loss function enables contrastive learning in few-shot scenarios without requiring a significant number of negative samples. Moreover, it effectively addresses the issue of conventional contrastive learning, where negative instances with similar semantics are erroneously pushed apart in natural language processing (NLP)-related tasks. We examine our method in label extension (LE), domain adaptation (DA), and low-resource generalization evaluation tasks with six public datasets and different settings, achieving state-of-the-art (SOTA) results in most cases.
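The core mechanism the abstract describes can be sketched in a few lines: one semantic anchor per entity type is appended to the sentence embeddings as a template-free prompt, and a contrastive objective pulls each token toward its gold-type anchor. This is only a minimal illustrative sketch, not the paper's implementation: the anchors here are random stand-ins (the paper initializes them from external knowledge), and the loss below is a plain anchor-based contrastive loss, not the semantic-enhanced variant the authors propose.

```python
import numpy as np

def init_anchors(entity_types, dim, seed=0):
    # Stand-in for knowledge-derived semantic anchors: one vector per
    # entity type. Random vectors are used purely for illustration; the
    # paper derives these from external knowledge.
    rng = np.random.default_rng(seed)
    return {t: rng.normal(size=dim) for t in entity_types}

def template_free_prompt(sentence_emb, anchors):
    # Append the anchor vectors after the token embeddings, forming a
    # template-free prompt (TFP): no hand-crafted template or label
    # word mapping is needed.
    anchor_mat = np.stack(list(anchors.values()))      # (num_types, dim)
    return np.concatenate([sentence_emb, anchor_mat])  # (seq+types, dim)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def contrastive_loss(token_emb, label, anchors, temp=0.1):
    # Simplified anchor-based contrastive objective: maximize the
    # softmax probability of the gold-type anchor over all anchors,
    # i.e., pull the token toward its own anchor and away from the
    # rest. The paper's semantic-enhanced loss additionally reweights
    # negatives so semantically similar ones are not pushed apart.
    sims = {t: np.exp(cosine(token_emb, a) / temp) for t, a in anchors.items()}
    return float(-np.log(sims[label] / sum(sims.values())))

anchors = init_anchors(["PER", "LOC", "O"], dim=8)
sentence = np.zeros((5, 8))                  # 5 token embeddings (dummy)
prompt = template_free_prompt(sentence, anchors)
print(prompt.shape)                          # (8, 8): 5 tokens + 3 anchors
```

Because every anchor competes in the denominator, the loss is low when a token embedding aligns with its own type's anchor and high when it aligns with a different type's anchor, which is what drives the few-shot separation without large negative-sample batches.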