Computer science
Transfer (computing)
Natural language processing
Artificial intelligence
Authors
Shu Zhao, Shiji Yang, Shicheng Tan, Zhen Yang, Chenxuan Mei, Zhen Duan, Yanping Zhang, Jie Chen
Abstract
Prompt transfer is a transfer learning method built on prompt tuning that improves prompt performance on target tasks by transferring source prompt embeddings. Among existing methods, weighted aggregation is effective, lightweight, and modular. However, such methods may transfer redundant or irrelevant information from the source prompts to the target prompt, degrading performance. To alleviate this problem, we propose Prompt Contrastive Transformation (PCT), which achieves efficient prompt transfer through prompt contrastive transformation and attentional fusion. PCT decomposes each source prompt into a task-agnostic embedding and task-specific embeddings via singular value decomposition and contrastive learning, reducing information redundancy among the source prompts. The attention module in PCT then selects the more effective task-specific embeddings and fuses them with the task-agnostic embedding to form the target prompt. Experimental results show that, despite tuning only 0.035% of task-specific parameters, PCT improves prompt transfer for single-target-task adaptation across various NLP tasks.
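The abstract only sketches the pipeline at a high level, so the following is a minimal illustrative sketch of the decomposition-and-fusion idea, not PCT's actual implementation. All names, shapes, the rank `k`, and the attention scoring (`decompose_prompt`, `fuse_prompts`) are assumptions, and the contrastive objective PCT uses to separate task-agnostic from task-specific information is omitted.

```python
# Hypothetical sketch: SVD-based prompt decomposition plus attention fusion,
# loosely following the abstract's description. Not the authors' code.
import torch
import torch.nn.functional as F

def decompose_prompt(prompt_emb: torch.Tensor, k: int = 2):
    """Split a source prompt (length x dim) into a low-rank part built from
    the top-k singular directions and the residual, standing in for the
    task-agnostic / task-specific split described in the abstract."""
    U, S, Vh = torch.linalg.svd(prompt_emb, full_matrices=False)
    task_agnostic = U[:, :k] @ torch.diag(S[:k]) @ Vh[:k, :]
    task_specific = prompt_emb - task_agnostic
    return task_agnostic, task_specific

def fuse_prompts(target_query, task_specific_list, task_agnostic_mean):
    """Attend over the task-specific embeddings of the source prompts and
    fuse the weighted result with the shared task-agnostic embedding."""
    specifics = torch.stack(task_specific_list)      # (num_sources, len, dim)
    keys = specifics.mean(dim=1)                     # (num_sources, dim)
    scores = F.softmax(keys @ target_query, dim=0)   # (num_sources,)
    selected = (scores[:, None, None] * specifics).sum(dim=0)
    return task_agnostic_mean + selected             # (len, dim) target prompt

# Toy usage: three source prompts of length 8 in a 16-dim embedding space.
sources = [torch.randn(8, 16) for _ in range(3)]
parts = [decompose_prompt(p) for p in sources]
agnostic = torch.stack([a for a, _ in parts]).mean(dim=0)
target = fuse_prompts(torch.randn(16), [s for _, s in parts], agnostic)
print(target.shape)  # torch.Size([8, 16])
```

In this toy version the fused prompt is the only tensor a target task would tune, which is consistent with the abstract's point that only a small fraction of task-specific parameters (0.035%) needs updating; the real method's losses and selection mechanism are defined in the paper.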