
RAPID: Zero-Shot Domain Adaptation for Code Search with Pre-Trained Models

Authors
Guodong Fan,Shizhan Chen,Cuiyun Gao,Jianmao Xiao,Tao Zhang,Zhiyong Feng
Source
Journal: ACM Transactions on Software Engineering and Methodology [Association for Computing Machinery]
Volume/Issue: 33 (5): 1-35
Identifier
DOI:10.1145/3641542
Abstract

Code search, the process of identifying the most relevant code snippets for a given natural language query, plays a crucial role in software maintenance. However, current approaches rely heavily on labeled data for training, which leads to performance degradation in cross-domain scenarios, including domain- or project-specific settings. This decline can be attributed to their limited ability to capture the semantics associated with such scenarios. To tackle this problem, we propose RAPID, a zeRo-shot domAin adaPtation framework with pre-traIned moDels for code search. The framework first generates synthetic data by pseudo labeling and then trains CodeBERT on sampled synthetic data. To avoid the influence of noisy synthetic data and to enhance model performance, we propose a mixture sampling strategy that obtains hard negative samples during training. Specifically, the strategy considers both relevancy and diversity to select data that are hard for the models to distinguish. To validate the effectiveness of our approach in zero-shot settings, we conduct extensive experiments and find that RAPID outperforms the CoCoSoDa and UniXcoder models by an average of 15.7% and 10%, respectively, as measured by MRR. When trained on full data, our approach yields an average improvement of 7.5% in MRR with CodeBERT. We observe that as the model's performance on zero-shot tasks improves, the impact of hard negatives diminishes. Our observations also indicate that fine-tuning CodeT5 for generating pseudo labels can enhance the performance of the code search model, and that using only 100-shot samples can yield results comparable to the supervised baseline. Furthermore, we evaluate the effectiveness of RAPID on real-world code search tasks in three GitHub projects through both human and automated assessments. Our findings reveal that RAPID exhibits superior performance, e.g., an average improvement of 18% in MRR over the top-performing model.
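The abstract describes a mixture sampling strategy that selects hard negatives by balancing relevancy (similarity to the query) against diversity (dissimilarity to negatives already chosen). A minimal greedy sketch of that idea is shown below; this is an illustrative reconstruction, not the authors' implementation, and the function name `mixture_sample` and the weighting parameter `alpha` are assumptions introduced here.

```python
import numpy as np

def mixture_sample(query_vec, cand_vecs, k=4, alpha=0.5):
    """Greedily pick k hard-negative indices from candidate embeddings.

    Illustrative sketch only (not the RAPID authors' code): `alpha`
    trades off relevance to the query against diversity, i.e. a
    penalty for resembling negatives that were already selected.
    """
    # Normalize so dot products become cosine similarities.
    q = query_vec / np.linalg.norm(query_vec)
    c = cand_vecs / np.linalg.norm(cand_vecs, axis=1, keepdims=True)
    relevance = c @ q  # similarity of each candidate to the query

    selected = []
    for _ in range(k):
        if not selected:
            diversity = np.zeros(len(c))
        else:
            # Penalty: max similarity to any already-selected negative.
            diversity = (c @ c[selected].T).max(axis=1)
        score = alpha * relevance - (1 - alpha) * diversity
        score[selected] = -np.inf  # never re-select a candidate
        selected.append(int(np.argmax(score)))
    return selected
```

Under this sketch, the first pick is simply the most query-similar candidate (the "hardest" negative), while later picks are pushed away from each other by the diversity penalty, so the batch covers distinct confusable regions rather than near-duplicates.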
