RAPID: Zero-Shot Domain Adaptation for Code Search with Pre-Trained Models

Authors
Guodong Fan,Shizhan Chen,Cuiyun Gao,Jianmao Xiao,Tao Zhang,Zhiyong Feng
Source
Journal: ACM Transactions on Software Engineering and Methodology [Association for Computing Machinery]
Volume/Issue: 33(5): 1-35
Identifier
DOI:10.1145/3641542
Abstract

Code search, which refers to the process of identifying the most relevant code snippets for a given natural language query, plays a crucial role in software maintenance. However, current approaches rely heavily on labeled data for training, which results in performance drops when confronted with cross-domain scenarios, including domain- or project-specific situations. This decline can be attributed to their limited ability to effectively capture the semantics associated with such scenarios. To tackle this problem, we propose a zeRo-shot domAin adaPtation with pre-traIned moDels framework for code search, named RAPID. The framework first generates synthetic data by pseudo labeling and then trains CodeBERT with sampled synthetic data. To avoid the influence of noisy synthetic data and to enhance model performance, we propose a mixture sampling strategy for obtaining hard negative samples during training. Specifically, the mixture sampling strategy considers both relevancy and diversity to select data that are hard for the model to distinguish. To validate the effectiveness of our approach in zero-shot settings, we conduct extensive experiments and find that RAPID outperforms the CoCoSoDa and UniXcoder models by an average of 15.7% and 10%, respectively, as measured by the MRR metric. When trained on full data, our approach yields an average improvement of 7.5% under the MRR metric using CodeBERT. We observe that as the model's performance on zero-shot tasks improves, the impact of hard negatives diminishes. Our observations also indicate that fine-tuning CodeT5 to generate pseudo labels can enhance the performance of the code search model, and that using only 100-shot samples can yield results comparable to the supervised baseline. Furthermore, we evaluate the effectiveness of RAPID on real-world code search tasks in three GitHub projects through both human and automated assessments. Our findings reveal that RAPID exhibits superior performance, e.g., an average improvement of 18% under the MRR metric over the top-performing model.
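
The abstract mentions two concrete technical pieces: a mixture sampling strategy that selects hard negatives by balancing relevancy and diversity, and evaluation with the MRR metric. As a reading aid, below is a minimal, hedged sketch of what such a sampler and metric could look like; it is not the authors' implementation. It assumes query and code embeddings are already available (for example from CodeBERT), and the greedy scoring scheme, the `alpha` weight, and all function names are illustrative assumptions.

```python
import numpy as np

def cosine_sim(a, b):
    """Pairwise cosine similarity between the rows of two 2-D arrays."""
    a = a / (np.linalg.norm(a, axis=-1, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=-1, keepdims=True) + 1e-8)
    return a @ b.T

def mixture_sample_hard_negatives(query_emb, code_embs, positive_idx, k=4, alpha=0.7):
    """Greedily pick k negatives that are relevant to the query (hard) yet diverse.

    alpha trades off relevancy (similarity to the query) against redundancy with
    already-selected negatives; both the weight and the greedy scheme are
    illustrative assumptions, not the paper's exact procedure.
    """
    sims = cosine_sim(query_emb[None, :], code_embs)[0]  # relevancy to the query
    sims[positive_idx] = -np.inf                         # never sample the positive
    chosen = []
    for _ in range(k):
        if chosen:
            # diversity penalty: similarity to the closest already-chosen negative
            redundancy = cosine_sim(code_embs, code_embs[chosen]).max(axis=1)
        else:
            redundancy = np.zeros_like(sims)
        score = alpha * sims - (1.0 - alpha) * redundancy
        score[chosen] = -np.inf                          # no repeats
        chosen.append(int(np.argmax(score)))
    return chosen

def mean_reciprocal_rank(ranks):
    """MRR over 1-based ranks of the correct code snippet for each query."""
    return float(np.mean([1.0 / r for r in ranks]))

# Toy usage: random vectors stand in for CodeBERT embeddings.
rng = np.random.default_rng(0)
query = rng.normal(size=768)
codes = rng.normal(size=(50, 768))
print(mixture_sample_hard_negatives(query, codes, positive_idx=3))
print(mean_reciprocal_rank([1, 2, 5]))  # (1 + 0.5 + 0.2) / 3 ≈ 0.5667
```

In this toy version, "relevancy" is cosine similarity to the query and "diversity" is enforced by penalizing candidates close to negatives already picked; the actual RAPID strategy may combine these signals differently.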