Using Transfer Learning for Code-Related Tasks

Keywords: Computer Science, Transfer Learning, Artificial Intelligence, Natural Language Processing, Transformers, Machine Translation, Language Models, Deep Learning, Machine Learning, Programming Languages, Code Summarization
Authors
Antonio Mastropaolo, Nathan Cooper, David N. Palacio, Simone Scalabrino, Denys Poshyvanyk, Rocco Oliveto, Gabriele Bavota
Source
Journal: IEEE Transactions on Software Engineering [Institute of Electrical and Electronics Engineers]
Volume/Issue: 49(4), pp. 1580-1598 | Cited by: 10
Identifier
DOI: 10.1109/tse.2022.3183297
Abstract

Deep learning (DL) techniques have been used to support several code-related tasks, such as code summarization and bug fixing. In particular, pre-trained transformer models are on the rise, partly thanks to the excellent results they have achieved in Natural Language Processing (NLP) tasks. The basic idea behind these models is to first pre-train them on a generic dataset using a self-supervised task (e.g., filling in masked words in sentences). Then, these models are fine-tuned to support specific tasks of interest (e.g., language translation). A single model can be fine-tuned to support multiple tasks, possibly exploiting the benefits of transfer learning: knowledge acquired to solve a specific task (e.g., language translation) can boost performance on another task (e.g., sentiment classification). While the benefits of transfer learning have been widely studied in NLP, limited empirical evidence is available for code-related tasks. In this paper, we assess the performance of the Text-To-Text Transfer Transformer (T5) model in supporting four different code-related tasks: (i) automatic bug fixing, (ii) injection of code mutants, (iii) generation of assert statements, and (iv) code summarization. We pay particular attention to the roles played by pre-training and multi-task fine-tuning in the model's performance. We show that (i) T5 achieves better performance than state-of-the-art baselines, and (ii) while pre-training helps the model, not all tasks benefit from multi-task fine-tuning.
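The pre-train/fine-tune paradigm the abstract describes can be made concrete with a short sketch. The snippet below is a minimal illustration, not the authors' actual pipeline: it uses the Hugging Face transformers library with the generic t5-small checkpoint (the paper trains its own T5 on code), and the task prefix and the toy code/summary pair are invented for illustration.

# Minimal sketch of fine-tuning a pre-trained T5 on a code-related task.
# Assumptions: generic "t5-small" checkpoint stands in for the paper's
# code-specific model; the "summarize code:" prefix and the example pair
# are hypothetical.
import torch
from transformers import T5ForConditionalGeneration, T5TokenizerFast

tokenizer = T5TokenizerFast.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")  # already pre-trained

# Upstream pre-training uses a self-supervised span-corruption objective:
#   input : "def <extra_id_0>(a, b): return a <extra_id_1> b"
#   target: "<extra_id_0> add <extra_id_1> + <extra_id_2>"
# Fine-tuning then casts each downstream task as plain text-to-text; a task
# prefix tells the model which task to perform (useful for multi-task setups).
source = "summarize code: def add(a, b): return a + b"
target = "Returns the sum of two numbers."

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
optimizer.zero_grad()
loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=labels).loss  # cross-entropy over the target tokens
loss.backward()
optimizer.step()  # one fine-tuning step; a real run loops over a dataset

# At inference time, generation is again plain text-to-text.
model.eval()
summary_ids = model.generate(inputs.input_ids, max_length=32)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))

In a multi-task fine-tuning setup, examples from several tasks (each with its own prefix, e.g., "fix bug:" or "generate assert:") would simply be mixed into one training stream, which is how a single T5 can serve all four tasks studied in the paper.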