Computer science
Code snippet
Redundant code
Programming language
Abstract syntax tree
System code
Unreachable code
Code generation
Code (set theory)
Tree structure
Automatic summarization
Source code
Dead code
Artificial intelligence
Theoretical computer science
Data structure
Algorithm
Parsing
Code rate
Decoding methods
Computer security
Set (abstract data type)
Key (lock)
Authors
Daya Guo, Shuo Ren, Shuai Lu, Zhangyin Feng, Duyu Tang, Shujie Liu, Long Zhou, Nan Duan, Alexey Svyatkovskiy, Shengyu Fu, Michele Tufano, Shao Kun Deng, Colin B. Clement, Dawn Drain, Neel Sundaresan, Jian Yin, Daxin Jiang, Ming Zhou
Abstract
Pre-trained models for programming languages have achieved dramatic empirical improvements on a variety of code-related tasks such as code search, code completion, and code summarization. However, existing pre-trained models regard a code snippet as a sequence of tokens, ignoring the inherent structure of code, which provides crucial code semantics and would enhance the code understanding process. We present GraphCodeBERT, a pre-trained model for programming language that considers the inherent structure of code. Instead of taking a syntactic-level structure of code such as the abstract syntax tree (AST), we use data flow in the pre-training stage, a semantic-level structure that encodes the where-the-value-comes-from relation between variables. Such a semantic-level structure is compact and avoids the unnecessarily deep hierarchy of the AST, which makes the model more efficient. We develop GraphCodeBERT based on the Transformer. In addition to the masked language modeling task, we introduce two structure-aware pre-training tasks: one predicts code structure edges, and the other aligns representations between source code and code structure. We implement the model efficiently with a graph-guided masked attention function that incorporates the code structure. We evaluate our model on four tasks: code search, clone detection, code translation, and code refinement. Results show that the code structure and the newly introduced pre-training tasks improve GraphCodeBERT, which achieves state-of-the-art performance on the four downstream tasks. We further show that the model prefers structure-level attention over token-level attention in the code search task.
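The graph-guided masked attention mentioned in the abstract can be pictured with a small sketch: the token sequence is concatenated with the data-flow variable nodes, and an additive mask restricts which positions may attend to one another, so structure is injected without changing the Transformer itself. As an example of the where-the-value-comes-from relation, in `x = a + b` the value of `x` comes from `a` and `b`, giving data-flow edges a→x and b→x. Below is a minimal sketch in PyTorch, not the paper's implementation; the inputs `dfg_edges` (edges among variable nodes) and `node_to_token` (which source token each node was identified from) are hypothetical names introduced for illustration.

```python
import torch

def graph_guided_attention_mask(num_tokens: int,
                                num_nodes: int,
                                dfg_edges: list[tuple[int, int]],
                                node_to_token: list[tuple[int, int]]) -> torch.Tensor:
    """Additive attention mask: 0.0 where attention is allowed, -inf where blocked.

    Positions 0..num_tokens-1 are text/code tokens; the following num_nodes
    positions are data-flow variable nodes (a sketch, not the official layout).
    """
    total = num_tokens + num_nodes
    mask = torch.full((total, total), float("-inf"))
    # Text/code tokens attend to one another as in a standard Transformer.
    mask[:num_tokens, :num_tokens] = 0.0
    # Every data-flow node can attend to itself.
    idx = torch.arange(num_tokens, total)
    mask[idx, idx] = 0.0
    # Nodes joined by a data-flow edge (where-the-value-comes-from) may attend.
    for src, dst in dfg_edges:
        mask[num_tokens + src, num_tokens + dst] = 0.0
        mask[num_tokens + dst, num_tokens + src] = 0.0
    # A variable node and the code token it was identified from may attend.
    for node, tok in node_to_token:
        mask[num_tokens + node, tok] = 0.0
        mask[tok, num_tokens + node] = 0.0
    return mask
```

The returned mask would be added to the raw attention scores before the softmax, so blocked pairs receive zero attention weight. The actual GraphCodeBERT masking rules are more detailed (for instance, special tokens such as [CLS] attend to all positions); this sketch only illustrates the mechanism.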