
GraphCode2Vec: Generic Code Embedding via Lexical and Program Dependence Analyses

Authors:
Ma, Wei
Zhao, Mengjie
Soremekun, Ezekiel
Hu, Qiang
Zhang, Jie
Papadakis, Mike
Cordy, Maxime
Xie, Xiaofei
Le Traon, Yves
Publication Year:
2021

Abstract

Code embedding is a keystone in the application of machine learning to several Software Engineering (SE) tasks. To effectively support a plethora of SE tasks, the embedding needs to capture program syntax and semantics in a way that is generic. To this end, we propose the first self-supervised pre-training approach (called GraphCode2Vec) which produces task-agnostic embeddings of lexical and program dependence features. GraphCode2Vec achieves this via a synergistic combination of code analysis and Graph Neural Networks. GraphCode2Vec is generic: it allows pre-training, and it is applicable to several SE downstream tasks. We evaluate the effectiveness of GraphCode2Vec on four tasks (method name prediction, solution classification, mutation testing, and overfitted patch classification), and compare it with four similarly generic code embedding baselines (Code2Seq, Code2Vec, CodeBERT, GraphCodeBERT) and seven task-specific, learning-based methods. In particular, GraphCode2Vec is more effective than both the generic and the task-specific learning-based baselines. It is also complementary and comparable to GraphCodeBERT (a larger and more complex model). We also demonstrate through a probing and ablation study that GraphCode2Vec learns lexical and program dependence features, and that self-supervised pre-training improves effectiveness.
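The core idea the abstract describes (fusing lexical token features with aggregation over a program dependence graph) can be illustrated with a toy sketch. This is not the paper's implementation: the hash-based "lexical embedding", the 4-dimensional vectors, the mean-pooling readout, and the tiny example graph are all illustrative assumptions standing in for learned components.

```python
import hashlib

DIM = 4  # illustrative embedding size, not from the paper

def lexical_embedding(token: str) -> list:
    """Deterministic pseudo-embedding of a token (a stand-in for a
    learned lexical encoder over instruction/subword tokens)."""
    h = hashlib.sha256(token.encode()).digest()
    return [b / 255.0 for b in h[:DIM]]

def message_pass(node_vecs: dict, edges: list) -> dict:
    """One GNN-style aggregation step: each node's new vector is the
    mean of its own vector and its dependence neighbours' vectors."""
    out = {}
    for n, v in node_vecs.items():
        neigh = [node_vecs[m] for (a, m) in edges if a == n]
        stack = [v] + neigh
        out[n] = [sum(col) / len(stack) for col in zip(*stack)]
    return out

def graph_embedding(nodes: dict, edges: list) -> list:
    """Embed a program dependence graph: lexical node features,
    one round of message passing, then mean-pool into one vector."""
    vecs = {n: lexical_embedding(tok) for n, tok in nodes.items()}
    vecs = message_pass(vecs, edges)
    return [sum(col) / len(vecs) for col in zip(*vecs.values())]

# Tiny hypothetical dependence graph for: x = read(); y = x + 1; print(y)
nodes = {0: "read", 1: "add", 2: "print"}
edges = [(1, 0), (2, 1)]  # (dependent, dependency) data-dependence edges
emb = graph_embedding(nodes, edges)
print(len(emb))
```

In the actual system this role is played by a trained Graph Neural Network with self-supervised pre-training objectives; the sketch only shows how lexical and dependence information can be combined into a single program-level vector.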

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2112.01218
Document Type:
Working Paper