
Enhancing Source Code Classification Effectiveness via Prompt Learning Incorporating Knowledge Features

Authors:
Ma, Yong
Luo, Senlin
Shang, Yu-Ming
Zhang, Yifei
Li, Zhengjun
Publication Year: 2024

Abstract

Researchers have investigated the potential of leveraging pre-trained language models, such as CodeBERT, to enhance source code-related tasks. Previous methodologies have relied on CodeBERT's '[CLS]' token as the embedding representation of the input sequence, necessitating additional neural network layers to enhance the feature representation, which in turn increases computational expense. These approaches have also failed to fully exploit the rich knowledge inherent in the source code and its associated text, potentially limiting classification effectiveness. We propose CodeClassPrompt, a text classification technique that harnesses prompt learning to extract the rich knowledge associated with input sequences from pre-trained models, thereby eliminating the need for additional layers and lowering computational cost. By applying an attention mechanism, we synthesize multi-layered knowledge into task-specific features, enhancing classification accuracy. Comprehensive experiments across four distinct source code-related tasks show that CodeClassPrompt achieves competitive performance while significantly reducing computational overhead.

Comment: Accepted by Scientific Reports
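To make the abstract's mechanism concrete, below is a minimal sketch (not the authors' released code) of the two ideas it describes: wrapping the input in a cloze-style prompt so the pre-trained model itself supplies the task signal, and attention-pooling the per-layer hidden states into a single task-specific feature. The prompt template, the `LayerAttentionPooler` class, and the use of the mask-token position are illustrative assumptions; only the `microsoft/codebert-base` checkpoint and the Hugging Face transformers API are standard.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class LayerAttentionPooler(nn.Module):
    """Attention over the per-layer summary vectors of an encoder.

    Hypothetical helper: learns one scalar weight per layer and returns
    the weighted combination, avoiding extra task-specific layers on top.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)  # scores each layer's vector

    def forward(self, layer_vecs: torch.Tensor) -> torch.Tensor:
        # layer_vecs: (batch, num_layers, hidden)
        weights = torch.softmax(self.score(layer_vecs).squeeze(-1), dim=-1)
        return torch.einsum("bl,blh->bh", weights, layer_vecs)  # (batch, hidden)


tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
encoder = AutoModel.from_pretrained("microsoft/codebert-base")

# A cloze-style prompt wraps the input; the <mask> position carries the
# task signal under prompt learning. The template is an assumption.
code = "def add(a, b): return a + b"
prompt = f"{code} The language of this code is {tokenizer.mask_token}."
batch = tokenizer(prompt, return_tensors="pt", truncation=True)

with torch.no_grad():
    out = encoder(**batch, output_hidden_states=True)

# Locate the <mask> token, then collect its representation from every
# layer output (embedding layer + 12 transformer blocks = 13 vectors).
mask_pos = (batch["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
per_layer = torch.stack(
    [h[0, mask_pos] for h in out.hidden_states], dim=0
).unsqueeze(0)  # (1, 13, hidden)

pooler = LayerAttentionPooler(encoder.config.hidden_size)
feature = pooler(per_layer)  # task-specific feature for a classifier/verbalizer
print(feature.shape)  # torch.Size([1, 768])
```

Note the design point the abstract emphasizes: the pooler adds only a single linear scoring layer over hidden states the encoder already computes, which is where the claimed savings over stacking task-specific networks on '[CLS]' would come from.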

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2401.05544
Document Type: Working Paper
Full Text: https://doi.org/10.1038/s41598-024-69402-7