
Can we Soft Prompt LLMs for Graph Learning Tasks?

Authors :
Liu, Zheyuan
He, Xiaoxin
Tian, Yijun
Chawla, Nitesh V.
Publication Year :
2024

Abstract

Graphs play an important role in representing complex relationships in real-world applications such as social networks, biological data, and citation networks. In recent years, Large Language Models (LLMs) have achieved tremendous success across diverse domains, making their application to graphs particularly appealing. However, directly applying LLMs to graphs presents unique challenges due to the mismatch between the graph and text modalities. Hence, to further investigate LLMs' potential for comprehending graph information, we introduce GraphPrompter, a novel framework designed to align graph information with LLMs via soft prompts. Specifically, GraphPrompter consists of two main components: a graph neural network that encodes complex graph information and an LLM that effectively processes textual information. Comprehensive experiments on various benchmark datasets under node classification and link prediction tasks demonstrate the effectiveness of the proposed method. The GraphPrompter framework unveils the substantial capabilities of LLMs as predictors in graph-related tasks, enabling researchers to utilize LLMs across a spectrum of real-world graph scenarios more effectively.

Comment: Accepted by The Web Conference (WWW) 2024 Short Paper Track
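The two-component architecture the abstract describes (a GNN encoder whose output is projected into the LLM's embedding space and prepended as a soft prompt) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's actual implementation: the class name `GraphPrompterSketch`, the single linear layer standing in for a real GNN, and all dimensions are hypothetical.

```python
import torch
import torch.nn as nn


class GraphPrompterSketch(nn.Module):
    """Hedged sketch of a soft-prompt graph adapter: a GNN-style encoder
    produces a node embedding, which is projected into the (frozen) LLM's
    embedding space and prepended to the token embeddings as a soft prompt."""

    def __init__(self, node_dim, hidden_dim, llm_dim, num_prompt_tokens=1):
        super().__init__()
        # A single linear layer stands in for a real message-passing GNN.
        self.gnn = nn.Linear(node_dim, hidden_dim)
        # Project the graph embedding into num_prompt_tokens soft-prompt vectors.
        self.proj = nn.Linear(hidden_dim, llm_dim * num_prompt_tokens)
        self.num_prompt_tokens = num_prompt_tokens
        self.llm_dim = llm_dim

    def forward(self, x, adj, node_idx, text_embeds):
        # Simplified message passing: mean-aggregate neighbour features.
        agg = adj @ x / adj.sum(-1, keepdim=True).clamp(min=1)
        h = torch.relu(self.gnn(agg))
        # Turn the target node's embedding into soft-prompt token vectors.
        prompt = self.proj(h[node_idx]).view(self.num_prompt_tokens, self.llm_dim)
        # Prepend the graph-derived soft prompt to the LLM's input embeddings;
        # the concatenated sequence would then be fed to a frozen LLM.
        return torch.cat([prompt, text_embeds], dim=0)
```

In a full system, `text_embeds` would come from the LLM's own token-embedding table and only the GNN and projection layers would be trained, keeping the LLM frozen.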

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2402.10359
Document Type :
Working Paper
Full Text :
https://doi.org/10.1145/3589335.3651476