Structure Pretraining and Prompt Tuning for Knowledge Graph Transfer
- Source: Proceedings of the ACM Web Conference 2023
- Publication Year: 2023
- Publisher: ACM, 2023
Abstract
- Knowledge graphs (KGs) provide essential background knowledge in many tasks. When designing models for KG-related tasks, one of the key steps is to devise the Knowledge Representation and Fusion (KRF) module that learns representations of KG elements and fuses them with task representations. However, because the KGs and the perspectives to be considered during fusion differ across tasks, KRF modules are repeatedly designed in a duplicate and ad hoc manner. In this paper, we propose a novel knowledge graph pretraining model, KGTransformer, that can serve as a uniform KRF module across diverse KG-related tasks. We pretrain KGTransformer with three self-supervised tasks using sampled sub-graphs as input. For downstream use, we propose a general prompt-tuning mechanism that regards task data as a triple prompt, allowing flexible interactions between task KGs and task data. We evaluate the pretrained KGTransformer on three tasks: triple classification, zero-shot image classification, and question answering. KGTransformer consistently achieves better results than specifically designed task models. Through experiments, we show that the pretrained KGTransformer can be used off the shelf as a general and effective KRF module across KG-related tasks. The code and datasets are available at https://github.com/zjukg/KGTransformer.
- Comment: Work accepted by WWW 2023
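
The abstract names two mechanisms: sampling sub-graphs as pretraining input and building a "triple prompt" so task data can interact with KG triples in one sequence. Below is a minimal, illustrative sketch of those two steps, not the authors' implementation; all function names, special tokens, and the toy KG are hypothetical assumptions.

```python
# Illustrative sketch (hypothetical, not the KGTransformer code):
# (1) sample a sub-graph around seed entities, (2) serialize task data
# as triples and concatenate it with the sub-graph into one prompt sequence.
from collections import defaultdict
import random

def build_adjacency(triples):
    """Index (head, relation, tail) triples by entity for neighbourhood lookup."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((h, r, t))
        adj[t].append((h, r, t))  # treat the graph as undirected for sampling
    return adj

def sample_subgraph(adj, seeds, max_triples=8):
    """Randomly expand outward from the seed entities until max_triples is collected."""
    frontier, seen, sampled = list(seeds), set(), []
    while frontier and len(sampled) < max_triples:
        entity = frontier.pop(random.randrange(len(frontier)))
        for triple in adj.get(entity, []):
            if triple in seen or len(sampled) >= max_triples:
                continue
            seen.add(triple)
            sampled.append(triple)
            h, _, t = triple
            frontier.extend(e for e in (h, t) if e not in seeds)
    return sampled

def triple_prompt(subgraph_triples, task_triples):
    """Concatenate task data (expressed as triples) with KG triples into one token
    sequence, so a Transformer can attend across both parts of the input."""
    tokens = ["[TASK]"]
    for h, r, t in task_triples:
        tokens += [h, r, t]
    tokens.append("[KG]")
    for h, r, t in subgraph_triples:
        tokens += [h, r, t]
    return tokens

if __name__ == "__main__":
    kg = [("zebra", "has_part", "stripes"),
          ("zebra", "is_a", "equine"),
          ("horse", "is_a", "equine")]
    adj = build_adjacency(kg)
    sub = sample_subgraph(adj, seeds={"zebra"}, max_triples=2)
    # e.g. zero-shot image classification: an image acts as a pseudo-entity
    prompt = triple_prompt(sub, task_triples=[("image_001", "depicts", "zebra")])
    print(prompt)
```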
Details
- Database: OpenAIRE
- Journal: Proceedings of the ACM Web Conference 2023
- Accession number: edsair.doi.dedup.....e5a646cc6f698b3c124447a68b9550f6