LiGNN: Graph Neural Networks at LinkedIn

Authors:
Borisyuk, Fedor
He, Shihai
Ouyang, Yunbo
Ramezani, Morteza
Du, Peng
Hou, Xiaochen
Jiang, Chengming
Pasumarthy, Nitin
Bannur, Priya
Tiwana, Birjodh
Liu, Ping
Dangi, Siddharth
Sun, Daqi
Pei, Zhoutao
Shi, Xiao
Zhu, Sirou
Shen, Qianqi
Lee, Kuang-Hsuan
Stein, David
Li, Baolei
Wei, Haichao
Ghoting, Amol
Ghosh, Souvik
Publication Year:
2024

Abstract

In this paper, we present LiGNN, a deployed large-scale Graph Neural Network (GNN) framework. We share our insights on developing and deploying GNNs at large scale at LinkedIn. We present a set of algorithmic improvements to the quality of GNN representation learning, including temporal graph architectures with long-term losses, effective cold-start solutions via graph densification, ID embeddings, and multi-hop neighbor sampling. We explain how we built our large-scale training on LinkedIn graphs and sped it up by 7x with adaptive sampling of neighbors, grouping and slicing of training data batches, a specialized shared-memory queue, and local gradient optimization. We summarize our deployment lessons and learnings gathered from A/B test experiments. The techniques presented in this work have contributed to approximate relative improvements of 1% in job application hearing-back rate, a 2% Ads CTR lift, 0.5% in Feed engaged daily active users, a 0.2% session lift, and a 0.1% weekly active user lift from people recommendation. We believe this work can provide practical solutions and insights for engineers interested in applying graph neural networks at large scale.
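To make the multi-hop neighbor sampling mentioned in the abstract concrete, below is a minimal, generic sketch of fanout-limited multi-hop sampling on an adjacency-list graph. This is not LiGNN's actual implementation; the function name, graph representation, and per-hop fanout parameters are illustrative assumptions.

```python
import random

def sample_multi_hop(adj, seeds, fanouts, rng=random.Random(0)):
    """Generic multi-hop neighbor sampling sketch (not LiGNN's code).

    adj     : dict mapping node -> list of neighbor nodes
    seeds   : starting nodes (hop 0)
    fanouts : max number of neighbors sampled per node at each hop,
              e.g. [10, 5] for a 2-hop neighborhood
    Returns a dict hop -> set of sampled nodes at that hop.
    """
    sampled = {0: set(seeds)}
    frontier = list(seeds)
    for hop, fanout in enumerate(fanouts, start=1):
        next_frontier = []
        for node in frontier:
            neighbors = adj.get(node, [])
            # cap the branching factor to keep the subgraph bounded
            k = min(fanout, len(neighbors))
            next_frontier.extend(rng.sample(neighbors, k))
        sampled[hop] = set(next_frontier)
        frontier = next_frontier
    return sampled

# toy member/job graph (hypothetical node names)
adj = {
    "member_1": ["job_1", "job_2", "job_3"],
    "job_1": ["member_2", "member_3"],
    "job_2": ["member_4"],
}
hops = sample_multi_hop(adj, ["member_1"], fanouts=[2, 1])
```

Capping the fanout at each hop is what keeps the sampled subgraph size roughly linear in the number of seeds, rather than growing with the full degree of hub nodes.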

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2402.11139
Document Type:
Working Paper