
Summary of LiGNN: Graph Neural Networks at LinkedIn, by Fedor Borisyuk et al.


LiGNN: Graph Neural Networks at LinkedIn

by Fedor Borisyuk, Shihai He, Yunbo Ouyang, Morteza Ramezani, Peng Du, Xiaochen Hou, Chengming Jiang, Nitin Pasumarthy, Priya Bannur, Birjodh Tiwana, Ping Liu, Siddharth Dangi, Daqi Sun, Zhoutao Pei, Xiao Shi, Sirou Zhu, Qianqi Shen, Kuang-Hsuan Lee, David Stein, Baolei Li, Haichao Wei, Amol Ghoting, Souvik Ghosh

First submitted to arXiv on: 17 Feb 2024

Categories

  • Main: Machine Learning (cs.LG)
  • Secondary: Artificial Intelligence (cs.AI)



GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary
Written by the paper authors. This version is the paper's original abstract; read it in the paper itself.
Medium Difficulty Summary
Written by GrooveSquid.com (original content).
This paper introduces LiGNN, a deployed, large-scale Graph Neural Network (GNN) framework developed at LinkedIn. The authors share their experience developing and deploying GNNs at scale, highlighting algorithmic improvements that enhance the quality of GNN representation learning: temporal graph architectures with long-term losses, cold-start solutions via graph densification, ID embeddings, and multi-hop neighbor sampling. To speed up large-scale training on LinkedIn graphs, they employ adaptive neighbor sampling, grouping and slicing of training data batches, a specialized shared-memory queue, and local gradient optimization. Deployment lessons learned from A/B test experiments are also presented. These techniques led to significant improvements, including a 1% increase in job application hear-back rates, a 2% lift in Ads CTR, 0.5% growth in Feed engaged daily active users, a 0.2% session lift, and a 0.1% weekly active user lift from people recommendation. This work offers practical solutions and insights for engineers seeking to apply graph neural networks at large scale.
Low Difficulty Summary
Written by GrooveSquid.com (original content).
This paper is about a special kind of computer program called LiGNN that helps analyze relationships between things on the internet. The authors, who work at LinkedIn, share their experience in building and using this program to make it better. They explain how they improved the program’s ability to understand these relationships by adding new features like timing information and ways to handle missing data. To make the program run faster, they used clever techniques like grouping similar data together and optimizing how the computer does calculations. The authors also share what they learned from testing different versions of the program. The improvements they made led to significant benefits, such as more people responding to job applications, more people engaging with ads, and more users interacting with their social media feed.
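To make the multi-hop neighbor sampling mentioned in the summaries more concrete, here is a minimal, hypothetical sketch of GraphSAGE-style sampling with a fixed fan-out per hop. The function name, the adjacency-list representation, and the fan-out values are all illustrative assumptions, not LiGNN's actual API; the paper's adaptive sampling would tune these per-hop budgets rather than fix them.

```python
import random

def sample_neighbors(adj, seeds, fanouts, rng=random.Random(0)):
    """Multi-hop neighbor sampling with per-hop fan-outs (hypothetical sketch).

    adj     : dict mapping node -> list of neighbor nodes
    seeds   : nodes whose representations we want to compute
    fanouts : e.g. [10, 5] -> sample up to 10 neighbors at hop 1,
              up to 5 per node at hop 2
    Returns one list of sampled (neighbor, node) edges per hop.
    """
    frontier = list(seeds)
    hops = []
    for fanout in fanouts:
        edges = []
        next_frontier = []
        for node in frontier:
            nbrs = adj.get(node, [])
            # Keep all neighbors if there are few; otherwise subsample.
            picked = nbrs if len(nbrs) <= fanout else rng.sample(nbrs, fanout)
            for nbr in picked:
                edges.append((nbr, node))  # messages flow neighbor -> node
                next_frontier.append(nbr)
        hops.append(edges)
        frontier = next_frontier
    return hops

# Tiny toy graph: node "a" has three neighbors, so hop 1 subsamples two.
adj = {"a": ["b", "c", "d"], "b": ["a"], "c": ["a"], "d": []}
hops = sample_neighbors(adj, ["a"], [2, 1])
```

Capping the number of sampled neighbors per hop keeps the size of each node's computation subgraph bounded, which is what makes GNN training tractable on graphs of LinkedIn's scale.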

Keywords

  • Artificial intelligence
  • GNN
  • Optimization
  • Representation learning