PyTorch-BigGraph: A Large-scale Graph Embedding System
Proceedings of Machine Learning and Systems 1 (MLSys 2019)
Adam Lerer, Ledell Wu, Jiajun Shen, Timothee Lacroix, Luca Wehrstedt, Abhijit Bose, Alex Peysakhovich
Graph embedding methods produce unsupervised node features from graphs that can then be used for a variety of machine learning tasks. However, modern graph datasets contain billions of nodes and trillions of edges, which exceeds the capability of existing embedding systems. We present PyTorch-BigGraph (PBG), an embedding system that incorporates several modifications to traditional multi-relation embedding systems that allow it to scale to graphs with billions of nodes and trillions of edges. PBG uses graph partitioning to train arbitrarily large embeddings, either on a single machine or in a distributed environment. We demonstrate performance comparable to existing embedding systems on common benchmarks, while allowing for scaling to arbitrarily large graphs and parallelization across multiple machines. We train and evaluate embeddings on several large social network graphs and on the full Freebase dataset, which contains over 100 million nodes and 2 billion edges.
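To make the partitioning idea in the abstract concrete, the sketch below shows one way such a scheme can be organized in PyTorch. This is not PBG's actual API; all names (parts, train_bucket, the partition count P, the margin loss) are hypothetical simplifications. The idea it illustrates is the one stated above: nodes are split into partitions, edges are grouped into buckets by the partitions of their endpoints, and only the two partitions touched by a bucket need to be in memory while that bucket trains.

```python
# Conceptual sketch of partition-bucket embedding training (hypothetical
# names, not PBG's real API): nodes are split into P partitions, edges are
# grouped into buckets (i, j) by source/destination partition, and each
# bucket is trained with only partitions i and j held in memory.
import torch

P, dim, nodes_per_part = 4, 32, 1000

# One embedding table per partition; in a large-scale system these would
# be swapped to/from disk so only two partitions are resident at a time.
parts = [torch.randn(nodes_per_part, dim, requires_grad=True) for _ in range(P)]

def train_bucket(src_part, dst_part, edges, steps=10, lr=0.01):
    """Train on edges whose sources live in src_part and whose
    destinations live in dst_part; `edges` is an (n, 2) tensor of
    local node indices."""
    # Avoid passing the same tensor twice when the bucket is diagonal.
    params = ([parts[src_part]] if src_part == dst_part
              else [parts[src_part], parts[dst_part]])
    opt = torch.optim.SGD(params, lr=lr)
    for _ in range(steps):
        s = parts[src_part][edges[:, 0]]   # source embeddings
        d = parts[dst_part][edges[:, 1]]   # destination embeddings
        # Negative samples drawn from the destination partition.
        neg = parts[dst_part][torch.randint(nodes_per_part, (len(edges),))]
        pos_score = (s * d).sum(dim=1)     # dot-product edge scores
        neg_score = (s * neg).sum(dim=1)
        # Margin ranking loss: positives should outscore negatives.
        loss = torch.relu(1.0 - pos_score + neg_score).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

# Iterate over all (src, dst) partition buckets; in a distributed run,
# buckets with disjoint partitions can be trained on different machines
# in parallel.
for i in range(P):
    for j in range(P):
        fake_edges = torch.randint(nodes_per_part, (256, 2))  # placeholder data
        train_bucket(i, j, fake_edges)
```

Because a bucket (i, j) only reads and writes the embeddings of partitions i and j, buckets that share no partition can be trained concurrently, which is what allows this style of training to parallelize across machines.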