Awesome GNNs on Large-Scale Graphs

Papers on methods and graph neural networks (GNNs) for large-scale graphs. To overcome the memory bottleneck of training GNNs on large-scale graphs, many sampling-based training strategies, such as node-wise, layer-wise, and subgraph sampling, have been widely explored (a minimal sampling sketch follows this introduction). In addition, some works design specific GNN architectures to address this problem.

Pull requests adding more awesome papers are welcome.

  • [Survey] A Survey on Graph Neural Network Acceleration: Algorithms, Systems, and Customized Hardware. [paper]
  • [Tutorial] Large-Scale Graph Neural Networks: The Past and New Frontiers. [homepage]
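As a concrete illustration of node-wise sampling (a sketch, not code from any paper listed here), the snippet below trains a GraphSAGE-style model with PyTorch Geometric's NeighborLoader, which builds mini-batches by sampling a fixed number of neighbors per hop instead of loading the full graph into memory. The dataset (Cora), fan-outs (10 and 5), hidden size, and learning rate are illustrative assumptions.

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import GraphSAGE

dataset = Planetoid(root="data/Cora", name="Cora")
data = dataset[0]

# Node-wise neighbor sampling in the spirit of GraphSAGE (NIPS 2017):
# each mini-batch expands a set of seed nodes by sampling at most
# 10 neighbors at hop 1 and 5 neighbors at hop 2, bounding memory
# independently of the full graph size.
loader = NeighborLoader(
    data,
    num_neighbors=[10, 5],   # fan-out per hop (illustrative values)
    batch_size=128,
    input_nodes=data.train_mask,
)

model = GraphSAGE(
    in_channels=dataset.num_features,
    hidden_channels=64,
    num_layers=2,
    out_channels=dataset.num_classes,
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for batch in loader:
    optimizer.zero_grad()
    out = model(batch.x, batch.edge_index)
    # Only the first `batch_size` nodes are seed nodes; the rest are
    # sampled neighbors that provide context but receive no loss.
    loss = F.cross_entropy(out[:batch.batch_size], batch.y[:batch.batch_size])
    loss.backward()
    optimizer.step()
```

Layer-wise methods (e.g., FastGCN) and subgraph methods (e.g., Cluster-GCN, GraphSAINT) below differ mainly in what they sample per batch: a fixed set of nodes per layer, or an entire subgraph.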

2023


  • [ICML 2023] LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation. [paper]
  • [ICLR 2023] MLPInit: Embarrassingly Simple GNN Training Acceleration with MLP Initialization. [paper]
  • [ICLR 2023] LMC: Fast Training of GNNs via Subgraph Sampling with Provable Convergence. [openreview]

2022


  • [ICML 2022] GraphFM: Improving Large-Scale GNN Training via Feature Momentum. [paper]
  • [ICML 2022] Generalization Guarantee of Training Graph Convolutional Networks with Graph Topology Sampling. [paper]
  • [ICLR 2022] PipeGCN: Efficient Full-Graph Training of Graph Convolutional Networks with Pipelined Feature Communication. [paper] [code]
  • [ICLR 2022] EXACT: Scalable Graph Neural Networks Training via Extreme Activation Compression. [paper] [code]
  • [VLDB 2022] SANCUS: Staleness-Aware Communication-Avoiding Full-Graph Decentralized Training in Large-Scale Graph Neural Networks. [paper] [code]
  • [NeurIPS 2022 Datasets and Benchmarks Track] A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking. [paper] [code]

2021


  • [NeurIPS 2021] Decoupling the Depth and Scope of Graph Neural Networks. [paper] [code]
  • [NeurIPS 2021] VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization. [paper] [code]
  • [ICLR 2021] Combining Label Propagation and Simple Models Out-performs Graph Neural Networks. [paper] [code]
  • [KDD 2021] Scaling Up Graph Neural Networks Via Graph Coarsening. [paper] [code]
  • [ICML 2021] GNNAutoScale: Scalable and Expressive Graph Neural Networks via Historical Embeddings. [paper] [code]

2020


  • [ICLR 2020] GraphSAINT: Graph Sampling Based Inductive Learning Method. [paper] [code]
  • [KDD 2020] Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks. [paper] [code]
  • [KDD 2020] Scaling Graph Neural Networks with Approximate PageRank. [paper] [TensorFlow] [PyTorch] [web]
  • [ICML Workshop 2020] SIGN: Scalable Inception Graph Networks. [paper] [code]
  • [ICML 2020] Simple and Deep Graph Convolutional Networks. [paper] [code]
  • [NeurIPS 2020] Scalable Graph Neural Networks via Bidirectional Propagation. [paper] [code]

2019


  • [ICLR 2019] Predict then Propagate: Graph Neural Networks meet Personalized PageRank. [paper] [code]
  • [KDD 2019] Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks. [paper] [TensorFlow] [PyTorch]
  • [ICML 2019] Simplifying Graph Convolutional Networks. [paper] [code]
  • [NeurIPS 2019] Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks. [paper] [code]

2018


  • [ICLR 2018] FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling. [paper] [code]
  • [KDD 2018] Large-Scale Learnable Graph Convolutional Networks. [paper] [code]
  • [ICML 2018] Stochastic Training of Graph Convolutional Networks with Variance Reduction. [paper] [code]
  • [NeurIPS 2018] Adaptive Sampling Towards Fast Graph Representation Learning. [paper] [code]

2017


  • [NIPS 2017] Inductive Representation Learning on Large Graphs. [paper] [code]
