BitGNN: Unlocking the Performance Potential of Binary Graph Neural Networks on GPUs
- North Carolina State University
- Battelle (Pacific NW Lab)
Graph Neural Networks (GNNs) have shown compelling results in many graph-based learning tasks. They are, however, time-consuming. Recent work has shown a promising direction for improving GNN speed and shrinking model size: network binarization, which binarizes network values and operations. Prior work, however, mainly focused on algorithm design, leaving open how to fully realize the performance potential. This work fills the gap by proposing techniques to best map binary GNNs and their computations to the nature of bit manipulations, optimizations and algorithms to maximize BSpMM kernel efficiency, and solutions to other factors influencing end-to-end time on GPUs. Results on real-world graphs show that the proposed techniques outperform state-of-the-art binary GNN implementations by 21-67× with little accuracy loss.
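The core arithmetic trick behind binary networks of this kind (the paper's kernels are GPU/CUDA; this is not the authors' implementation) is that a dot product of two ±1 vectors packed into machine words reduces to XNOR plus popcount. A minimal CPU sketch, with hypothetical helper names `pack_bits` and `binary_dot`:

```python
def pack_bits(vec):
    # Pack a +/-1 vector into an integer bitmask (+1 -> bit 1, -1 -> bit 0).
    word = 0
    for i, v in enumerate(vec):
        if v == 1:
            word |= 1 << i
    return word

def binary_dot(a_bits, b_bits, n):
    # Dot product of two packed +/-1 vectors of length n:
    # matching bits contribute +1, differing bits contribute -1,
    # so dot = n - 2 * popcount(a XOR b).
    mask = (1 << n) - 1
    differing = bin((a_bits ^ b_bits) & mask).count("1")
    return n - 2 * differing

a = [1, -1, 1, 1]
b = [1, 1, -1, 1]
assert binary_dot(pack_bits(a), pack_bits(b), 4) == sum(x * y for x, y in zip(a, b))
```

A bit-serial sparse kernel (as in BSpMM) applies this word-wise reduction over the nonzero blocks of a sparse matrix, replacing floating-point multiply-adds with bitwise instructions.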
- Research Organization: Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
- Sponsoring Organization: USDOE
- DOE Contract Number: AC05-76RL01830
- OSTI ID: 2203530
- Report Number(s): PNNL-SA-171811
- Country of Publication: United States
- Language: English
Similar Records
- BitGNN: Unleashing the Performance Potential of Binary Graph Neural Networks on GPUs. Conference, June 21, 2023, Proceedings of the 37th International Conference on Supercomputing. OSTI ID: 2524557
- GSplit: Scaling Graph Neural Network Training on Large Graphs via Split-Parallelism. Conference, May 1, 2025. OSTI ID: 3002431
- Reducing Communication in Graph Neural Network Training. Conference, November 1, 2020, SC '20: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis. OSTI ID: 1647608