Graph Structure of Neural Networks

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020) pre-proceedings



Authors

Jiaxuan You, Jure Leskovec, Kaiming He, Saining Xie

Abstract

Neural networks are often represented as graphs of connections between neurons. However, despite their wide use, there is currently no clear understanding of the relationship between the graph structure of a neural network and its predictive performance. Here we systematically investigate this relationship by developing a novel graph-based representation of neural networks called the relational graph, in which computation is specified by rounds of message exchange along the graph structure. Using this framework, we show that (1) there is a “sweet spot”: relational graphs within a certain range of average path length and clustering coefficient lead to neural networks with significantly improved predictive performance; (2) perhaps more surprisingly, these sweet spots are highly consistent across different architectures and datasets; and (3) top-performing relational graphs can be identified with only a few epochs of training. Overall, our results suggest promising avenues for designing and understanding neural networks with graph representations.
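To make the abstract's terms concrete, the following is a minimal, hypothetical Python sketch (not the authors' implementation) of a relational graph: node features are updated by rounds of message exchange with neighbors, and the graph statistics the paper correlates with performance, average path length and clustering coefficient, are computed with networkx. The choice of a Watts-Strogatz generator, mean aggregation, and a shared tanh transformation are illustrative assumptions, not details taken from the paper.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical relational graph: a small Watts-Strogatz graph, used here only
# because it makes clustering and path length easy to vary; the paper's exact
# graph generators may differ.
G = nx.connected_watts_strogatz_graph(n=16, k=4, p=0.3, seed=0)

dim = 8                                              # per-node feature width (assumed)
x = {v: rng.standard_normal(dim) for v in G.nodes}   # node feature vectors
W = rng.standard_normal((dim, dim)) / np.sqrt(dim)   # shared weight matrix (assumed)

def message_round(G, x, W):
    """One round of message exchange: aggregate neighbor features, then transform."""
    new_x = {}
    for v in G.nodes:
        msgs = [x[u] for u in G.neighbors(v)]
        agg = np.mean(msgs, axis=0)        # mean aggregation (illustrative choice)
        new_x[v] = np.tanh(W @ agg)        # shared nonlinearity (illustrative choice)
    return new_x

# A few rounds of message exchange along the graph structure.
for _ in range(3):
    x = message_round(G, x, W)

# Graph measures that the paper relates to predictive performance.
print("average path length:     ", nx.average_shortest_path_length(G))
print("clustering coefficient:  ", nx.average_clustering(G))
```

Sweeping the rewiring probability `p` in the generator above would trace out graphs with different average path lengths and clustering coefficients, which is the kind of axis along which the paper's "sweet spot" is described.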