NeurIPS 2020

Convergence and Stability of Graph Convolutional Networks on Large Random Graphs


Meta Review

This paper considers a continuous version of graph convolutional neural networks and analyzes the usual discrete GCN as a discrete approximation of the continuous one. Under some random graph generative models, the convergence rate of the discrete GCN to the continuous one is derived. Moreover, stability results are given showing that the induced GCN is stable against perturbations of the underlying generative model. The analysis is interesting and the exposition is well written. This kind of continuous-to-discrete analysis should facilitate further theoretical work toward understanding GCNs in general. Therefore, this paper is worth publishing at NeurIPS. I encourage the authors to include the numerical experiments given in the author feedback, at least in the supplementary material.
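To illustrate the continuous-to-discrete idea in the simplest setting, the sketch below (my own toy construction, not the paper's exact model: the graphon W, the signal f, and the single degree-free propagation step are all illustrative assumptions) checks numerically that one graph-convolution step on n nodes sampled from a latent-position model approaches the continuous integral operator (T f)(x) = ∫₀¹ W(x, y) f(y) dy as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)

W = lambda x, y: np.exp(-np.abs(x - y))  # smooth graphon (illustrative choice)
f = lambda x: x                          # node signal as a function of latent position

def discrete_conv(n):
    """Estimate (T f)(0.5) from n sampled latent node positions."""
    xs = rng.uniform(0.0, 1.0, size=n)   # latent positions of the n nodes
    return W(0.5, xs) @ f(xs) / n        # one normalized propagation step

# Reference value of (T f)(0.5) via a fine Riemann sum.
grid = np.linspace(0.0, 1.0, 200001)
ref = np.mean(W(0.5, grid) * f(grid))

def mean_err(n, trials=50):
    """Average absolute deviation of the discrete step from the integral."""
    return np.mean([abs(discrete_conv(n) - ref) for _ in range(trials)])

err_small, err_large = mean_err(100), mean_err(10000)
# err_large should be much smaller than err_small, consistent with an
# O(1/sqrt(n))-type convergence rate of the discrete operator.
```

Here growing n shrinks the gap between the sampled convolution and the integral operator, which is the qualitative content of the paper's convergence result in this simplified, noise-free setting.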