NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 850
Title: HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs


		
This submission provides a technique that more faithfully generalizes graph convolutional networks to hypergraph data. The paper is original, makes nice connections to spectral hypergraph theory, and demonstrates performance improvements over graph-based hypergraph modeling techniques such as clique expansion. For these reasons, I recommend accepting this paper.

There is one specific issue that should be addressed in the camera-ready version. Please add more discussion of why FastHyperGCN can outperform HyperGCN in practice (Table 5). The fast variant omits information that would presumably be useful, so the presentation here should be clarified: is the difference simply due to variance across experimental runs? Additional clarification of the performance of 1-HyperGCN relative to FastHyperGCN would also be useful, as the reviewers still did not completely understand the author feedback on that point.