NeurIPS 2020

Complex Dynamics in Simple Neural Networks: Understanding Gradient Flow in Phase Retrieval


Meta Review

In this work, the authors study the behavior of gradient flow in the so-called phase retrieval problem and postulate a phase transition in the Hessian as a function of the number of samples. The paper makes interesting contributions towards understanding non-convex optimization by studying a problem that is simple enough to allow for analytical calculations. Overall, there is good, well-supported agreement between theory and experiment, in particular between the leading moments of the distribution of the threshold states as evaluated empirically and their computed counterparts. This paper is a valuable contribution to NeurIPS and should be accepted.

Nevertheless, we recommend several directions along which the paper could be further improved to reach a wider audience, and we ask the authors to revisit the reviews and the author feedback before submitting the final version. First, the presentation is difficult to follow for a machine learning audience and could be improved by providing more background on the known results used in the paper (e.g., the BBP transition or replica theory), if necessary in the appendix. Moving some mathematical results to the appendix, to make room for a non-technical discussion of the relevant background at the beginning of Section 2, would also make the paper more readable, as would describing the implications and meaning of the theoretical results that are used (e.g., in line 113). Second, the authors should, if possible, discuss the likely origins of the quantitative disagreement between the theoretical prediction of the threshold state and the experiments (i.e., the validity of the assumptions made).