NeurIPS 2020
### Analytic Characterization of the Hessian in Shallow ReLU Models: A Tale of Symmetry

### Meta Review

The paper considers the squared loss of a two-layer ReLU network. By exploiting certain invariance properties of the loss, the authors derive an analytical expression for the Hessian and its spectrum. The analysis is novel in the context of this specific problem, and although the results apply only to a small shallow network, they have the potential to open new research directions.
Although I recommend acceptance, the reviewers found the paper quite technical, and I think it would benefit considerably from a revision that adds more intuition (perhaps a proof sketch) to the main paper.