NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID: 5641
Title: Tight Sample Complexity of Learning One-hidden-layer Convolutional Neural Networks

The paper studies the parameter recovery problem in the teacher-student setting for a network with a single hidden layer, under the assumptions of Gaussian inputs and non-overlapping filters. It proposes a modified training algorithm, shows convergence, and derives sample complexity bounds. All the reviewers appreciated that the contribution is important and the paper is well written. The main concern raised by two reviewers is that the assumptions of Gaussian inputs and non-overlapping filters are very strong, especially considering that the prior work of Goel et al. has established recovery guarantees for symmetric distributions with a slightly worse sample complexity. Another concern raised by the reviewers during the discussion is that the comparisons with previous work in Table 1 need clarification, as the prior work of Goel et al. does provide linear convergence in the realizable setting (see Corollary 1 of that work). Even considering these concerns, which the authors should address in the final version, the reviewers still feel the paper is above the bar for acceptance.