NeurIPS 2020

On Convergence and Generalization of Dropout Training


Meta Review

This work presents a non-asymptotic convergence rate for the test error of dropout training of two-layer ReLU networks in the NTK regime. The reviewers appreciate the results, the presentation, and the coverage of related work. Reviewers felt that additional discussion of the assumptions, e.g., the margin assumption, training only the hidden layer, etc., would enrich the paper.