NeurIPS 2020

Improving Neural Network Training in Low Dimensional Random Bases


Meta Review

The reviewers were consistent in their appreciation of the paper: it demonstrates clear improvements over the ICLR work [23], and the inconsistencies that initially worried some reviewers were resolved by the rebuttal. Drawing a new random subspace every iteration appears to be novel for the neural-network application, though the authors point out connections with ES (and the AC notes connections to the DFO community literature as well, e.g., https://arxiv.org/abs/2003.02684 and https://arxiv.org/abs/1905.01332). The reviewers also liked the compartmentalization idea. To summarize, although the initial reviewers' responses were only mildly positive, after the rebuttal and our discussions the reviewers agree that this paper empirically shows a significant improvement over prior work.
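
For context on the idea the meta-review highlights as novel, below is a minimal sketch (the AC's illustration, not the authors' code) of descending in a low-dimensional random basis that is re-drawn every iteration. It uses a toy quadratic objective; `loss_grad`, `subspace_dim`, and the Gaussian basis scaling are assumptions for this example only.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(theta):
    """Toy stand-in for a network's loss gradient: grad of 0.5 * ||theta||^2."""
    return theta

dim, subspace_dim, lr, steps = 1000, 10, 0.1, 500
theta = rng.standard_normal(dim)

for _ in range(steps):
    # Draw a fresh random basis each iteration (re-drawing, rather than
    # fixing one subspace for all of training, is the highlighted novelty).
    P = rng.standard_normal((dim, subspace_dim)) / np.sqrt(subspace_dim)
    g = loss_grad(theta)
    # Step only along the current subspace: theta <- theta - lr * P P^T g
    theta -= lr * P @ (P.T @ g)

print(0.5 * theta @ theta)  # loss is far below its initial value
```

In an ES-style variant, `P.T @ g` would instead be estimated from finite differences of the loss along the columns of `P`, which is the connection to derivative-free optimization noted above.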