NeurIPS 2020

Generalized Boosting


Meta Review

This paper presents a boosting variant that outperforms layer-wise training of a neural network, while being conceptually quite similar to it. The paper is a solid mix of theory and empirical evaluation, showing strong results compared to both traditional boosting and layer-wise training. It does not match end-to-end training, but requires far less memory, since backpropagation through the full network is not needed. This makes it a valuable contribution in my opinion.