NeurIPS 2020

Gradient Boosted Normalizing Flows


Meta Review

The paper describes a way to create mixtures of normalizing-flow models using gradient boosting. Combining several simple flow models is an alternative to increasing the capacity of a single model, and is worth exploring. One of the main concerns the reviewers expressed is limited novelty, in that the proposed method is largely an application and continuation of existing techniques. However, the reviewers agree that the paper is well written and well executed, that although the idea is incremental there are still things to be said about applying gradient boosting to flows, and that the experiments are well done. For these reasons, I'm happy to recommend acceptance of the paper. I would strongly encourage the authors to take the reviewers' feedback to heart when preparing the camera-ready version, and to improve the discussion of related work as outlined in the rebuttal.