NeurIPS 2020

Training Generative Adversarial Networks by Solving Ordinary Differential Equations

Meta Review

The paper introduces a new perspective on explaining instability in GAN training by analyzing the continuous dynamics of the training algorithm. The authors first show that these dynamics converge in the vicinity of a Nash equilibrium, and then hypothesize that instability is due to the discretization of these dynamics. They show that using higher-order ODE integrators to solve the dynamics helps stabilize training. The paper is clear, and the reviewers agree that it brings a new perspective on analyzing and training GANs and is a significant contribution to this topic. The theoretical findings are backed up by a solid empirical evaluation and analysis. Recommendation: accept.
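The effect the review describes can be illustrated on a standard toy problem (not taken from the paper itself): a bilinear min-max game whose continuous gradient-flow dynamics rotate around the Nash equilibrium. A first-order (Euler) discretization of these dynamics spirals outward, while a higher-order integrator such as classical RK4 tracks the continuous trajectory and stays bounded. This is a minimal sketch under those assumptions, not the paper's actual method or code.

```python
import numpy as np

# Toy bilinear "GAN": parameter x minimizes f(x, y) = x * y while
# parameter y maximizes it. The continuous training dynamics are
# dx/dt = -y, dy/dt = x, a pure rotation around the Nash equilibrium
# at the origin. All names here are illustrative.

def v(z):
    """Vector field of the simultaneous gradient flow."""
    x, y = z
    return np.array([-y, x])

def euler_step(z, h):
    # First-order discretization: plain simultaneous gradient descent.
    return z + h * v(z)

def rk4_step(z, h):
    # Classical fourth-order Runge-Kutta step on the same dynamics.
    k1 = v(z)
    k2 = v(z + 0.5 * h * k1)
    k3 = v(z + 0.5 * h * k2)
    k4 = v(z + h * k3)
    return z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

h, steps = 0.1, 500
z_euler = np.array([1.0, 0.0])
z_rk4 = np.array([1.0, 0.0])
for _ in range(steps):
    z_euler = euler_step(z_euler, h)
    z_rk4 = rk4_step(z_rk4, h)

# Euler spirals away from the equilibrium; RK4 stays on the orbit.
print(np.linalg.norm(z_euler))  # grows roughly as (1 + h^2)^(steps/2)
print(np.linalg.norm(z_rk4))    # remains close to the initial norm of 1
```

The divergence of the Euler iterates here is the discretization-induced instability the review refers to; the higher-order integrator removes it without changing the underlying dynamics.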