NeurIPS 2020

Optimizing Neural Networks via Koopman Operator Theory


Meta Review

This paper provides a new perspective on neural network training based on Koopman operator theory (KOT). The paper received mixed reviews (top 50% -> marginally above, reject, marginally above -> accept, marginally below). On the positive side, and despite KOT being a very old tool, the new perspective has a lot of potential: since the Koopman operator is linear, if one can find (or approximate) its eigenfunctions, one could compute and analyze training dynamics more easily and make optimization more efficient. On the negative side, the paper is a first step and needs further development and experimental evaluation to demonstrate its value. Some reviewers also noted that the paper lacks clarity. The rebuttal agreed that clarity could be improved and provided a clearer description of the algorithm, which should be incorporated. IMHO, the potential impact of applying old but powerful tools from dynamical systems to the analysis of the learning dynamics of deep networks outweighs the weaknesses.
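To make the "linearity" point concrete: the following is a minimal sketch (not the paper's algorithm) of the generic Koopman/dynamic mode decomposition recipe the review alludes to. It treats successive parameter vectors of a gradient-descent run as observables, fits a linear operator K by least squares, and uses K's eigendecomposition to jump many training steps ahead in one shot. The toy objective, function names, and step counts are illustrative choices, not from the paper.

```python
import numpy as np

def fit_koopman(snapshots):
    """Least-squares linear map K with snapshots[:, t+1] ~= K @ snapshots[:, t].

    snapshots: (d, T) array whose columns are parameter vectors over training.
    """
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    return Y @ np.linalg.pinv(X)

def extrapolate(K, w0, steps):
    """Predict the state `steps` iterations ahead via K's eigendecomposition."""
    eigvals, eigvecs = np.linalg.eig(K)
    b = np.linalg.lstsq(eigvecs, w0, rcond=None)[0]  # modal coordinates of w0
    return (eigvecs @ (eigvals ** steps * b)).real

# Toy dynamics: gradient descent on f(w) = 0.5 * w^T A w is itself linear
# (w_{t+1} = (I - lr*A) w_t), so the fitted K recovers the update exactly.
rng = np.random.default_rng(0)
A = np.diag([1.0, 0.5, 0.1])
lr = 0.1
w = rng.standard_normal(3)
traj = [w]
for _ in range(20):
    w = w - lr * (A @ w)
    traj.append(w)
snapshots = np.stack(traj, axis=1)

K = fit_koopman(snapshots)
# Extrapolating 10 steps at once via the spectrum of K, instead of
# running 10 more gradient-descent iterations:
pred = extrapolate(K, snapshots[:, -1], 10)
```

For a real network the dynamics are nonlinear, so the interesting step (and the subject of the paper) is choosing observables in which they become approximately linear; the sketch above only shows the linear-regression-plus-spectrum mechanics.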