NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 1776
Title: Backpropagation-Friendly Eigendecomposition

The reviews offer mixed feedback on the significance of the paper. In my view, the ability to include modules that perform eigendecomposition in a stable and differentiable (via backprop) manner is a fundamental building block for deep learning. While the idea of the paper is simple, i.e., combining the advantages of SVD and power iteration by using the former in the forward pass and the latter in the backward pass, there is real insight (see the stability bounds) and solid empirical evidence (100% stability even in relatively high dimensions). I think this clarity - a simple but important idea, well analyzed - is a feature and makes this paper a valuable contribution.
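For concreteness, here is a minimal sketch of the forward/backward split the paper proposes, written in PyTorch-style Python under my own assumptions: the function name, the fixed iteration count, and the restriction to the leading eigenpair of a symmetric PSD matrix are illustrative choices, not the authors' implementation.

import torch

def psd_leading_eig(A, num_iters=20):
    # Leading eigenpair of a symmetric PSD matrix A.
    # Forward: accurate eigenvector from a non-differentiable eigendecomposition.
    # Backward: gradients flow through a few power-iteration steps,
    # warm-started at that solution, which sidesteps the ill-conditioned
    # analytical gradient of SVD/eigendecomposition.
    with torch.no_grad():
        _, vecs = torch.linalg.eigh(A)   # detached, accurate solution
        v = vecs[:, -1]                  # eigenvector of the largest eigenvalue
    # Differentiable refinement: at convergence these iterations leave v
    # numerically unchanged, but they give autograd a stable path for
    # the gradient with respect to A.
    for _ in range(num_iters):
        v = A @ v
        v = v / v.norm()
    lam = v @ (A @ v)                    # Rayleigh quotient = eigenvalue
    return lam, v

A = torch.randn(8, 8)
A = A @ A.T                              # symmetric PSD test matrix
A.requires_grad_(True)
lam, v = psd_leading_eig(A)
lam.backward()                           # stable gradient, no SVD backward pass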