NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 7549
Title: Non-normal Recurrent Neural Network (nnRNN): learning long time dependencies while improving expressivity with transient dynamics
The paper makes a significant contribution by showing that RNNs capable of handling vanishing/exploding gradients can be obtained not only by enforcing recurrent weight matrices to be normal or orthogonal, but also by exploiting the larger set of matrices reachable through the Schur decomposition. Although the proposed approach does not readily extend to gated units and the Schur decomposition increases the computational cost, the paper provides new insights that can serve as the basis for improved algorithms. The rebuttal resolved most of the issues raised by the reviewers.
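For concreteness, the following is a minimal PyTorch sketch of the kind of Schur-based parameterization the paper advocates. It is illustrative only, not the authors' implementation: the class and parameter names are invented here, the orthogonal factor is built from the matrix exponential of a skew-symmetric matrix, and the triangular factor is restricted to real eigenvalues (the full real Schur form also admits 2x2 blocks for complex-conjugate eigenvalue pairs).

```python
import torch
import torch.nn as nn

class SchurRecurrentCell(nn.Module):
    """Illustrative non-normal recurrent cell (not the authors' code).

    The recurrent matrix is parameterized as W = P T P^T, where P is
    orthogonal and T is upper triangular. The diagonal of T carries the
    eigenvalues, which can be kept near the unit circle for gradient
    stability, while the strictly upper-triangular part of T adds the
    non-normal transient dynamics that normal/orthogonal matrices lack.
    """

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Hypothetical parameter names, chosen for readability.
        self.skew_params = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
        self.tri_params = nn.Parameter(0.01 * torch.randn(hidden_size, hidden_size))
        self.diag_params = nn.Parameter(torch.zeros(hidden_size))  # eigenvalue logits
        self.input_proj = nn.Linear(input_size, hidden_size)

    def recurrent_matrix(self):
        # Orthogonal factor: P = exp(A) with A skew-symmetric, so P^T P = I.
        skew = self.skew_params - self.skew_params.t()
        P = torch.matrix_exp(skew)
        # Triangular factor: eigenvalues squashed into (-1, 1) on the
        # diagonal; strictly upper-triangular entries left unconstrained.
        T = torch.triu(self.tri_params, diagonal=1) + torch.diag(torch.tanh(self.diag_params))
        return P @ T @ P.t()

    def forward(self, x, h):
        # One recurrent step: h_{t+1} = tanh(W h_t + V x_t).
        W = self.recurrent_matrix()
        return torch.tanh(h @ W.t() + self.input_proj(x))
```

Keeping the diagonal of T inside the unit interval bounds the spectral radius, which is what controls vanishing/exploding gradients, while the free strictly upper-triangular entries permit the transient amplification that motivates the paper's expressivity claims.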