Part of Advances in Neural Information Processing Systems 3 (NIPS 1990)
Yann LeCun, Ido Kanter, Sara Solla
The learning time of a simple neural network model is obtained through an analytic computation of the eigenvalue spectrum of the Hessian matrix, which describes the second-order properties of the cost function in the space of coupling coefficients. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process and provides a theoretical justification for the choice of centered versus biased state variables.
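The following is a minimal sketch, not taken from the paper, of the kind of effect the abstract describes: for a single linear unit trained with a quadratic cost, the Hessian reduces to the input correlation matrix, so its eigenvalue spread (and hence the gradient-descent learning time) depends on whether the state variables are biased ({0, 1}) or centered ({-1, +1}). All names, sizes, and distributions below are illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the paper's experiment): compare the
# Hessian eigenvalue spread for biased {0,1} versus centered {-1,+1} inputs.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_inputs = 10_000, 20

# Biased state variables: components drawn from {0, 1}
x_biased = rng.integers(0, 2, size=(n_samples, n_inputs)).astype(float)
# Centered state variables: the same patterns mapped to {-1, +1}
x_centered = 2.0 * x_biased - 1.0

def hessian_eigenvalues(x):
    """For a single linear unit with quadratic cost
    E(w) = (1/2P) * sum_p (w . x_p - y_p)^2,
    the Hessian w.r.t. w is the input correlation matrix H = (1/P) X^T X."""
    hessian = x.T @ x / x.shape[0]
    return np.linalg.eigvalsh(hessian)  # eigenvalues in ascending order

for name, x in [("biased {0,1}", x_biased), ("centered {-1,+1}", x_centered)]:
    eig = hessian_eigenvalues(x)
    print(f"{name:18s} lambda_min={eig[0]:.3f} lambda_max={eig[-1]:.3f} "
          f"condition number={eig[-1] / eig[0]:.1f}")
```

In this toy setting the biased inputs produce one large eigenvalue tied to the nonzero input mean and a much larger condition number, while centering the inputs collapses the spectrum toward unity, which is the intuition behind the abstract's recommendation of centered state variables.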