Adjoint Operator Algorithms for Faster Learning in Dynamical Neural Networks

Part of Advances in Neural Information Processing Systems 2 (NIPS 1989)


Authors

Jacob Barhen, Nikzad Toomarian, Sandeep Gulati

Abstract

A methodology for faster supervised learning in dynamical nonlinear neural networks is presented. It exploits the concept of adjoint operators to enable computation of changes in the network's response due to perturbations in all system parameters, using the solution of a single set of appropriately constructed linear equations. The lower bound on speedup per learning iteration over conventional methods for calculating the neuromorphic energy gradient is O(N²), where N is the number of neurons in the network.
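
The following is a minimal sketch of the general adjoint idea for a network whose steady state satisfies a fixed-point condition: a single linear solve with the transposed Jacobian yields the energy gradient with respect to all N² weights, in place of one forward sensitivity solve per parameter. The tanh network, the quadratic energy, and all names here (f, W, grad_W) are illustrative assumptions, not the paper's own formulation or code.

```python
import numpy as np

N = 5                                    # number of neurons
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((N, N))    # synaptic weights (the parameters)
b = rng.standard_normal(N)               # external inputs
target = rng.standard_normal(N)          # desired steady-state output

def f(u):
    return np.tanh(u)

# Steady state u* of the dynamics, i.e. u = f(W u + b),
# found here by simple fixed-point iteration.
u = np.zeros(N)
for _ in range(500):
    u = f(W @ u + b)

# Energy E = 0.5 * ||u - target||^2 and its gradient w.r.t. the state.
dE_du = u - target

# Jacobian of the residual r(u) = u - f(W u + b) with respect to u.
s = 1.0 - f(W @ u + b) ** 2              # derivative of tanh at the fixed point
J = np.eye(N) - s[:, None] * W

# Adjoint system: one N x N linear solve replaces N^2 forward
# sensitivity solves (one per weight W[i, j]).
lam = np.linalg.solve(J.T, dE_du)

# Since dr/dW[i, j] = -s[i] * u[j] * e_i, the full weight gradient
# follows from the single adjoint vector as an outer product:
# dE/dW[i, j] = lam[i] * s[i] * u[j].
grad_W = np.outer(lam * s, u)
```

Because every entry of grad_W is recovered from one adjoint solution, the per-iteration cost of the gradient drops by a factor on the order of the parameter count, consistent with the O(N²) speedup bound stated above.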