Part of Advances in Neural Information Processing Systems 3 (NIPS 1990)
Patrice Simard, Jean Pierre Raysz, Bernard Victorri
Bernard Victorri, ELSAP, Université de Caen, 14032 Caen Cedex, France
Fully recurrent (asymmetrical) networks can be viewed as dynamical systems whose dynamics can be shaped to implement content-addressable memories, recognize sequences, or generate trajectories. Unfortunately, several problems can arise. First, convergence in the state space is not guaranteed. Second, the learned fixed points or trajectories are not necessarily stable. Finally, there may exist spurious fixed points and/or spurious "attracting" trajectories that do not correspond to any stored pattern. In this paper, we introduce a new energy function that addresses all of these problems. We present an efficient gradient descent algorithm that acts directly on the stability of the fixed points and trajectories and on the size and shape of the corresponding basins and valleys of attraction. The results are illustrated by the simulation of a small content addressable memory.
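To make the idea of shaping stability concrete, here is a minimal sketch, not the authors' algorithm: it assumes a discrete-time recurrent network x(t+1) = tanh(W x(t) + b), treats a few stored patterns as desired fixed points, and encourages their stability by penalizing the spectral norm of the Jacobian at each pattern (a norm below 1 makes the map locally contracting there, so the pattern becomes an attractor). The network size, patterns, penalty weight, threshold 0.9, and the finite-difference gradient are all illustrative choices; an analytic gradient would be used in practice.

```python
import numpy as np

# Hypothetical illustration of stability-shaped training for a tiny
# content-addressable memory; not the paper's exact energy function.
rng = np.random.default_rng(0)
n = 8                                              # number of units
patterns = rng.choice([-0.8, 0.8], size=(3, n))    # patterns to store

def step(W, b, x):
    """One update of the discrete-time recurrent network."""
    return np.tanh(W @ x + b)

def jacobian(W, b, x):
    """Jacobian of step() w.r.t. x: diag(1 - tanh(Wx+b)^2) @ W."""
    h = np.tanh(W @ x + b)
    return np.diag(1.0 - h ** 2) @ W

def energy(theta):
    """Fixed-point error plus a penalty on Jacobian spectral norms > 0.9."""
    W = theta[: n * n].reshape(n, n)
    b = theta[n * n:]
    e = 0.0
    for p in patterns:
        e += np.sum((step(W, b, p) - p) ** 2)       # make p a fixed point
        s = np.linalg.norm(jacobian(W, b, p), 2)    # largest singular value
        e += 0.1 * max(0.0, s - 0.9) ** 2           # make p locally stable
    return e

theta = 0.1 * rng.standard_normal(n * n + n)
eps, lr = 1e-5, 0.2
for _ in range(1500):
    g = np.zeros_like(theta)
    for i in range(theta.size):                     # finite-difference gradient
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (energy(theta + d) - energy(theta - d)) / (2 * eps)
    theta -= lr * g

# Recall: iterate from a noisy cue and check convergence onto the pattern.
W = theta[: n * n].reshape(n, n)
b = theta[n * n:]
x = patterns[0] + 0.2 * rng.standard_normal(n)
for _ in range(50):
    x = step(W, b, x)
print("recalled pattern 0:", np.allclose(np.sign(x), np.sign(patterns[0])))
```

The penalty term is what distinguishes this from ordinary fixed-point training: without it, a pattern can satisfy the fixed-point equation yet repel nearby states, which is exactly the instability problem the abstract describes.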