Recurrent Neural Networks Can Learn to Implement Symbol-Sensitive Counting

Part of Advances in Neural Information Processing Systems 10 (NIPS 1997)


Authors

Paul Rodriguez, Janet Wiles

Abstract

Recently, researchers have derived formal complexity analyses of analog computation in the setting of discrete-time dynamical systems. As an empirical contrast, training recurrent neural networks (RNNs) produces self-organized systems that are realizations of analog mechanisms. Previous work showed that an RNN can learn to process a simple context-free language (CFL) by counting. Herein, we extend that work to show that an RNN can learn a harder CFL, a simple palindrome, by organizing its resources into a symbol-sensitive counting solution, and we provide a dynamical systems analysis which demonstrates how the network can not only count, but also copy and store counting information.
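As a loose illustration of the counting strategy described in the abstract, the following minimal Python sketch recognizes the simpler CFL a^n b^n by incrementing a counter on 'a' and decrementing on 'b'. This is an assumption-laden paraphrase of the mechanism, written as an explicit counter rather than as the continuous dynamics of the trained network analyzed in the paper; the acceptance rule and the function name are illustrative, not from the source.

```python
def accepts_anbn(string: str) -> bool:
    """Sketch of symbol-driven counting for a^n b^n (n >= 1).

    Illustrative only: a trained RNN realizes this counter implicitly
    in its state dynamics (e.g., expanding on 'a', contracting on 'b'),
    whereas here the count is an explicit integer.
    """
    count = 0
    seen_b = False
    for symbol in string:
        if symbol == 'a':
            if seen_b:           # an 'a' after a 'b' breaks the a^n b^n form
                return False
            count += 1           # "count up" on the first symbol type
        elif symbol == 'b':
            seen_b = True
            count -= 1           # "count down" on the second symbol type
            if count < 0:        # more b's than a's: reject early
                return False
        else:
            return False         # symbol outside the alphabet
    return seen_b and count == 0  # accept only if the count returns to zero


if __name__ == "__main__":
    for s in ["ab", "aabb", "aaabbb", "aab", "abab", "ba"]:
        print(f"{s!r}: {accepts_anbn(s)}")
```

The palindrome language studied in the paper is harder precisely because a single scalar count does not suffice: the network must keep symbol-sensitive counts and copy and store them, which is what the paper's dynamical systems analysis examines.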