Part of Advances in Neural Information Processing Systems 6 (NIPS 1993)
Laurens Leerink, Marwan Jabri
We present an algorithm for the training of feedforward and recurrent neural networks. It detects internal representation conflicts and uses them constructively to add new neurons to the network. The advantages are twofold: (1) starting with a small network, neurons are allocated only when required; (2) by detecting and resolving internal conflicts at an early stage, learning time is reduced. Empirical results on two real-world problems substantiate the faster learning speed; when applied to the training of a recurrent network on a well-researched sequence recognition task (the Reber grammar), training times are significantly shorter than previously reported.
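The abstract's core idea, detecting representation conflicts and resolving them by allocating neurons, can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes one illustrative definition of a "conflict" (two training patterns whose hidden-layer representations are nearly identical while their targets differ) and a hypothetical allocation rule for the new neuron's weights. All names and thresholds below are assumptions for illustration.

```python
import numpy as np

def hidden_activations(X, W):
    """Logistic hidden layer: one column of W per hidden neuron."""
    return 1.0 / (1.0 + np.exp(-X @ W))

def find_conflict(H, y, rep_tol=0.05):
    """Return indices (i, j) of a conflicting pattern pair, or None.

    A conflict (illustrative definition): hidden representations agree
    to within rep_tol on every unit, yet the targets differ, so no
    output layer could separate the two patterns.
    """
    n = len(y)
    for i in range(n):
        for j in range(i + 1, n):
            close = np.max(np.abs(H[i] - H[j])) < rep_tol
            if close and y[i] != y[j]:
                return i, j
    return None

def grow_network(X, y, W, max_new=5, rng=None):
    """Add hidden neurons until no representation conflicts remain."""
    rng = np.random.default_rng(0) if rng is None else rng
    for _ in range(max_new):
        H = hidden_activations(X, W)
        pair = find_conflict(H, y)
        if pair is None:
            break  # all differing-target pairs are now separable
        i, j = pair
        # Hypothetical allocation rule: point the new neuron's weights
        # along the difference of the two conflicting inputs, so it
        # responds differently to them; small noise breaks symmetry.
        w_new = (X[i] - X[j]) * 10.0 + rng.normal(scale=0.01, size=X.shape[1])
        W = np.column_stack([W, w_new])
    return W
```

For example, starting XOR (with a bias input) from a single zero-weight hidden unit, every pattern initially shares the same representation; the loop allocates units until `find_conflict` returns `None`, after which ordinary gradient training of the enlarged network could proceed.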