Learning Sequential Tasks by Incrementally Adding Higher Orders

Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)


Authors

Mark Ring

Abstract

An incremental, higher-order, non-recurrent network combines two properties found to be useful for learning sequential tasks: higher-order connections and incremental introduction of new units. The network adds higher orders when needed by adding new units that dynamically modify connection weights. Since the new units modify the weights at the next time-step with information from the previous step, temporal tasks can be learned without the use of feedback, thereby greatly simplifying training. Furthermore, a theoretically unlimited number of units can be added to reach into the arbitrarily distant past. Experiments with the Reber grammar have demonstrated speedups of two orders of magnitude over recurrent networks.
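As a rough illustration of the mechanism the abstract describes, the Python sketch below shows a feed-forward layer whose individual connection weights are perturbed at time t by higher-order units whose activations were computed from the input at time t-1. This is a minimal sketch, not the paper's implementation: the class names, the additive form of the weight modulation, and the sigmoid output are all assumptions made for illustration.

```python
import numpy as np

class HigherOrderUnit:
    """Modulates one connection's weight with a one-step delay.

    Assumed simplification: the unit's activation computed from the
    input at time t-1 is added to its target weight at time t.
    """
    def __init__(self, n_inputs, rng):
        self.w = rng.normal(scale=0.1, size=n_inputs)
        self.prev_act = 0.0  # activation carried over from the previous step

    def step(self, x):
        act = self.prev_act                 # value applied at this step
        self.prev_act = float(self.w @ x)   # computed now, used next step
        return act

class IncrementalHigherOrderNet:
    """Non-recurrent layer whose weights are modified by higher-order units."""
    def __init__(self, n_in, n_out, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = self.rng.normal(scale=0.1, size=(n_out, n_in))
        self.mods = {}  # (output i, input j) -> HigherOrderUnit

    def add_unit(self, i, j):
        """Incrementally add a higher order for connection (i, j)."""
        self.mods[(i, j)] = HigherOrderUnit(self.W.shape[1], self.rng)

    def forward(self, x):
        W_eff = self.W.copy()
        for (i, j), unit in self.mods.items():
            W_eff[i, j] += unit.step(x)     # one-step-delayed modulation
        return 1.0 / (1.0 + np.exp(-(W_eff @ x)))  # sigmoid outputs

# Hypothetical usage on a stream of one-hot symbols (e.g. a Reber string):
net = IncrementalHigherOrderNet(n_in=7, n_out=7)
net.add_unit(2, 3)  # add an order where the task demands more context
for t in range(5):
    symbol = np.eye(7)[t % 7]
    y = net.forward(symbol)
```

Because each unit's influence is delayed by exactly one step, temporal context propagates forward without any recurrent feedback during training; in the same spirit, units could themselves be modulated by further units, which is how an unbounded stack of orders can reach arbitrarily far into the past.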