Metamorphosis Networks: An Alternative to Constructive Models

Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)


Authors

Brian Bonnlander, Michael C. Mozer

Abstract

Given a set of training examples, determining the appropriate number of free parameters is a challenging problem. Constructive learning algorithms attempt to solve this problem automatically by adding hidden units, and therefore free parameters, during learning. We explore an alternative class of algorithms, called metamorphosis algorithms, in which the number of units is fixed, but the number of free parameters gradually increases during learning. The architecture we investigate is composed of RBF units on a lattice, which imposes flexible constraints on the parameters of the network. Virtues of this approach include variable subset selection, robust parameter selection, multiresolution processing, and interpolation of sparse training data.
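To make the architecture described above concrete, the following is a minimal sketch (not the authors' code) of Gaussian RBF units whose centers sit on a fixed one-dimensional lattice. The unit count is fixed; under a metamorphosis-style scheme, initially tied parameters such as the shared width would gradually be untied during learning, increasing the number of free parameters. All function and variable names here are illustrative assumptions.

import numpy as np

def rbf_lattice_output(x, centers, widths, weights):
    """Evaluate a lattice-of-RBFs network at scalar inputs x."""
    # Squared distances from each input to each lattice center,
    # shape (len(x), len(centers))
    d2 = (x[:, None] - centers[None, :]) ** 2
    phi = np.exp(-d2 / (2.0 * widths ** 2))  # Gaussian activations
    return phi @ weights                     # weighted sum of RBF responses

# Fixed lattice of 10 centers on [0, 1]; all widths start tied to one value.
centers = np.linspace(0.0, 1.0, 10)
widths = np.full(10, 0.15)   # freeing these per unit would add free parameters
weights = np.random.randn(10) * 0.1

x = np.linspace(0.0, 1.0, 5)
print(rbf_lattice_output(x, centers, widths, weights))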