Learning with Noise and Regularizers in Multilayer Neural Networks

Part of Advances in Neural Information Processing Systems 9 (NIPS 1996)


Authors

David Saad, Sara Solla

Sara A. Solla
AT&T Research Labs, Holmdel, NJ 07733, USA
solla@research.att.com

Abstract

We study the effect of noise and regularization in an on-line gradient-descent learning scenario for a general two-layer student network with an arbitrary number of hidden units. Training examples are randomly drawn input vectors labeled by a two-layer teacher network with an arbitrary number of hidden units; the examples are corrupted by Gaussian noise affecting either the output or the model itself. We examine the effect of both types of noise and that of weight-decay regularization on the dynamical evolution of the order parameters and the generalization error in various phases of the learning process.
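The abstract's setting can be sketched numerically: a two-layer "student" network trained by on-line gradient descent on examples labeled by a fixed two-layer "teacher", with additive Gaussian output noise and weight decay. The sketch below is an illustrative toy, not the paper's analysis; the specific choices (tanh hidden units, the learning rate, noise level, and decay strength, and the 1/sqrt(N) field scaling) are assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # input dimension
M, K = 2, 3    # teacher / student hidden units (arbitrary in the paper)
eta = 0.1      # learning rate (assumed value)
lam = 1e-4     # weight-decay strength (assumed value)
sigma = 0.05   # std of the Gaussian noise corrupting the teacher's output

g = np.tanh    # hidden-unit activation (a stand-in choice for this sketch)

B = rng.standard_normal((M, N))          # fixed teacher weights
J = 0.01 * rng.standard_normal((K, N))   # student weights, small random init


def output(W, x):
    """Two-layer soft-committee output: sum of hidden-unit activations."""
    return g(W @ x / np.sqrt(N)).sum()


def gen_error(J, n_samples=2000):
    """Monte-Carlo estimate of the generalization error on clean examples."""
    xs = rng.standard_normal((n_samples, N))
    s = g(xs @ J.T / np.sqrt(N)).sum(axis=1)
    t = g(xs @ B.T / np.sqrt(N)).sum(axis=1)
    return 0.5 * np.mean((s - t) ** 2)


eg_init = gen_error(J)

for _ in range(20000):
    x = rng.standard_normal(N)                       # fresh random input
    y = output(B, x) + sigma * rng.standard_normal() # noisy teacher label
    h = J @ x / np.sqrt(N)                           # student hidden fields
    err = output(J, x) - y
    # Gradient of the squared error w.r.t. J (chain rule through tanh),
    # plus a weight-decay term acting as the regularizer.
    grad = np.outer(err * (1.0 - g(h) ** 2), x / np.sqrt(N))
    J -= eta * (grad + lam * J)

eg_final = gen_error(J)
```

Tracking `eg_final` against `eg_init` shows the generalization error decreasing as the student's weights align with the teacher's; the paper studies this dynamical evolution analytically through order parameters rather than by simulation.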