ε-Entropy and the Complexity of Feedforward Neural Networks

Part of Advances in Neural Information Processing Systems 3 (NIPS 1990)


Authors

Robert C. Williamson

Abstract

We develop a new feedforward neural network representation of Lipschitz functions from $[0,p]^n$ into $[0,1]$ based on the level sets of the function. We show that

$$\frac{2}{\varepsilon_r} + \frac{2L}{\varepsilon_r} + \left(1 + \frac{2L}{\varepsilon_r}\right)\left(\frac{pL}{\varepsilon_r}\right)^n$$

is an upper bound on the number of nodes needed to represent $f$ to within uniform error $\varepsilon_r$, where $L$ is the Lipschitz constant. We also show that the number of bits needed to represent the weights in the network in order to achieve this approximation is given by

$$O\!\left(\frac{L^2 p}{\varepsilon_r}\left(\frac{pL}{\varepsilon_r}\right)^n\right).$$

We compare this bound with the ε-entropy of the functional class under consideration.
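The counting behind these bounds can be made concrete with a short numerical sketch. The code below is illustrative only, not the paper's construction: it quantizes a sample Lipschitz function into level sets of width ε_r (the mechanism the representation is built on), checks that the uniform error stays within ε_r on a grid of side ε_r/L, and evaluates the two bounds as reconstructed above from the scan. The function `f`, the parameter values, and the unit constant hidden in the O(·) are all assumptions made for this demo.

```python
import itertools
import math

def node_bound(L, p, eps_r, n):
    """Node-count upper bound as reconstructed from the garbled scan
    (constants are a best-effort reading, not verified against the PDF)."""
    return 2 / eps_r + 2 * L / eps_r + (1 + 2 * L / eps_r) * (p * L / eps_r) ** n

def bit_bound(L, p, eps_r, n):
    """Weight-bit bound O(L^2 p / eps_r * (pL/eps_r)^n), with the hidden
    constant taken as 1 purely for illustration."""
    return (L ** 2 * p / eps_r) * (p * L / eps_r) ** n

def level_set_quantize(f, x, eps_r):
    """Snap f(x) to the highest multiple of eps_r below it, i.e.
    eps_r * sum_k 1[f(x) >= k * eps_r]; the uniform error is below eps_r."""
    return eps_r * math.floor(f(x) / eps_r)

# Demo: a Lipschitz function on [0, p]^2 into [0, 1]; L = 1 is a valid
# Lipschitz constant since |grad f| <= 0.25 * sqrt(2) here.
p, n, L, eps_r = 1.0, 2, 1.0, 0.1
f = lambda x: 0.25 * (math.sin(x[0]) + math.cos(x[1])) + 0.5

# Grid of side eps_r / L -- its cell count, (pL/eps_r)^n, is the term
# that dominates both bounds.
cells_per_axis = math.ceil(p * L / eps_r)
worst = max(
    abs(f(x) - level_set_quantize(f, x, eps_r))
    for x in itertools.product(
        [i * eps_r / L for i in range(cells_per_axis + 1)], repeat=n)
)
print(f"worst quantization error: {worst:.3f} (within eps_r = {eps_r})")
print(f"node bound for n={n}: {node_bound(L, p, eps_r, n):.0f}")
print(f"bit bound for n={n}: {bit_bound(L, p, eps_r, n):.0f}")
print(f"grid cells (pL/eps_r)^n: {cells_per_axis ** n}")
```

Increasing `n` in this sketch shows the exponential growth shared by both bounds and by the ε-entropy of the Lipschitz class, which is the comparison the abstract refers to.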