From Margin to Sparsity

Part of Advances in Neural Information Processing Systems 13 (NIPS 2000)


Authors

Thore Graepel, Ralf Herbrich, Robert C. Williamson

Abstract

We present an improvement of Novikoff's perceptron convergence theorem. Reinterpreting this mistake bound as a margin-dependent sparsity guarantee allows us to give a PAC-style generalisation error bound for the classifier learned by the perceptron learning algorithm. The bound value crucially depends on the margin a support vector machine would achieve on the same data set using the same kernel. Ironically, the bound yields better guarantees than are currently available for the support vector solution itself.
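For context, the classical result being improved is Novikoff's mistake bound, sketched here under standard assumptions: the symbols m (the number of perceptron updates), R (a bound on the norm of the feature-mapped training points) and \gamma (the margin with which the data can be separated) are our notational choices and do not appear in the abstract itself.

\[
m \;\le\; \left(\frac{R}{\gamma}\right)^{2}
\]

Since each update of the kernel perceptron adds one training example to the expansion defining the final classifier, a bound on m is simultaneously a bound on the number of support vectors in that expansion; this is the sparsity reading of the mistake bound that the abstract refers to.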