The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees

Part of Advances in Neural Information Processing Systems 17 (NIPS 2004)

Authors

Ofer Dekel, Shai Shalev-Shwartz, Yoram Singer

Abstract

Prediction suffix trees (PSTs) provide a popular and effective tool for tasks such as compression, classification, and language modeling. In this paper we take a decision-theoretic view of PSTs for the task of sequence prediction. Generalizing the notion of margin to PSTs, we present an online PST learning algorithm and derive a loss bound for it. The depth of the PST generated by this algorithm scales linearly with the length of the input. We then describe a self-bounded enhancement of our learning algorithm which automatically grows a bounded-depth PST. We also prove an analogous mistake bound for the self-bounded algorithm. The result is an efficient algorithm that neither relies on a priori assumptions about the shape or maximal depth of the target PST nor requires any parameters. To our knowledge, this is the first provably correct PST learning algorithm that generates a bounded-depth PST while remaining competitive with any fixed PST determined in hindsight.
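
As a rough illustration of the structure the abstract refers to, the following Python sketch implements a toy online PST learner for binary sequences: it stores one weight per suffix, predicts the next symbol from the sign of the summed weights along the context's suffix path, and grows the tree one node deeper after each mistake. The perceptron-style update, the unbounded growth, and the 2^(-depth/2) scaling are illustrative simplifications, not the margin-based updates or the self-bounding growth criterion derived in the paper.

    # Toy online PST learner for binary sequences (symbols in {-1, +1}).
    # Illustrative simplification of the ideas in the abstract, not the
    # paper's algorithm: perceptron-style updates, unbounded tree growth.
    class PSTLearner:
        def __init__(self):
            # Map each suffix (tuple, most recent symbol first) to a
            # weight; the empty tuple is the root of the tree.
            self.weights = {(): 0.0}

        def predict_score(self, context):
            # Sum the weights along the portion of the context's suffix
            # path stored in the tree; the sign is the prediction.
            score, suffix = self.weights[()], ()
            for sym in reversed(context):
                suffix += (sym,)
                if suffix not in self.weights:
                    break
                score += self.weights[suffix]
            return score

        def update(self, context, y):
            # Update every stored suffix of the context, then grow the
            # tree one node deeper along this context.
            path, suffix = [()], ()
            for sym in reversed(context):
                suffix += (sym,)
                if suffix not in self.weights:
                    self.weights[suffix] = 0.0  # grow one new node
                    path.append(suffix)
                    break
                path.append(suffix)
            for s in path:
                # Discount deeper suffixes; the 2**(-len(s)/2) factor is
                # an illustrative choice echoing a depth-weighted norm.
                self.weights[s] += y * 2.0 ** (-len(s) / 2)

    # Usage: a depth-2 PST suffices for this periodic toy sequence.
    seq = [1, 1, -1] * 50
    learner, mistakes = PSTLearner(), 0
    for t in range(1, len(seq)):
        context, y = seq[:t], seq[t]
        if learner.predict_score(context) * y <= 0:  # mistake-driven
            mistakes += 1
            learner.update(context, y)
    print("mistakes:", mistakes, "tree size:", len(learner.weights))

Note that this sketch grows one node per mistake along the observed context, so the tree depth can in principle keep increasing; the self-bounded algorithm described in the paper is precisely what replaces this unchecked growth with a provably bounded-depth tree.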