Lazy Learning Meets the Recursive Least Squares Algorithm

Part of Advances in Neural Information Processing Systems 11 (NIPS 1998)


Authors

Mauro Birattari, Gianluca Bontempi, Hugues Bersini

Abstract

Lazy learning is a memory-based technique that, once a query is received, extracts a prediction by locally interpolating the neighboring examples of the query, which are considered relevant according to a distance measure. In this paper we propose a data-driven method to select, on a query-by-query basis, the optimal number of neighbors to be considered for each prediction. As an efficient way to identify and validate local models, the recursive least squares algorithm is introduced in the context of local approximation and lazy learning. Furthermore, besides the winner-takes-all strategy for model selection, a local combination of the most promising models is explored. The proposed method is tested on six different datasets and compared with a state-of-the-art approach.
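To make the role of recursive least squares concrete, the following is a minimal sketch (not the authors' implementation) of a standard RLS update: a local linear model is grown one neighbor at a time, and each incoming example updates the parameter vector and the inverse-covariance matrix in O(d^2) via the Sherman-Morrison identity, so models with k and k+1 neighbors can be compared cheaply. The function name and the initialization constant are illustrative assumptions.

```python
import numpy as np

def rls_update(beta, P, x, y):
    """One recursive least squares step incorporating example (x, y).

    beta : current parameter estimate (d,)
    P    : current inverse covariance matrix, approximately (X'X)^-1 (d, d)
    Returns the updated (beta, P) and the a priori residual e, i.e. the
    error of the model *before* seeing (x, y).
    """
    Px = P @ x
    k = Px / (1.0 + x @ Px)   # gain vector (Sherman-Morrison rank-1 update)
    e = y - x @ beta          # a priori prediction error on the new example
    beta = beta + k * e
    P = P - np.outer(k, Px)   # rank-1 downdate of the inverse covariance
    return beta, P, e

# Illustrative use: fit a linear model by feeding examples one at a time,
# as one would feed the neighbors of a query sorted by distance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_beta = np.array([1.0, -2.0, 0.5])
y = X @ true_beta

beta = np.zeros(3)
P = np.eye(3) * 1e6  # large initial P: weak prior on the parameters (assumed)
for xi, yi in zip(X, y):
    beta, P, e = rls_update(beta, P, xi, yi)
```

The a priori residual `e` is the quantity that makes this formulation attractive for model selection: it measures how well the model built on the first k neighbors predicts the (k+1)-th, which is the kind of validation signal used to pick the number of neighbors per query.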