Adaptive Mixture of Probabilistic Transducers

Part of Advances in Neural Information Processing Systems 8 (NIPS 1995)

Yoram Singer


We introduce and analyze a mixture model for supervised learning of probabilistic transducers. We devise an online learning algorithm that efficiently infers the structure and estimates the parameters of each model in the mixture. Theoretical analysis and comparative simulations indicate that the learning algorithm tracks the best model from an arbitrarily large (possibly infinite) pool of models. We also present an application of the model for inducing a noun phrase recognizer.
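The tracking guarantee alluded to above is the hallmark of Bayesian mixture (multiplicative weight) updates: the mixture's cumulative log-loss stays within log N of the best of N models. The sketch below is illustrative only, not the paper's exact transducer algorithm; the toy pool of fixed binary predictors and all names are assumptions made for the example.

```python
import math

def mixture_step(weights, probs_of_one, symbol):
    """One online step of a Bayesian mixture over fixed binary predictors.

    weights      -- current posterior weights over the models
    probs_of_one -- each model's predicted probability that the next symbol is 1
    symbol       -- the observed symbol (0 or 1)

    Returns (mixture probability assigned to the observed symbol,
             updated, renormalized weights).
    """
    # Per-model likelihood of the observed symbol.
    likelihoods = [p if symbol == 1 else 1.0 - p for p in probs_of_one]
    # Mixture prediction for the observed symbol.
    mix = sum(w * l for w, l in zip(weights, likelihoods))
    # Bayes rule: reweight each model by its likelihood, renormalize.
    new_weights = [w * l / mix for w, l in zip(weights, likelihoods)]
    return mix, new_weights

# Toy pool (an assumption for illustration): one model that believes 1s
# occur 90% of the time, one that believes 50%. The stream is 90% ones.
pool = [0.9, 0.5]
weights = [0.5, 0.5]            # uniform prior over the pool
stream = ([1] * 9 + [0]) * 20   # 200 symbols, 90% ones

mix_loss = 0.0
for s in stream:
    prob, weights = mixture_step(weights, pool, s)
    mix_loss += -math.log(prob)

# Cumulative log-loss of the single best model in the pool.
best_loss = min(
    sum(-math.log(p if s == 1 else 1.0 - p) for s in stream)
    for p in pool
)
```

After the run, the posterior mass concentrates on the better model, and the mixture's total loss exceeds the best model's loss by at most log 2 (the log of the pool size), which is the "tracking" property the abstract refers to.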