Propagation Algorithms for Variational Bayesian Learning

Part of Advances in Neural Information Processing Systems 13 (NIPS 2000)


Authors

Zoubin Ghahramani, Matthew Beal

Abstract

Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models we obtain a learning procedure that exploits the Kalman smoothing propagation, while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.
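The computational idea the abstract describes is a coordinate ascent between two factors: the parameter posterior q(theta) is updated using expected sufficient statistics of the hidden states, and the hidden-state posterior q(x) is updated by running standard propagation (Kalman smoothing, in the state-space case) with expectations of the parameters' natural parameters under q(theta), rather than with point estimates. The following is a minimal, hypothetical sketch of that alternation for a toy one-dimensional state-space model with a single unknown transition coefficient a; the 1-D setup, the known noise variances q and r, the Gaussian prior on a, and the dense tridiagonal solve standing in for the paper's O(T) Kalman smoothing recursions are all simplifying assumptions of this sketch, not the paper's construction.

```python
# Sketch of mean-field variational Bayes for a toy 1-D linear-Gaussian
# state-space model (an illustrative simplification, not the paper's setup):
#   x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q);   y_t = x_t + v_t,  v_t ~ N(0, r)
# with a Gaussian prior on the unknown transition coefficient a.
import numpy as np

rng = np.random.default_rng(0)

# --- synthetic data ---
T, a_true, q, r = 200, 0.9, 0.1, 0.5
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.normal(scale=np.sqrt(q))
y = x + rng.normal(scale=np.sqrt(r), size=T)

alpha = 1.0          # precision of the Gaussian prior on a
p0 = 1.0             # variance of the prior on the initial state x_0
Ea, Ea2 = 0.0, 1.0   # moments of the initial q(a): E[a], E[a^2]

for _ in range(50):
    # --- VB-E step: Gaussian posterior q(x) given the moments of q(a). ---
    # The transition term enters through E[a^2], not E[a]^2: propagation
    # runs with *expected natural parameters*. For brevity we build the
    # tridiagonal posterior precision explicitly instead of running the
    # equivalent Kalman smoothing recursions.
    J = np.zeros((T, T))                   # posterior precision of q(x)
    h = y / r                              # linear (information) term
    J[np.diag_indices(T)] += 1.0 / r       # emission terms
    J[0, 0] += 1.0 / p0                    # prior on the initial state
    for t in range(1, T):
        J[t - 1, t - 1] += Ea2 / q
        J[t, t] += 1.0 / q
        J[t - 1, t] -= Ea / q
        J[t, t - 1] -= Ea / q
    S = np.linalg.inv(J)                   # posterior covariance
    m = S @ h                              # posterior mean

    # Expected sufficient statistics of the hidden state sequence.
    sum_xx_prev = np.sum(S.diagonal()[:-1] + m[:-1] ** 2)          # sum_t E[x_{t-1}^2]
    sum_cross = np.sum(np.diagonal(S, offset=1) + m[1:] * m[:-1])  # sum_t E[x_t x_{t-1}]

    # --- VB-M step: conjugate Gaussian update of q(a). ---
    prec_a = alpha + sum_xx_prev / q
    Ea = (sum_cross / q) / prec_a
    Ea2 = Ea ** 2 + 1.0 / prec_a

print(f"true a = {a_true:.2f},  E[a] = {Ea:.3f} +/- {np.sqrt(Ea2 - Ea ** 2):.3f}")
```

The appearance of E[a^2] (rather than E[a]^2) in the state posterior is where integrating over the parameters differs from plugging in a point estimate: the extra variance of q(a) damps the inferred states. The same tridiagonal (chain) structure is what lets the full method compute these marginals and cross-moments in O(T) via Kalman smoothing.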