Tractable Variational Structures for Approximating Graphical Models

Part of Advances in Neural Information Processing Systems 11 (NIPS 1998)


Authors

David Barber, Wim Wiegerinck

Abstract

Graphical models provide a broad probabilistic framework with applications in speech recognition (Hidden Markov Models), medical diagnosis (Belief networks) and artificial intelligence (Boltzmann Machines). However, the computing time is typically exponential in the number of nodes in the graph. Within the variational framework for approximating these models, we present two classes of distributions, decimatable Boltzmann Machines and Tractable Belief Networks, that go beyond the standard factorized approach. We give generalised mean-field equations for both these directed and undirected approximations. Simulation results on a small benchmark problem suggest that using these richer approximations compares favorably against others previously reported in the literature.
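
For context, the standard factorized (mean-field) approximation that these richer structures go beyond fits a fully factorised distribution to a Boltzmann Machine by iterating a sigmoid fixed-point equation for the unit means. The sketch below is an illustrative baseline only, assuming {0,1} units and a symmetric weight matrix with zero diagonal; the function name, weights and biases are hypothetical and not taken from the paper.

```python
import numpy as np

def mean_field_boltzmann(W, b, n_iters=200, tol=1e-8):
    """Fully factorised mean-field approximation for a Boltzmann Machine
    p(s) proportional to exp(0.5 * s^T W s + b^T s), with s in {0,1}^n.

    Illustrative baseline only -- not the decimatable or tractable
    structured approximations proposed in the paper.
    W is assumed symmetric with zero diagonal.
    """
    n = len(b)
    m = np.full(n, 0.5)  # initial means q(s_i = 1)
    for _ in range(n_iters):
        # Mean-field fixed-point update: m_i = sigmoid(sum_j W_ij m_j + b_i)
        m_new = 1.0 / (1.0 + np.exp(-(W @ m + b)))
        if np.max(np.abs(m_new - m)) < tol:
            return m_new
        m = m_new
    return m  # approximate marginals q(s_i = 1)

# Example on a small 3-node Boltzmann Machine with hypothetical parameters.
W = np.array([[ 0.0, 1.0, -0.5],
              [ 1.0, 0.0,  0.3],
              [-0.5, 0.3,  0.0]])
b = np.array([0.1, -0.2, 0.0])
print(mean_field_boltzmann(W, b))
```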