Part of Advances in Neural Information Processing Systems 20 (NIPS 2007)
Max Welling, Ian Porteous, Evgeniy Bart
A general modeling framework is proposed that unifies nonparametric Bayesian models, topic models, and Bayesian networks. This class of infinite-state Bayes nets (ISBN) can be viewed as directed networks of 'hierarchical Dirichlet processes' (HDPs) where the domain of the variables can be structured (e.g. words in documents or features in images). We show that collapsed Gibbs sampling can be done efficiently in these models by leveraging the structure of the Bayes net and using the forward-filtering-backward-sampling algorithm for junction trees. Existing models, such as the nested DP, Pachinko allocation, and mixed-membership stochastic block models, as well as a number of new models, are described as ISBNs. Two experiments are presented to illustrate these ideas.
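The abstract's key computational ingredient is forward-filtering-backward-sampling (FFBS). As a minimal illustrative sketch (not the paper's collapsed sampler for ISBNs), the snippet below runs FFBS on a discrete-state Markov chain, the simplest junction tree: a forward pass computes the filtered distributions, and a backward pass draws one joint sample of the hidden states. All names (`pi0`, `A`, `B`, `obs`) are assumed for illustration.

```python
import numpy as np

def ffbs(pi0, A, B, obs, rng=None):
    """Forward-filtering-backward-sampling on a discrete-state chain.

    pi0 : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = p(z_{t+1}=j | z_t=i)
    B   : (K, V) emission matrix,   B[k, v] = p(x_t=v | z_t=k)
    obs : length-T sequence of observed symbol indices
    Returns one joint sample of the hidden states z_0 .. z_{T-1}.
    """
    rng = rng or np.random.default_rng()
    T, K = len(obs), len(pi0)

    # Forward pass: alpha[t] is proportional to p(z_t | x_0..x_t).
    alpha = np.empty((T, K))
    alpha[0] = pi0 * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()

    # Backward pass: sample z_{T-1} from the final filter, then
    # sample z_t | z_{t+1} with weights alpha[t] * A[:, z_{t+1}].
    z = np.empty(T, dtype=int)
    z[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * A[:, z[t + 1]]
        z[t] = rng.choice(K, p=w / w.sum())
    return z
```

Because the backward pass conditions each state on the sampled successor, the returned trajectory is an exact draw from the joint posterior over the hidden chain, which is what makes FFBS attractive as a block-sampling move inside a larger Gibbs sampler such as the one described in the abstract.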