Fast Transformation-Invariant Factor Analysis

Part of Advances in Neural Information Processing Systems 15 (NIPS 2002)


Authors

Anitha Kannan, Nebojsa Jojic, Brendan Frey

Abstract

Dimensionality reduction techniques such as principal component analysis and factor analysis are used to discover a linear mapping between high dimensional data samples and points in a lower dimensional subspace. In [6], Jojic and Frey introduced the mixture of transformation-invariant component analyzers (MTCA), which can account for global transformations such as translations and rotations, perform clustering, and learn local appearance deformations by dimensionality reduction. However, due to the enormous computational requirements of the EM algorithm for learning the model, O(N^4) where N is the dimensionality of a data sample, MTCA was not practical for most applications. In this paper, we demonstrate how fast Fourier transforms can reduce the computation to the order of N^2 log N. With this speedup, we show the effectiveness of MTCA in various applications: tracking, video textures, clustering video sequences, object recognition, and object detection in images.
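The speedup rests on the observation that the per-transformation sums needed in the E-step over all circular shifts form a cross-correlation, which an FFT evaluates in O(N log N) instead of O(N^2). The sketch below is only an illustration of that trick, not the paper's implementation: the function name, the 1-D setting, and the diagonal-noise term inv_psi are assumptions made for the example; images would use 2-D FFTs over 2-D translations in the same way.

```python
import numpy as np

def shift_correlation_terms(x, mu, inv_psi):
    """Compute sum_i (x[i] * inv_psi[i]) * mu[(i - l) mod N] for every
    circular shift l of the template mu, using one FFT-based
    cross-correlation (O(N log N)) instead of N separate dot products
    (O(N^2)). This is the kind of per-shift data term that appears in
    a diagonal-Gaussian log-likelihood evaluated under all translations."""
    weighted_x = x * inv_psi  # fold the diagonal noise precision into the data
    # Circular cross-correlation via the convolution theorem.
    return np.real(np.fft.ifft(np.fft.fft(weighted_x) * np.conj(np.fft.fft(mu))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 64
    x, mu = rng.normal(size=N), rng.normal(size=N)
    inv_psi = rng.uniform(0.5, 2.0, size=N)  # illustrative diagonal precision
    fast = shift_correlation_terms(x, mu, inv_psi)
    # Naive O(N^2) reference: one dot product per shift.
    naive = np.array([np.dot(x * inv_psi, np.roll(mu, l)) for l in range(N)])
    assert np.allclose(fast, naive)
```

The same pattern handles every shift-indexed quadratic term in the E-step; terms that do not depend on the data can be precomputed once per EM iteration.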