Provable Subspace Clustering: When LRR meets SSC

Part of Advances in Neural Information Processing Systems 26 (NIPS 2013)


Authors

Yu-Xiang Wang, Huan Xu, Chenlei Leng

Abstract

Sparse Subspace Clustering (SSC) and Low-Rank Representation (LRR) are both considered state-of-the-art methods for subspace clustering. The two methods are fundamentally similar in that both are convex optimizations exploiting the intuition of "Self-Expressiveness". The main difference is that SSC minimizes the vector $\ell_1$ norm of the representation matrix to induce sparsity, while LRR minimizes the nuclear norm (aka trace norm) to promote a low-rank structure. Because the representation matrix is often simultaneously sparse and low-rank, we propose a new algorithm, termed Low-Rank Sparse Subspace Clustering (LRSSC), which combines SSC and LRR, and we develop theoretical guarantees for when the algorithm succeeds. The results reveal interesting insights into the strengths and weaknesses of SSC and LRR, and demonstrate how LRSSC can take advantage of both methods to preserve the "Self-Expressiveness Property" and "Graph Connectivity" at the same time.
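For concreteness, the combination described above corresponds to a convex program of roughly the following form (a sketch inferred from the abstract, with $X$ the data matrix whose columns lie near a union of subspaces and $C$ the self-expressive representation matrix; the trade-off parameter $\lambda$ and the exact constraints are assumptions, not quoted from the abstract):

$$\min_{C} \; \|C\|_1 + \lambda \|C\|_* \quad \text{subject to} \quad X = XC, \;\; \operatorname{diag}(C) = 0$$

Here the $\ell_1$ term is the SSC (sparsity-inducing) component, the nuclear-norm term is the LRR (low-rank) component, and the diagonal constraint rules out the trivial solution of each column representing itself.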