NIPS 2016
Mon Dec 5th through Sun the 11th, 2016 at Centre Convencions Internacional Barcelona

Reviewer 1

Summary

The paper considers a stochastic version of CCA. CCA is a useful tool: given two snapshots of different sets of variables (coming, however, from the same set of subjects), it reveals the most important correlations between the two sets. State-of-the-art algorithms work mostly in an offline fashion: given all datapoints from each set of variables, the task is to maximize a bilinear form over quadratic constraints. While the problem has a "closed-form" solution (after matrix transformations such as matrix inversions), only recently have online/stochastic methods been proposed. This paper considers such a scenario for CCA and proposes two algorithms that solve the online CCA problem with global approximation guarantees. The authors claim that this is the first time such guarantees are provided in this setting. Experimental results show the superiority of the algorithms over the state of the art.
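For context, the offline problem this summary refers to is the standard top-1 CCA formulation (notation mine, not necessarily the paper's):

```latex
\max_{u,\,v}\; u^\top \Sigma_{xy}\, v
\quad \text{s.t.} \quad u^\top \Sigma_{xx}\, u = 1,
\qquad v^\top \Sigma_{yy}\, v = 1,
```

where $\Sigma_{xx}$ and $\Sigma_{yy}$ are the empirical covariance matrices of the two views and $\Sigma_{xy}$ is their cross-covariance. The quadratic constraints are what make the objective nonconvex and non-decoupled over samples.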

Confidence in this Review

3-Expert (read the paper in detail, know the area, quite certain of my opinion)

Reviewer 2

Summary

This is a well written paper on the numerical resolution of canonical correlation analysis. The author combines advanced linear algebra and stochastic gradient techniques with a strong focus on finite-time complexity estimates.

Qualitative Assessment

I liked the author's sense of detail, but I think the paper deserves a little more detail (I'd say a full paragraph) on the relation between canonical correlation analysis and machine learning applications. There are a few typos:
- Line 15: trainng -> training
- Footnote 3: It can show -> One can show
- Line 13: the the
- Line 129: ||x_i||^2-smooth -> (\gamma_x + ||x_i||^2)-smooth
- Line 183: (34) -> (11)

Confidence in this Review

1-Less confident (might not have understood significant parts)

Reviewer 3

Summary

The paper studies stochastic optimization algorithms for canonical correlation analysis. Two algorithms are proposed, inspired by previous work. The key observation is to solve the intermediate least-squares problems inexactly with stochastic variance reduced gradient (SVRG) algorithms. Another contribution is the global linear convergence and time complexity analysis.
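To make the key observation concrete, here is a minimal sketch of alternating least squares for top-1 CCA in which each least-squares subproblem is solved only approximately. Plain gradient descent stands in for the paper's SVRG inner solver, and all names and step-size choices here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def inexact_cca_als(X, Y, iters=50, inner_steps=20, seed=0):
    """ALS power iterations for top-1 CCA with inexact inner solves.

    Each outer iteration should solve S_xx u = S_xy v (and the symmetric
    problem for v); here we run only a few warm-started gradient steps,
    which is the 'inexact least-squares' idea the review describes.
    """
    n = X.shape[0]
    Sxx, Syy, Sxy = X.T @ X / n, Y.T @ Y / n, X.T @ Y / n
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(X.shape[1])
    v = rng.standard_normal(Y.shape[1])
    u /= np.sqrt(u @ Sxx @ u)          # normalize in the S_xx metric
    v /= np.sqrt(v @ Syy @ v)          # normalize in the S_yy metric
    for _ in range(iters):
        # Inexactly solve S_xx u = S_xy v by gradient descent
        b, step = Sxy @ v, 1.0 / np.linalg.norm(Sxx, 2)
        for _ in range(inner_steps):
            u -= step * (Sxx @ u - b)
        u /= np.sqrt(u @ Sxx @ u)
        # Inexactly solve S_yy v = S_xy^T u
        b, step = Sxy.T @ u, 1.0 / np.linalg.norm(Syy, 2)
        for _ in range(inner_steps):
            v -= step * (Syy @ v - b)
        v /= np.sqrt(v @ Syy @ v)
    return u, v, u @ Sxy @ v           # last value estimates the top correlation
```

On two strongly correlated views the returned correlation estimate approaches the true leading canonical correlation even though no inner problem is ever solved exactly.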

Qualitative Assessment

One of the key contributions of the paper is to establish the global linear convergence of the proposed algorithms. However, the contribution of the theoretical analysis (proof) is not highlighted. On line 113, the authors mention that "By the analysis of inexact power iterations .."; it seems to me that the authors are following some existing analysis without giving any citations. Please be specific.

A similar ALS algorithm was also proposed in [6]. The authors cited this paper and gave some discussion of the difference in time complexity. As the reviewer read the referenced paper, the authors of [6] mention that their method only requires black-box access to any linear solver, so SVRG and ASVRG are both applicable. It is recommended to discuss and compare with those results. In particular, is there any difference in terms of the algorithm and the theoretical analysis?

The second algorithm is built on the shift-and-invert preconditioning method for PCA. This part reads to me as a mostly straightforward extension of that method to the CCA setting: essentially, apply their method to a new matrix to compute the eigenvectors. Is there any significant difference in the analysis?

Compared with [6], one shortcoming of this paper is that it only focuses on solving for the leading components, while [6] can also find the top-k components. Is finding the top-k components a straightforward extension? One novelty of this paper seems to be that the algorithms do not need to solve the least-squares problems exactly.
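For readers unfamiliar with the technique the review mentions, here is a minimal sketch of generic shift-and-invert power iteration for a symmetric matrix. The paper applies this idea to the CCA eigenproblem and replaces the exact linear solve below with inexact SVRG solves; the direct solve here is only a stand-in:

```python
import numpy as np

def shift_invert_power(A, shift, iters=30, seed=0):
    """Leading eigenvector of symmetric A via shift-and-invert,
    assuming shift > lambda_max(A) so that shift*I - A is positive
    definite. Power iteration on (shift*I - A)^{-1} converges much
    faster than plain power iteration when the shift is close to
    lambda_max, which is the point of the preconditioning."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(A.shape[0])
    M = shift * np.eye(A.shape[0]) - A
    for _ in range(iters):
        w = np.linalg.solve(M, w)   # the paper solves this system inexactly
        w /= np.linalg.norm(w)
    return w
```

The key property is that the eigengap of $(\sigma I - A)^{-1}$ is large when $\sigma$ is slightly above $\lambda_{\max}(A)$, so few (inexact) solves are needed.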

Confidence in this Review

2-Confident (read it all; understood it all reasonably well)

Reviewer 4

Summary

The authors propose new algorithms (or rather, new meta-algorithms that generate a class of algorithms) for solving the canonical correlation analysis problem. These algorithms solve CCA by transforming it into a sequence of least-squares problems that can be solved approximately. This allows them to leverage work on both power iteration and approximate least-squares solvers to achieve a rate that is better than their competitors' (in terms of the condition number of the problem) as well as empirical results that seem to outperform their competitors.

Qualitative Assessment

This seems to be an interesting application of this sort of analysis to the particular problem of CCA. The work is explained clearly enough that I both understand what the authors are doing theoretically and could implement any of the algorithms described in the paper, which is great for an algorithms paper. Additionally, the authors expose a design decision (which internal least-squares algorithm to use) that, as far as I'm aware based on my limited understanding of the area, has not previously been considered; the fact that the choice of approximate least-squares solver has any effect on the overall time complexity of the algorithm is non-obvious.

Some questions/comments:

1) What differentiates your algorithm from that of the parallel work of Ge et al.? Is your algorithm fundamentally different in some way? Is the difference only in the way in which the algorithm sets its parameters? Is the difference only in the theoretical analysis? If either of the latter two is true, then your rate seems to be off by a bit: can you explain this discrepancy?

2) Can you give some intuition in the experimental section about how fast these algorithms run? In order to know which one I should use, I should be able to get a sense of this. Is the number of passes a good proxy for total wall-clock time?

3) Are there actual problems in which $\tilde \kappa > N$? This seems like a rather unlikely condition, especially for the big problems on which we might be interested in stochastic optimization in the first place.

Confidence in this Review

2-Confident (read it all; understood it all reasonably well)

Reviewer 5

Summary

The paper studies the stochastic optimization of canonical correlation analysis, whose objective is nonconvex and does not decouple over training samples. It proposes two globally convergent meta-algorithms, based on alternating least squares and shift-and-invert preconditioning, respectively. The theoretical analysis and experimental results demonstrate their superior performance.

Qualitative Assessment

The paper studies the CCA problem, which is very interesting. The algorithms are explained very clearly, and the theorems behind them are solid and sound. The reviewer believes this is excellent work and recommends acceptance.

Confidence in this Review

1-Less confident (might not have understood significant parts)