NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID: 3769
Title:On the convergence of single-call stochastic extra-gradient methods

Reviewer 1


The paper is well-written and can be easily followed. I did not carefully check the proofs, but the results are reasonable and the methodology is correct. The results are mildly significant in my opinion.

Reviewer 2


This paper solves an open problem in the analysis of some variants of extragradient. On the positive side, I found the paper to be clear and well written. I particularly appreciated the review of single-call variants of extragradient in Section 3. I reviewed some key results that are proven in the appendix (Lemma 2), which seemed correct. On the negative side, I found the results on non-monotone operators (Theorem 4) rather disappointing, since they only apply when the iterates are already in a neighborhood of the solution and the operator is (essentially) strongly monotone in that region. It is nothing more than a localized version of previous results.

# Post rebuttal
I read the authors' rebuttal and the other reviewers' reviews. My score remains unchanged.

Reviewer 3


In this paper, the authors study the EG algorithm and its variants for solving variational inequalities (VIs) with a single oracle call per iteration. A unified approach to the convergence analysis is proposed. Ergodic and last-iterate convergence are then discussed for monotone and non-monotone operators, in both the deterministic and stochastic settings. The authors state their motivation clearly, related work is discussed in detail, and the paper is well written.
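For context on what "single-call" means here, the idea can be sketched as follows: standard extra-gradient queries the operator twice per iteration (once at the base point, once at the extrapolated point), while a single-call variant such as Popov's past extra-gradient reuses the stored operator value from the previous extrapolation step. The sketch below is illustrative only (the operator, step size, and iteration count are assumptions, not taken from the paper under review), using the bilinear saddle-point problem min_x max_y xy, whose associated monotone operator is A(x, y) = (y, -x).

```python
import numpy as np

def operator(z):
    # Monotone operator of the bilinear problem min_x max_y x*y:
    # A(x, y) = (y, -x). Illustrative choice, not from the paper.
    x, y = z
    return np.array([y, -x])

def past_extragradient(z0, step=0.1, iters=2000):
    """Single-call (past) extra-gradient sketch: only one operator
    query per iteration, reusing the stored value at the previous
    extrapolation point instead of querying the operator twice."""
    z = np.array(z0, dtype=float)
    a_prev = operator(z)            # one bootstrap call before the loop
    for _ in range(iters):
        z_half = z - step * a_prev  # extrapolate with the STORED value
        a_prev = operator(z_half)   # the only oracle call this iteration
        z = z - step * a_prev       # update with the fresh value
    return z

z_final = past_extragradient([1.0, 1.0])
print(np.linalg.norm(z_final))  # distance to the solution (0, 0)
```

On this bilinear example, plain gradient descent-ascent spirals away from the solution, whereas the extrapolation step (even with the reused operator value) pulls the iterates toward the origin.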