NeurIPS 2019
Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Paper ID: 6831
Title: Beating SGD Saturation with Tail-Averaging and Minibatching


This paper analyzes the convergence properties of SGD in an RKHS under multiple passes, mini-batching, and tail averaging. It is shown that tail averaging yields faster convergence than uniform averaging, and that mini-batching permits more aggressive step sizes. The paper is well written, and the effect of each technique is concisely summarized. The paper also situates itself adequately in the literature. Many in the audience would be interested in this work.
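To make the two techniques concrete, here is a minimal sketch of mini-batch SGD with tail averaging on a plain least-squares problem. This is a hypothetical illustration only: the function name, defaults, and the simple linear model are my assumptions, not the paper's RKHS setting or its exact step-size schedule.

```python
import numpy as np

def tail_averaged_minibatch_sgd(X, y, step=0.1, batch_size=8,
                                n_passes=50, tail_frac=0.5, seed=0):
    """Mini-batch SGD for least squares, returning the tail average of iterates.

    Illustrative sketch (not the paper's algorithm): tail averaging means
    averaging only the last `tail_frac` fraction of iterates, discarding the
    early burn-in phase that uniform averaging would keep.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    iterates = []
    for _ in range(n_passes):                      # multiple passes over the data
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # gradient of the squared loss on the current mini-batch
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w = w - step * grad
            iterates.append(w.copy())
    # tail averaging: drop the first (1 - tail_frac) of the trajectory
    tail_start = int(len(iterates) * (1 - tail_frac))
    return np.mean(iterates[tail_start:], axis=0)
```

Averaging larger mini-batches reduces gradient noise, which is what allows the more aggressive constant step size; averaging only the tail avoids the slow "saturation" that uniform averaging inherits from early, far-from-optimal iterates.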