NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 4604
Title: A Universally Optimal Multistage Accelerated Stochastic Gradient Method

This paper designs a multistage accelerated SGD algorithm that does not require knowledge of the noise level or the initial optimality gap, yet still achieves optimal convergence rates. This is a well-written paper with good results.
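
To make the multistage idea concrete, here is a minimal sketch of a generic stage-restart scheme of the kind such methods build on: run SGD with a fixed step size within each stage, then shrink the step size and lengthen the stage at each restart. This is an illustrative toy (quadratic objective, additive noise, halving/doubling schedule chosen for the example), not the paper's actual algorithm, which additionally uses acceleration and parameter-free stage lengths.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x):
    # Stochastic gradient of f(x) = 0.5 * ||x||^2 with additive noise;
    # the noise level (0.1 here) is not used by the method itself.
    return x + 0.1 * rng.standard_normal(x.shape)

def multistage_sgd(x0, num_stages=6, stage_len=50, step=0.5):
    # Generic multistage scheme: constant step size within a stage,
    # halve the step and double the stage length between stages.
    x = x0.copy()
    for _ in range(num_stages):
        for _ in range(stage_len):
            x -= step * noisy_grad(x)
        step /= 2.0
        stage_len *= 2
    return x

x_final = multistage_sgd(np.ones(5))
print(np.linalg.norm(x_final))
```

The shrinking step size drives the noise-induced error floor down geometrically across stages, while the growing stage lengths give each smaller step enough iterations to contract the bias term.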