NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019 at Vancouver Convention Center
Paper ID: 5317
Title: Efficiently avoiding saddle points with zero order methods: No gradients required


This paper gives a new analysis of zeroth-order optimization methods and shows that they can avoid saddle points efficiently. The new algorithm is much faster than previous algorithms based on Gaussian smoothing. However, comparing the number of iterations rather than the number of oracle calls is still slightly misleading.
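To make the point about iterations versus oracle calls concrete, here is a minimal sketch (not the paper's algorithm) of a standard Gaussian-smoothing gradient estimator; the function names, step size, smoothing radius, and sample count are illustrative assumptions. Each "iteration" of a method built on this estimator issues multiple zeroth-order oracle queries, so iteration counts alone understate the query cost.

```python
import numpy as np

def gaussian_smoothing_gradient(f, x, mu=1e-3, num_samples=20, rng=None):
    """Estimate grad f(x) via Gaussian smoothing.

    Uses num_samples + 1 zeroth-order oracle queries (function evaluations).
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    fx = f(x)                          # 1 oracle call
    grad_est = np.zeros(d)
    for _ in range(num_samples):       # num_samples additional oracle calls
        u = rng.standard_normal(d)
        grad_est += (f(x + mu * u) - fx) / mu * u
    return grad_est / num_samples

def zeroth_order_gd(f, x0, step=0.01, iters=100, **kw):
    """One iteration = one estimated-gradient step, but many oracle calls."""
    x = x0.copy()
    for _ in range(iters):
        x = x - step * gaussian_smoothing_gradient(f, x, **kw)
    return x
```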