NeurIPS 2020

Online Robust Regression via SGD on the l1 loss


Meta Review

The paper studies robust linear regression in the online setting, where the data follow a Gaussian linear model with corruptions. The authors show that stochastic gradient descent on the absolute loss converges to the true parameter at a rate of order O(1/n). The paper received uniformly positive evaluations from the reviewers, who acknowledged the novelty of the results, the theoretical justification of the proposed approach, and the scalability of the algorithm. The main issue raised in the reviews concerns the rather restrictive assumptions on the data distribution (the Gaussian linear model and the centered-data assumption).
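To make the setting concrete, here is a minimal sketch of the idea, not the paper's exact algorithm or its analyzed step size: online SGD that updates with the subgradient of the absolute loss on each incoming sample. The dimension, noise level, corruption rate, and 1/sqrt(t) step-size schedule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 5, 20000
theta_star = rng.normal(size=d)       # true parameter (illustrative)

theta = np.zeros(d)
for t in range(1, n + 1):
    x = rng.normal(size=d)                        # Gaussian covariates
    y = x @ theta_star + 0.1 * rng.normal()       # Gaussian linear model
    if rng.random() < 0.1:                        # a fraction of responses corrupted (assumed rate)
        y += 50.0 * rng.normal()
    # subgradient of the absolute loss |y - <x, theta>| w.r.t. theta is -sign(y - <x, theta>) * x
    eta = 1.0 / np.sqrt(t)                        # assumed schedule, not the paper's choice
    theta += eta * np.sign(y - x @ theta) * x

err = np.linalg.norm(theta - theta_star)          # small despite the gross corruptions
```

Because the update depends on the residual only through its sign, a grossly corrupted response moves the iterate by at most the step size, which is the intuition behind the robustness of SGD on the l1 loss.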