NeurIPS 2020

Lower Bounds and Optimal Algorithms for Personalized Federated Learning


Meta Review

The submission studies lower bounds on convergence rates for personalized federated learning, including for accelerated proximal gradient methods. The lower bounds account for both the number of oracle calls and the communication complexity. The authors also prove that there exists a method matching these lower bounds. The rates are new and the paper is well-written. An interesting avenue for further research is the dependence of the rates on the number of clients. For the final version of the paper, besides the clarifications requested by the reviewers, the range of the parameters $\lambda$ and $\mu$ needs to be discussed; a discussion of how the number of clients affects convergence (i.e., in which constants of the complexity estimates the number of clients appears) would further improve the paper.
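For context on the method class discussed above, a generic accelerated proximal gradient (FISTA-style) iteration for an $L$-smooth, $\mu$-strongly convex smooth part plus a proximable regularizer can be sketched as follows. This is a standard textbook scheme, not the authors' exact algorithm; `grad_f`, `prox_g`, and the momentum choice are illustrative placeholders.

```python
import numpy as np

def accelerated_prox_gradient(grad_f, prox_g, x0, L, mu, n_iters=200):
    """Generic FISTA-style accelerated proximal gradient sketch.

    grad_f : gradient of the L-smooth, mu-strongly convex part
    prox_g : proximal operator of the regularizer, called as prox_g(v, step)
    """
    x = x0.copy()
    y = x0.copy()
    step = 1.0 / L
    # Constant momentum parameter for the strongly convex setting.
    q = np.sqrt(mu / L)
    beta = (1 - q) / (1 + q)
    for _ in range(n_iters):
        x_new = prox_g(y - step * grad_f(y), step)
        y = x_new + beta * (x_new - x)
        x = x_new
    return x

# Toy usage: minimize 0.5 * x^T D x - b^T x with no regularizer,
# so the minimizer is b / D (elementwise).
D = np.array([1.0, 10.0])   # mu = 1, L = 10
b = np.array([1.0, 2.0])
sol = accelerated_prox_gradient(lambda x: D * x - b,
                                lambda v, s: v,
                                np.zeros(2), L=10.0, mu=1.0)
```

In this setting the scheme converges linearly at a rate governed by $\sqrt{\mu/L}$, which is the kind of condition-number dependence the lower bounds in the submission target.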