Part of Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Aleksandr Beznosikov, Gesualdo Scutari, Alexander Rogozin, Alexander Gasnikov
We study solution methods for (strongly-)convex-(strongly-)concave Saddle-Point Problems (SPPs) over networks of two types: master/workers (thus centralized) architectures and mesh (thus decentralized) networks. The local functions at each node are assumed to be similar, due to statistical data similarity or otherwise. We establish lower complexity bounds for a fairly general class of algorithms solving the SPP. We show that a given suboptimality ε>0 is achieved over master/workers networks in Ω(Δ⋅δ/μ⋅log(1/ε)) rounds of communication, where δ>0 measures the degree of similarity of the local functions, μ is their strong convexity constant, and Δ is the diameter of the network. The lower communication complexity bound over mesh networks reads Ω(1/√ρ⋅δ/μ⋅log(1/ε)), where ρ is the (normalized) eigengap of the gossip matrix used for communication between neighbouring nodes. We then propose algorithms matching the lower bounds over either type of network (up to log factors). We assess the effectiveness of the proposed algorithms on a robust regression problem.
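To make the problem class concrete, the following minimal Python sketch sets up a toy robust linear regression SPP distributed over a ring network and runs plain decentralized gradient descent-ascent with gossip averaging. The toy objective f_i(w, r_i) = ‖X_i w + r_i − y_i‖²/(2n) − (γ/2)‖r_i‖², the ring gossip matrix, and all variable names are illustrative assumptions; this is a simple baseline for intuition, not the communication-optimal algorithm proposed in the paper.

```python
import numpy as np

# Toy robust linear regression as a saddle-point problem, split over m nodes:
#   min_w max_{r_1..r_m}  (1/m) * sum_i f_i(w, r_i),
#   f_i(w, r_i) = ||X_i w + r_i - y_i||^2 / (2 n) - (gamma / 2) * ||r_i||^2,
# which is convex in w and (for gamma > 1/n) strongly concave in r_i.
# Solved by decentralized gradient descent-ascent with one gossip round
# (multiplication by the mixing matrix W) per iteration -- an illustrative
# baseline only, not the method of the paper.

rng = np.random.default_rng(0)
m, n, d = 5, 40, 10                        # nodes, samples per node, dimension
gamma, lr = 1.0, 0.1                       # concavity parameter, step size

# Statistically similar local data shards drawn from one common model.
w_true = rng.normal(size=d)
X = [rng.normal(size=(n, d)) for _ in range(m)]
y = [Xi @ w_true + 0.1 * rng.normal(size=n) for Xi in X]

# Gossip matrix of a ring network: symmetric, doubly stochastic.
W = np.zeros((m, m))
for i in range(m):
    W[i, i] = 0.5
    W[i, (i + 1) % m] = W[i, (i - 1) % m] = 0.25

def local_grads(i, w, r):
    """Gradients of f_i with respect to the primal w and the dual r."""
    res = X[i] @ w + r - y[i]
    grad_w = X[i].T @ res / n
    grad_r = res / n - gamma * r
    return grad_w, grad_r

w_nodes = np.zeros((m, d))                 # one primal copy per node
r_nodes = [np.zeros(n) for _ in range(m)]  # one local dual block per node

for _ in range(500):
    new_w = np.empty_like(w_nodes)
    for i in range(m):
        gw, gr = local_grads(i, w_nodes[i], r_nodes[i])
        new_w[i] = w_nodes[i] - lr * gw    # descent in the primal variable
        r_nodes[i] = r_nodes[i] + lr * gr  # ascent in the dual variable
    w_nodes = W @ new_w                    # one communication (gossip) round

print("distance to w_true:", np.linalg.norm(w_nodes.mean(axis=0) - w_true))
```

In this sketch the mixing matrix W plays the role whose (normalized) eigengap ρ enters the mesh-network bound above, and each multiplication `W @ new_w` corresponds to one round of communication between neighbouring nodes.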