A Distributed Cubic-Regularized Newton Method for Smooth Convex Optimization over Networks

Research by Cesar Uribe, PhD

We propose a distributed, cubic-regularized Newton method for large-scale convex optimization over networks. The method requires only local computations and communications and is suitable for federated learning applications over arbitrary network topologies. We show an O(1/k^3) convergence rate when the cost function is convex with Lipschitz-continuous gradient and Hessian, where k is the number of iterations. We further provide network-dependent bounds on the communication required at each step of the algorithm, and numerical experiments that validate our theoretical results.
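To give a feel for the building block behind the method, the sketch below shows a plain, centralized cubic-regularized Newton step in the style of Nesterov and Polyak: each update minimizes a second-order model of f plus a cubic penalty (M/6)·‖s‖³ on the step length. This is only a single-machine illustration under assumed names (`cubic_newton_step`, `cubic_newton`, regularization constant `M`); it omits the distributed communication and network-dependent aspects that are the paper's actual contribution. The cubic subproblem is solved here by a simple fixed-point iteration on the step norm, which suffices for small convex examples.

```python
import numpy as np

def cubic_newton_step(grad, hess, M, inner_iters=50):
    """Approximately solve the cubic-regularized model
        min_s  grad^T s + 0.5 s^T hess s + (M/6) ||s||^3
    via a fixed-point iteration on r = ||s||:
        (hess + (M r / 2) I) s = -grad.
    (Illustrative solver choice, not the one used in the paper.)"""
    n = grad.shape[0]
    r = 0.0
    s = np.zeros(n)
    for _ in range(inner_iters):
        s = np.linalg.solve(hess + (M * r / 2.0) * np.eye(n), -grad)
        r = np.linalg.norm(s)
    return s

def cubic_newton(f_grad, f_hess, x0, M, steps=20):
    """Run `steps` cubic-regularized Newton updates from x0."""
    x = x0.copy()
    for _ in range(steps):
        x = x + cubic_newton_step(f_grad(x), f_hess(x), M)
    return x

# Toy usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer is A^{-1} b = [1, 1].
A = np.diag([2.0, 4.0])
b = np.array([2.0, 4.0])
x_star = cubic_newton(lambda x: A @ x - b, lambda x: A, np.zeros(2), M=1.0)
```

Because the cubic term only shortens the Newton step, on a quadratic the iterates converge to the exact minimizer; for general convex f with Lipschitz Hessian, this regularization is what yields the fast O(1/k^3)-type rates discussed above.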

Link to paper: https://arxiv.org/abs/2007.03562