A Communication-Efficient Decentralized Newton’s Method with Provably Faster Convergence
Publication in refereed journal

Other information
Abstract: In this article, we consider a strongly convex finite-sum minimization problem over a decentralized network and propose a communication-efficient decentralized Newton's method for solving it. The main challenges in designing such an algorithm come from three aspects: (i) the mismatch between local gradients/Hessians and the global ones; (ii) the cost of sharing second-order information; (iii) the tradeoff between computation and communication. To handle these challenges, we first apply dynamic average consensus (DAC) so that each node can use a local gradient approximation and a local Hessian approximation to track the global gradient and Hessian, respectively. Second, since exchanging Hessian approximations is far from communication-efficient, we require the nodes to exchange compressed ones instead and then apply an error compensation mechanism to correct for the compression noise. Third, we introduce multi-step consensus for exchanging local variables and local gradient approximations to balance computation against communication. With a novel analysis, we establish the globally linear (resp., asymptotically super-linear) convergence rate of the proposed algorithm when m is constant (resp., tends to infinity), where m ≥ 1 is the number of inner consensus steps. To the best of our knowledge, this is the first super-linear convergence result for a communication-efficient decentralized Newton's method. Moreover, the rate we establish is provably faster than those of first-order methods. Our numerical results on various applications corroborate the theoretical findings.
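The first ingredient described in the abstract, dynamic average consensus (DAC) tracking, can be illustrated with a small numerical sketch. Everything below — the 4-node ring topology, the mixing matrix W, and the synthetic time-varying local signals — is a hypothetical illustration of the generic DAC recursion, not code from the paper:

```python
import numpy as np

# Dynamic average consensus (DAC): each node i tracks the network-wide
# average of its time-varying local signal r_i(k) (standing in for a local
# gradient) using only communication with its neighbors.
# Assumption: 4 nodes on a ring with a doubly stochastic mixing matrix W.
n = 4
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

def signal(k):
    # Synthetic local signals that settle to the fixed values [1, 2, 3, 4];
    # their average (2.5) is the quantity each node should track.
    a = np.array([1.0, 2.0, 3.0, 4.0])
    return a + 0.5 ** k * np.arange(n)

# DAC recursion: y(k+1) = W y(k) + r(k+1) - r(k), initialized at y(0) = r(0).
# Because W is doubly stochastic, the mean of y(k) equals the mean of r(k)
# at every step, and consensus drives each y_i(k) toward that mean.
y = signal(0).copy()
for k in range(60):
    y = W @ y + signal(k + 1) - signal(k)

print(y)  # each entry is close to 2.5, the average of the limiting signals
```

In the paper's setting, the same recursion is applied with r_i(k) set to the local gradient (and, entry-wise, the local Hessian) so that each node's estimate tracks the global gradient and Hessian.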
All Author(s) List: Huikang Liu, Jiaojiao Zhang, Anthony Man-Cho So, Qing Ling
Journal name: IEEE Transactions on Signal and Information Processing over Networks
Year: 2023
Month: 7
Day: 3
Volume Number: 9
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 427-441
eISSN: 2373-776X
Languages: English (United States)
Keywords: Convergence, Newton method, Linear programming, Costs, Optimization, Symmetric matrices, Minimization, Decentralized optimization, convergence rate, Newton's method, compressed communication

Last updated on 2023-12-18 at 11:02