Communication-Adaptive Stochastic Gradient Methods for Distributed Learning

Volume: 69, Pages: 4637-4651
Published: Jan 1, 2021
Abstract
This paper develops algorithms for solving distributed learning problems in a communication-efficient fashion by generalizing the recent lazily aggregated gradient (LAG) method to handle stochastic gradients, which justifies the name of the new method, LASG. While LAG is effective at reducing communication without sacrificing the rate of convergence, we show that it works only with deterministic gradients. We introduce new rules and...
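
To make the lazy-aggregation idea concrete, below is a minimal sketch in Python. It uses a deliberately simplified skipping rule: a worker re-uploads its stochastic gradient only when it has drifted far enough from the last uploaded copy, and the server otherwise reuses the stale copy. The quadratic local losses, the fixed threshold tau, and all names are illustrative assumptions, not the paper's actual adaptive LASG rules.

# Sketch of a lazy-aggregation skipping rule (simplified; not the paper's exact LASG rules).
import numpy as np

rng = np.random.default_rng(0)
M, dim, steps, lr, tau = 4, 10, 200, 0.1, 1e-3

# Each worker m holds a local quadratic loss f_m(x) = 0.5 * ||A_m x - b_m||^2 (illustrative).
A = [rng.standard_normal((20, dim)) for _ in range(M)]
b = [rng.standard_normal(20) for _ in range(M)]

def stoch_grad(m, x, batch=5):
    """Mini-batch stochastic gradient of worker m's local loss."""
    idx = rng.choice(A[m].shape[0], size=batch, replace=False)
    Am, bm = A[m][idx], b[m][idx]
    return Am.T @ (Am @ x - bm) / batch

x = np.zeros(dim)
stale = [stoch_grad(m, x) for m in range(M)]  # last uploaded gradient per worker
uploads = 0

for k in range(steps):
    for m in range(M):
        g = stoch_grad(m, x)
        # Lazy rule (simplified): upload only if the gradient has moved enough;
        # otherwise the server keeps aggregating the stale copy, saving one upload.
        if np.linalg.norm(g - stale[m]) ** 2 > tau:
            stale[m] = g
            uploads += 1
    x -= lr * np.mean(stale, axis=0)  # server step with (partly stale) aggregated gradients

print(f"uploads: {uploads} of {steps * M} possible")

Running the sketch shows the trade-off the paper studies: many uploads are skipped while the iterate still descends, and the design question is how to choose the skipping rule so that the convergence rate is not sacrificed under gradient noise.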