Asynchronous Distributed Optimization Via Randomized Dual Proximal Gradient
Volume: 62, Issue: 5, Pages: 2095 - 2106
Published: Sep 8, 2016
Abstract
In this paper we consider distributed optimization problems in which the cost function is separable, i.e., a sum of possibly non-smooth functions all sharing a common variable, and can be split into a strongly convex term and a convex one. The second term is typically used to encode constraints or to regularize the solution. We propose a class of distributed optimization algorithms based on proximal gradient methods applied to the dual problem....
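The dual proximal gradient methods mentioned in the abstract build on the standard proximal gradient iteration for composite objectives. As a point of reference, here is a minimal centralized sketch of that iteration (ISTA) on an illustrative lasso-type problem; this is not the paper's distributed algorithm, and the problem data `A`, `b`, and `lam` are made-up examples:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize 0.5*||A x - b||^2 + lam*||x||_1: a gradient step on the
    # smooth (strongly convex, if A has full column rank) term followed
    # by the prox of the non-smooth term.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                         # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox of non-smooth part
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.5, -2.0, 0.8]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

In the paper's setting this type of iteration is applied to the dual problem, which is what allows the non-smooth term (constraints or a regularizer) to be handled through its proximal operator.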