Jorge Cortés
Professor
Cymer Corporation Endowed Chair
Distributed saddle-point subgradient algorithms with Laplacian averaging
D. Mateos-Núñez, J. Cortés
IEEE Transactions on Automatic Control 62 (6) (2017), 2720-2735
Abstract
We present distributed subgradient methods for min-max problems with
agreement constraints on a subset of the arguments of both the
convex and concave parts. Applications include constrained
minimization problems where each constraint is a sum of convex
functions in the local variables of the agents. In the latter case,
the proposed algorithm reduces to primal-dual updates using local
subgradients and Laplacian averaging on local copies of the
multipliers associated with the global constraints. For the case of
general convex-concave saddle-point problems, our analysis
establishes the convergence of the running time-averages of the
local estimates to a saddle point under periodic connectivity of the
communication digraphs. Specifically, choosing the gradient
step-sizes in a suitable way, we show that the evaluation error is
proportional to 1/\sqrt{t}, where t is the iteration step. We
illustrate our results in simulation for an optimization scenario
with nonlinear constraints coupling the decisions of agents that
cannot communicate directly.
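For the constrained-minimization case described above, the updates can be sketched as follows. This is a minimal illustrative toy, not the paper's algorithm verbatim: it assumes quadratic local costs f_i(x_i) = (x_i - c_i)^2, a single coupling constraint sum_i x_i <= b decomposed as sum_i g_i(x_i) <= 0 with g_i(x_i) = x_i - b/N, a fixed undirected ring graph in place of time-varying digraphs, and hypothetical choices of consensus gain and step-size schedule.

```python
import numpy as np

# Toy instance (illustrative data, not from the paper):
# agent i minimizes f_i(x_i) = (x_i - c_i)^2 subject to the coupling
# constraint sum_i x_i <= b, written as sum_i g_i(x_i) <= 0 with
# g_i(x_i) = x_i - b/N. Each agent keeps a local copy z_i of the
# multiplier; copies are mixed via Laplacian averaging over a ring graph.

N = 3
c = np.array([1.0, 2.0, 3.0])  # local cost centers
b = 3.0                        # constraint budget

# Laplacian of the undirected 3-cycle communication graph
L = np.array([[ 2., -1., -1.],
              [-1.,  2., -1.],
              [-1., -1.,  2.]])

sigma = 0.3        # consensus gain (assumed; must keep I - sigma*L stable)
x = np.zeros(N)    # local primal variables
z = np.zeros(N)    # local multiplier copies
x_avg = np.zeros(N)

T = 5000
for t in range(1, T + 1):
    eta = 1.0 / np.sqrt(t)       # step size matching the 1/sqrt(t) rate
    g = x - b / N                # local constraint values g_i(x_i)
    # primal subgradient step on the local Lagrangian: grad f_i + z_i * grad g_i
    x = x - eta * (2.0 * (x - c) + z)
    # dual step: Laplacian averaging of the multiplier copies plus ascent
    # on g_i, projected onto the nonnegative orthant
    z = np.maximum(z - sigma * (L @ z) + eta * g, 0.0)
    x_avg += (x - x_avg) / t     # running time-average of the local estimates

print(x_avg, x_avg.sum())
```

For this instance the constrained optimum is x = (0, 1, 2) with multiplier 2, so the running time-averages should approach those values and satisfy the budget sum_i x_i = 3 approximately, consistent with the O(1/sqrt(t)) evaluation-error bound.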
Mechanical and Aerospace Engineering,
University of California, San Diego
9500 Gilman Dr,
La Jolla, California, 92093-0411
Ph: 1-858-822-7930
Fax: 1-858-822-3107
cortes at ucsd.edu
Skype id: jorgilliyo