Jorge Cortés

Professor

Cymer Corporation Endowed Chair

Nesterov acceleration for equality-constrained convex optimization via continuously differentiable penalty functions
P. Srivastava, J. Cortés
IEEE Control Systems Letters 5 (2) (2021), 415-420


Abstract

We propose a framework for applying Nesterov's accelerated gradient method to constrained convex optimization problems. Our approach first reformulates the original problem as an unconstrained optimization problem using a continuously differentiable exact penalty function. This reformulation replaces the Lagrange multipliers in the augmented Lagrangian of the original problem with Lagrange multiplier functions. Because these multiplier functions depend on the gradients of the objective function and the constraints, the resulting unconstrained penalty function can be non-convex in general, even if the original problem is convex. We establish sufficient conditions on the objective function and the constraints of the original problem under which the unconstrained penalty function is convex. This enables us to apply Nesterov's accelerated gradient method for unconstrained convex optimization and achieve a guaranteed convergence rate that improves on state-of-the-art first-order algorithms for constrained convex optimization. Simulations illustrate our results.
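
To give a concrete picture of the approach, the sketch below illustrates the general idea on a simple equality-constrained quadratic program: a Fletcher-type continuously differentiable exact penalty is formed by substituting a Lagrange multiplier function into the augmented Lagrangian, and Nesterov's accelerated gradient method is then run on the resulting unconstrained problem. This is a minimal illustration under assumed choices (the multiplier-function form, penalty parameter eps, step size, and random problem data are all assumptions of this example), not the exact construction or parameter conditions analyzed in the paper.

```python
# Illustrative sketch (assumptions, not the paper's exact construction):
# minimize f(x) = 0.5 x'Qx + c'x  subject to  Ax = b, via a Fletcher-type
# continuously differentiable exact penalty
#     P(x) = f(x) + lambda(x)'(Ax - b) + (1/eps) ||Ax - b||^2,
# with multiplier function lambda(x) = -(AA')^{-1} A grad f(x),
# followed by Nesterov's accelerated gradient method on P.
import numpy as np

def make_problem(n=20, m=5, seed=0):
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((n, n))
    Q = M @ M.T + n * np.eye(n)          # strongly convex quadratic objective
    c = rng.standard_normal(n)
    A = rng.standard_normal((m, n))      # full row rank (generically)
    b = rng.standard_normal(m)
    return Q, c, A, b

def penalty_grad(x, Q, c, A, b, AAT_inv, eps):
    """Gradient of the penalty P(x) for the quadratic objective above."""
    g = Q @ x + c                         # grad f(x)
    r = A @ x - b                         # constraint residual
    lam = -AAT_inv @ (A @ g)              # Lagrange multiplier function
    # d/dx [lambda(x)'r(x)] = Jlam' r + A' lam, with Jlam = -(AA')^{-1} A Q
    return g - Q @ (A.T @ (AAT_inv @ r)) + A.T @ lam + (2.0 / eps) * (A.T @ r)

def nesterov_on_penalty(Q, c, A, b, eps=1e-2, alpha=None, iters=2000):
    n = Q.shape[0]
    AAT_inv = np.linalg.inv(A @ A.T)
    if alpha is None:
        # crude (conservative) Lipschitz estimate for grad P -- an assumption
        L = np.linalg.norm(Q, 2) * (1 + np.linalg.norm(A, 2) ** 2) \
            + (2.0 / eps) * np.linalg.norm(A, 2) ** 2
        alpha = 1.0 / L
    x = np.zeros(n)
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        # standard Nesterov/FISTA momentum updates
        x_next = y - alpha * penalty_grad(y, Q, c, A, b, AAT_inv, eps)
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    Q, c, A, b = make_problem()
    x = nesterov_on_penalty(Q, c, A, b)
    # compare against the KKT solution of the equality-constrained QP
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-c, b]))
    print("constraint violation:", np.linalg.norm(A @ x - b))
    print("distance to KKT solution:", np.linalg.norm(x - sol[:n]))
```

In this sketch the penalty is minimized with plain accelerated gradient steps and a fixed step size; the paper's contribution is precisely the conditions under which such an unconstrained penalty is convex, so that this accelerated rate is actually guaranteed.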

pdf

Mechanical and Aerospace Engineering, University of California, San Diego
9500 Gilman Dr, La Jolla, California, 92093-0411

Ph: 1-858-822-7930
Fax: 1-858-822-3107

cortes at ucsd.edu
Skype id: jorgilliyo