Jorge Cortés
Professor
Cymer Corporation Endowed Chair
Gradient sampling algorithm for subsmooth functions
D. Boskos, J. Cortés, S. Martínez
SIAM Journal on Optimization
Abstract
This paper considers non-smooth optimization
problems where we seek to minimize the pointwise
maximum of a continuously parameterized family of
functions. Since the objective function is given as
the solution to a maximization problem, neither its
values nor its gradients are available in closed
form, which calls for approximation. Our approach
hinges upon extending the so-called gradient
sampling algorithm, which approximates the Clarke
generalized gradient of the objective function at a
point by sampling its derivative at nearby
locations. This allows us to select descent directions around points where the function may fail to be differentiable and to establish convergence of the algorithm to a stationary point from any initial condition. Our key contribution is to prove this convergence while relaxing the standard requirement that the objective function be continuously differentiable on an open set of full measure. We further
provide assumptions under which a desired convex
subset of the decision space is rendered attractive
for the iterates of the algorithm.
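To make the idea concrete, below is a minimal Python sketch of a gradient sampling loop for an objective of the form f(x) = max over u in U of g(x, u). It is an illustration in the spirit of the abstract, not the paper's algorithm: the family g, the finite grid standing in for the maximization over U, the min_norm_hull helper, and all tolerances and parameter names are assumptions made for this example.

# A minimal, illustrative sketch of gradient sampling, under the
# assumptions stated above; not the algorithm analyzed in the paper.
import numpy as np
from scipy.optimize import minimize

def g(x, u):
    # Hypothetical smooth family; its pointwise maximum over u is
    # f(x) = ||x|| + 0.5||x||^2, nonsmooth at the minimizer x = 0.
    return np.sin(u) * x[0] + np.cos(u) * x[1] + 0.5 * (x[0]**2 + x[1]**2)

U_GRID = np.linspace(0.0, 2.0 * np.pi, 200)  # crude stand-in for max over U

def f(x):
    return max(g(x, u) for u in U_GRID)

def grad_f(x, h=1e-6):
    # Central finite differences; valid wherever f is differentiable.
    gr = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        gr[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return gr

def min_norm_hull(G):
    # Minimum-norm element of the convex hull of the rows of G:
    # a small QP over the probability simplex, solved here with SLSQP.
    m = G.shape[0]
    obj = lambda lam: float(np.dot(lam @ G, lam @ G))
    cons = ({"type": "eq", "fun": lambda lam: np.sum(lam) - 1.0},)
    res = minimize(obj, np.ones(m) / m, bounds=[(0.0, 1.0)] * m,
                   constraints=cons)
    return res.x @ G

def gradient_sampling(x, eps=0.5, n_samples=10, tol=1e-6, max_iter=300):
    rng = np.random.default_rng(0)
    for _ in range(max_iter):
        # Gradients at x and at random nearby points in the eps-ball
        # approximate the Clarke generalized gradient of f at x.
        pts = [x]
        for _ in range(n_samples):
            v = rng.normal(size=x.shape)
            pts.append(x + eps * rng.random() * v / np.linalg.norm(v))
        G = np.array([grad_f(p) for p in pts])
        d = -min_norm_hull(G)  # approximate steepest descent direction
        if np.linalg.norm(d) < tol:
            if eps < tol:
                break          # approximately Clarke-stationary
            eps *= 0.5         # shrink sampling radius and resample
            continue
        t = 1.0                # Armijo-style backtracking line search
        while f(x + t * d) > f(x) - 1e-4 * t * np.dot(d, d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

print(gradient_sampling(np.array([1.0, 1.0])))  # drifts toward the origin

The min-norm step reflects the core mechanism the abstract describes: the sampled gradients serve as an approximation of the Clarke generalized gradient, and the smallest element of their convex hull yields a descent direction even at points of nondifferentiability.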
Mechanical and Aerospace Engineering,
University of California, San Diego
9500 Gilman Dr,
La Jolla, California, 92093-0411
Ph: 1-858-822-7930
Fax: 1-858-822-3107
cortes at ucsd.edu
Skype id: jorgilliyo