ICCOPT 2013 Talk, Room 1.5, Monday, July 29, 16:30-18:00

 Speaker: Yves Lucet, University of British Columbia, Canada
 Title: Derivative-free optimization via proximal point methods
 Co-authors: Warren Hare

 Abstract:

Many standard techniques in Derivative-Free Optimization (DFO) are based on using model functions to approximate the objective function, and then applying classic optimization methods to the model function. For example, the details behind adapting steepest descent, conjugate gradient, and quasi-Newton methods to DFO have been studied in this manner. In this paper we demonstrate that the proximal point method can also be adapted to DFO. To that end, we provide a derivative-free proximal point (DFPP) method and prove convergence of the method in a general sense. In particular, we give conditions under which the gradient values of the iterates converge to 0, and conditions under which an iterate corresponds to a stationary point of the objective function.
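To illustrate the idea behind the abstract, the classical proximal point method generates iterates x_{k+1} = argmin_y { f(y) + ||y - x_k||^2 / (2*lambda) }, and a derivative-free variant must solve this subproblem without gradient information. The sketch below is NOT the authors' DFPP algorithm; it is a minimal illustration, assuming the proximal subproblem is solved approximately by random sampling around the current iterate (all function and parameter names are hypothetical):

```python
import numpy as np

def prox_step_df(f, x, lam, radius, n_samples, rng):
    # Approximate argmin_y f(y) + ||y - x||^2 / (2*lam) by sampling
    # candidate points around x -- a derivative-free surrogate for
    # the exact proximal step.
    cands = x + radius * rng.standard_normal((n_samples, x.size))
    cands = np.vstack([x, cands])  # keep the current point as a fallback
    vals = [f(y) + np.dot(y - x, y - x) / (2.0 * lam) for y in cands]
    return cands[int(np.argmin(vals))]

def dfpp_sketch(f, x0, lam=1.0, radius=1.0, n_samples=200,
                max_iter=50, tol=1e-8, seed=0):
    # Illustrative derivative-free proximal point loop: repeat the
    # sampled proximal step, shrinking the sampling radius to refine.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = prox_step_df(f, x, lam, radius, n_samples, rng)
        if np.linalg.norm(x_new - x) < tol:
            break  # no sampled point improves the proximal objective
        x = x_new
        radius *= 0.9
    return x

# Example: minimize f(x) = ||x||^2 using only function evaluations.
f = lambda x: float(np.dot(x, x))
x_star = dfpp_sketch(f, [2.0, -1.5])
```

The random-sampling subproblem solver stands in for the model-based machinery the talk describes; the paper's convergence conditions concern how accurately such subproblems must be solved.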


 Talk in: Organized Session Mon.C.15 Model based methods for derivative-free optimization
 Cluster: Derivative-free and simulation-based optimization

