ICCOPT 2013 Talk, Room 2.2, Tuesday, July 30, 16:30-18:00

 Speaker: Sangwoon Yun, Sungkyunkwan University, South Korea
 Title: Incrementally updated gradient methods for nonsmooth minimization
 Co-author: Paul Tseng

 Abstract:

We consider an incrementally updated gradient method for minimizing the sum of smooth functions and a convex function. The method can use a (sufficiently small) constant stepsize or, more practically, an adaptive stepsize that is decreased whenever sufficient progress is not made. We show that if the gradients of the smooth functions are Lipschitz continuous on $R^n$, then every cluster point of the iterates generated by the method is a stationary point. If in addition a local Lipschitz error bound assumption holds, then the method is linearly convergent. We also propose a new incrementally updated gradient method that uses much less storage, and we prove global convergence of this method.
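The scheme described in the abstract can be sketched as follows. This is a hedged illustration, not the authors' exact algorithm: we instantiate the smooth functions as least-squares components $f_i(x) = \frac{1}{2}\|A_i x - b_i\|^2$ and the convex function as an $\ell_1$ penalty (whose proximal map is soft-thresholding), and we model the adaptive stepsize by halving it whenever a full pass fails to decrease the objective. All function and variable names here are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal map of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def objective(x, A_blocks, b_blocks, lam):
    """Sum of smooth least-squares components plus the l1 term."""
    smooth = sum(0.5 * np.sum((A @ x - b) ** 2)
                 for A, b in zip(A_blocks, b_blocks))
    return smooth + lam * np.sum(np.abs(x))

def incremental_prox_gradient(A_blocks, b_blocks, lam, x0,
                              step=0.01, passes=200):
    """Cycle through the component functions, taking one proximal
    gradient step per component (an incremental update), and shrink
    the stepsize when a full pass makes insufficient progress."""
    x = x0.copy()
    prev = objective(x, A_blocks, b_blocks, lam)
    for _ in range(passes):
        for A, b in zip(A_blocks, b_blocks):
            grad = A.T @ (A @ x - b)              # gradient of one component only
            x = soft_threshold(x - step * grad, step * lam)
        cur = objective(x, A_blocks, b_blocks, lam)
        if cur > prev - 1e-12:                    # sufficient progress not made
            step *= 0.5                           # adaptive stepsize decrease
        prev = cur
    return x
```

A usage sketch: with random blocks `A_blocks`, `b_blocks` and `x0 = np.zeros(n)`, the returned iterate has a lower objective value than the starting point. The incremental structure (one component gradient per update, rather than the full sum) is what makes the per-iteration cost and storage small.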


 Talk in: Organized Session Tue.C.22 Coordinate descent and incremental gradient methods for nonsmooth optimization
 Cluster: Convex and nonsmooth optimization

