ICCOPT 2013 Plenary Session Tue.P.AA, Auditorium A, Tuesday, July 30, 09:00-10:00

 Speaker: Yurii Nesterov, CORE, UCL, Belgium
 Title: Subgradient methods for huge-scale optimization problems
 Chair: Katya Scheinberg


 Abstract:

We consider a new class of huge-scale problems: problems with sparse subgradients. The most important functions of this type are piecewise linear. For optimization problems with uniform sparsity of the corresponding linear operators, we suggest a very efficient implementation of subgradient iterations whose total cost depends logarithmically on the dimension. This technique is based on a recursive update of the results of matrix/vector products and of the values of symmetric functions. It works well, for example, for matrices with few nonzero diagonals and for max-type functions. We show that the updating technique can be efficiently coupled with the simplest subgradient methods: the unconstrained minimization method of Polyak and the constrained minimization scheme of Shor. Similar results can be obtained for a new nonsmooth random variant of a coordinate descent scheme. We also present promising results of preliminary computational experiments and discuss an extension of this technique to conic optimization problems.
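To illustrate the kind of recursive update the abstract describes, here is a minimal sketch (the data layout and all names, such as MaxTree and sparse_step, are illustrative assumptions, not from the talk): it maintains y = A x and the max-type value f(x) = max_i (A x)_i incrementally, so that a subgradient step changing only a few coordinates of x costs time proportional to the nonzeros touched plus a logarithmic factor, rather than the full dimension.

```python
import math


class MaxTree:
    """Segment tree maintaining max over y with O(log m) point updates."""

    def __init__(self, y):
        self.m = len(y)
        self.size = 1 << math.ceil(math.log2(max(self.m, 1)))
        self.tree = [-math.inf] * (2 * self.size)
        for i, v in enumerate(y):
            self.tree[self.size + i] = v
        for i in range(self.size - 1, 0, -1):
            self.tree[i] = max(self.tree[2 * i], self.tree[2 * i + 1])

    def update(self, i, v):
        # Overwrite leaf i, then refresh the path to the root.
        i += self.size
        self.tree[i] = v
        i //= 2
        while i:
            self.tree[i] = max(self.tree[2 * i], self.tree[2 * i + 1])
            i //= 2

    def max(self):
        return self.tree[1]


def sparse_step(cols, y, tree, x, delta):
    """Apply x += delta (delta sparse, a dict j -> d_j) and refresh y = A x.

    cols[j] holds the sparse column j of A as (row, value) pairs, so the
    cost is proportional to the nonzeros touched, not to the dimension.
    """
    for j, d in delta.items():
        x[j] += d
        for i, a_ij in cols[j]:
            y[i] += a_ij * d
            tree.update(i, y[i])


# Tiny demo with a 3x2 matrix A stored by columns (hypothetical data).
cols = {0: [(0, 1.0), (2, 2.0)], 1: [(1, -1.0), (2, 0.5)]}
x = [0.0, 0.0]
y = [0.0, 0.0, 0.0]
tree = MaxTree(y)
sparse_step(cols, y, tree, x, {0: 1.0})  # touches only column 0
print(tree.max())                        # current value of max_i (A x)_i
```

The point of the segment tree is that after a sparse change to y, the maximum is refreshed in logarithmic time per touched entry instead of being recomputed over all m rows, which is the source of the logarithmic per-iteration cost mentioned in the abstract.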


 Short Bio:

Yurii Nesterov is a professor at the Catholic University of Louvain (UCL), Belgium, where he is a member of the Center for Operations Research and Econometrics (CORE). He is the author of 4 monographs and more than 80 refereed papers in leading optimization journals. He was awarded the 2000 Dantzig Prize by SIAM and the Mathematical Programming Society (for research having a major impact on the field of mathematical programming), the 2009 John von Neumann Theory Prize by INFORMS, the 2010 Charles Broyden Prize (for the best paper in the journal Optimization Methods and Software), and the Honorable Francqui Chair (University of Liège, 2011-2012). The main direction of his research is the development of efficient numerical methods for convex and nonconvex optimization problems, supported by a global complexity analysis. His most important results concern general interior-point methods (the theory of self-concordant functions), fast gradient methods (the smoothing technique), and the global complexity analysis of second-order schemes (cubic regularization of Newton's method).

