Friday, December 7, 2007
3:30 pm, MC 5158

Tutte Seminar Series
Combinatorics & Optimization
Fall 2007

Yurii Nesterov
Catholic University of Louvain

Gradient Methods for Minimizing Composite Objective Function

In this talk we present several methods for solving optimization problems whose objective function is the sum of two convex terms: one is smooth and given by a black-box oracle, while the other is general but simple, with known structure. It turns out that, despite the bad properties of the sum, such problems can be solved with the efficiency typical of the good part of the objective. For these problems, we consider primal and dual variants of the gradient method, which converge as $O\left({1 \over k}\right)$, and an accelerated multistep version, which converges as $O\left({1 \over k^2}\right)$, where $k$ is the iteration counter. For all methods, we present very efficient "line search" procedures and show that the additional computational work needed to estimate the unknown problem-class parameters at most doubles the complexity of each iteration.
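The scheme the abstract describes can be illustrated on a standard composite instance: a smooth quadratic plus an $\ell_1$ term, minimized by a proximal-gradient iteration with a doubling backtracking line search and an optional accelerated (multistep) update. The sketch below is illustrative only, not the speaker's exact algorithms; the function names, the test problem, and the specific momentum rule (a FISTA-style update) are assumptions for the example.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: the "simple" part handled exactly.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(A, b, lam, iters=500, L0=1.0, accelerated=True):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 (smooth term + simple term)."""
    f = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2   # smooth part (oracle)
    grad = lambda z: A.T @ (A @ z - b)
    x = np.zeros(A.shape[1])
    y = x.copy()          # extrapolation point (equals x when not accelerated)
    t, L = 1.0, L0        # momentum counter and current Lipschitz estimate
    for _ in range(iters):
        g, fy = grad(y), f(y)
        # Backtracking line search: double L until the quadratic upper
        # model at y majorizes f at the candidate point.
        while True:
            x_new = soft_threshold(y - g / L, lam / L)
            d = x_new - y
            if f(x_new) <= fy + g @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= 2.0
        if accelerated:
            # Multistep (momentum) update: O(1/k^2) rate in theory.
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new     # plain gradient scheme: O(1/k) rate
        x = x_new
        L /= 2.0          # let the step size grow again next iteration
    return x
```

Because each backtracking round at most doubles `L` and the estimate is relaxed again afterwards, the extra oracle calls spent probing the unknown Lipschitz constant stay within a constant factor of the base iteration cost, mirroring the "at most doubles the complexity" claim. For `A` the identity, the method reduces to exact soft-thresholding of `b`.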