Class 1, C&O 466/666

  1. Introduction (Chapter 1)
    1. Mathematical Formulation: example, level sets
    2. Definitions: local and global optima
    3. Definitions: convexity
  2. Fundamentals of Unconstrained Optimization (Chapter 2)
    1. Example of nonlinear least squares data fitting
    2. Definitions: gradient, Hessian, Taylor's Theorem
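
  As a note on the data-fitting example above, it can be written in the standard nonlinear least squares form (a standard formulation, not copied from the course notes):

      \min_{x \in \mathbb{R}^n} f(x) = \tfrac{1}{2} \sum_{j=1}^{m} r_j(x)^2,
      \qquad r_j(x) = y_j - \phi(t_j; x),

  where (t_j, y_j) are the data points and \phi is the model being fitted.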

Class 2, C&O 466/666

  1. Fundamentals of Unconstrained Optimization continued... (Chapter 2)
    1. Recognizing Solutions
      1. Definitions: gradient, Hessian, Taylor's Theorem, order notation (big and little O)
      2. directional derivative, curvature, linear model, direction of steepest descent
      3. first and second order necessary optimality conditions
      4. second order sufficient optimality conditions
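
  For reference, the optimality conditions above can be summarized as follows (standard statements for smooth f):

      \nabla f(x^*) = 0 \quad \text{(first-order necessary)};
      \qquad \nabla f(x^*) = 0,\ \nabla^2 f(x^*) \succeq 0 \quad \text{(second-order necessary)};
      \qquad \nabla f(x^*) = 0,\ \nabla^2 f(x^*) \succ 0 \ \Rightarrow\ x^* \text{ strict local minimizer (second-order sufficient)}.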

Class 3, C&O 466/666

  1. Overview of Algorithms
    1. Importance of convex functions and sets for global optima
    2. line search and trust region methods
      1. line search: steepest descent, Newton's method (scale invariance)
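
  A minimal Python sketch contrasting the two search directions above; the quadratic test function and all names are illustrative assumptions, not the course's code:

      import numpy as np

      # f(x) = 0.5 x'Ax - b'x with an ill-conditioned A (illustrative choice)
      A = np.array([[10.0, 0.0], [0.0, 1.0]])
      b = np.array([1.0, 1.0])
      grad = lambda x: A @ x - b                     # gradient of f
      hess = lambda x: A                             # Hessian of f (constant here)

      x = np.array([2.0, 2.0])
      p_sd = -grad(x)                                # steepest descent direction
      p_newton = -np.linalg.solve(hess(x), grad(x))  # Newton direction
      # For this quadratic, x + p_newton is the exact minimizer in one step,
      # illustrating the scale invariance of Newton's method.
      print(p_sd, p_newton, x + p_newton)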

Class 4, C&O 466/666

  1. Overview of Algorithms continued...
    1. quasi-Newton methods, trust region methods (comparisons)
    2. scaling
    3. rates of convergence
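
  For reference, the standard definitions behind item 3, in terms of the error e_k = ||x_k - x^*||:

      \text{Q-linear: } e_{k+1} \le r\, e_k,\ r \in (0,1); \qquad
      \text{Q-superlinear: } e_{k+1}/e_k \to 0; \qquad
      \text{Q-quadratic: } e_{k+1} \le M\, e_k^2.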

Class 5, C&O 466/666

  1. Line Search Methods (Chapter 3), iterations of type x_{k+1} = x_k + alpha_k p_k, where p_k is the search direction and alpha_k is the step length
    1. Step Length
      1. Wolfe conditions (geometric interpretations): sufficient decrease
         (step is not too large) and curvature condition (step is not too
         small); see the line search sketch after this list
      2. Lemma 3.1 (existence of step lengths)
    2. Convergence of Line Search Methods
      1. Theorem 3.2 (cos theta geometric interpretation)
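
  A minimal Python sketch of a backtracking search enforcing the sufficient decrease (Armijo) part of the Wolfe conditions, with the curvature condition checked for illustration; the constants and names are assumptions, not the course's implementation:

      import numpy as np

      def backtracking_wolfe(f, grad, x, p, c1=1e-4, c2=0.9, alpha=1.0, rho=0.5):
          """Shrink alpha until sufficient decrease holds; also report
          whether the curvature condition holds at the accepted alpha."""
          fx, slope = f(x), grad(x) @ p     # slope = directional derivative (< 0)
          while f(x + alpha * p) > fx + c1 * alpha * slope:
              alpha *= rho                  # step too large: not enough decrease
          curvature_ok = grad(x + alpha * p) @ p >= c2 * slope
          return alpha, curvature_ok        # curvature rules out too-small steps

      # usage on a simple quadratic, along the steepest descent direction
      f = lambda x: 0.5 * x @ x
      grad = lambda x: x
      x0 = np.array([3.0, -4.0])
      print(backtracking_wolfe(f, grad, x0, -grad(x0)))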

Class 6, C&O 466/666

  1. Line Search Methods continued ... (Chapter 3)
    1. Rate of Convergence
      1. Theorems 3.3, 3.4 (linear convergence rate for steepest descent; dependence on the condition number of the Hessian)
      2. Theorem 3.5 (Dennis-Moré condition: a quasi-Newton method's Hessian approximation needs (asymptotic) accuracy along the search directions for superlinear convergence)
      3. Theorem 3.7 (with proof! quadratic convergence of Newton's method near the optimum); numerical illustration below
      Reference (for your interest only): Yixun Shi, Globally convergent algorithms for unconstrained optimization, Comput. Optim. Appl. 16 (2000), no. 3, 295-308.
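
  A small numerical illustration (an assumed example, not from the notes) of the quadratic convergence in Theorem 3.7: Newton's method on f(x) = exp(x) - 2x, whose unique minimizer is x* = ln 2; the error is roughly squared at each iteration:

      import math

      x, x_star = 2.0, math.log(2.0)
      for k in range(6):
          print(f"k={k}  error={abs(x - x_star):.3e}")
          x -= (math.exp(x) - 2.0) / math.exp(x)  # x_{k+1} = x_k - f'(x_k)/f''(x_k)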

Class 7, C&O 466/666

  1. Discussion of the derivative as a linear operator
  2. Line Search Methods continued ... (Chapter 3)
    1. Theorem 3.7 (with proof! quadratic convergence of Newton's method near the optimum)
  3. Trust Region Methods (Chapter 4)
    1. basic outline; schematic loop below
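
  A schematic Python loop for the basic outline; everything here is an assumed sketch, and solve_subproblem is a hypothetical placeholder for any approximate subproblem solver (e.g. the Cauchy point of Class 8):

      import numpy as np

      def trust_region(f, grad, hess, x, solve_subproblem,
                       delta=1.0, delta_max=10.0, eta=0.15, tol=1e-8):
          while np.linalg.norm(grad(x)) > tol:
              g, B = grad(x), hess(x)
              p = solve_subproblem(g, B, delta)      # step with ||p|| <= delta
              pred = -(g @ p + 0.5 * p @ B @ p)      # model decrease m(0) - m(p)
              rho = (f(x) - f(x + p)) / pred         # actual vs. predicted decrease
              if rho < 0.25:
                  delta *= 0.25                      # poor model: shrink the region
              elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
                  delta = min(2 * delta, delta_max)  # good boundary step: expand
              if rho > eta:
                  x = x + p                          # otherwise the step is rejected
          return x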

Class 8, C&O 466/666

  1. Trust Region Subproblem
    1. secular equation, optimality conditions, scaling, Cauchy point
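
  A minimal Python sketch of the Cauchy point for the subproblem min g'p + (1/2) p'Bp subject to ||p|| <= Delta, following the standard closed form; names are illustrative:

      import numpy as np

      def cauchy_point(g, B, delta):
          gnorm = np.linalg.norm(g)
          gBg = g @ B @ g
          if gBg <= 0:
              tau = 1.0              # nonpositive curvature: step to the boundary
          else:
              tau = min(gnorm**3 / (delta * gBg), 1.0)
          return -tau * (delta / gnorm) * g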

Class 9, C&O 466/666

  1. Conjugate Direction Methods
    1. Solving Ax = b for A positive definite; Expanding Subspace Theorem (Theorem 5.2), with proof
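
  For reference, the theorem's content in symbols (standard statement): after k steps of the conjugate direction method with directions p_0, ..., p_{k-1},

      r_k^T p_i = 0 \ \ (i = 0, \dots, k-1), \qquad
      x_k \ \text{minimizes} \ \tfrac{1}{2} x^T A x - b^T x \ \text{over} \ x_0 + \mathrm{span}\{p_0, \dots, p_{k-1}\}.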

Class 10, C&O 466/666

  1. Conjugate Gradient Methods
    1. Solving Ax = b for A positive definite. Practical form. Rates of convergence (clustering of eigenvalues). Preconditioning.
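
  A minimal Python sketch of the practical form of CG for Ax = b with A symmetric positive definite; the tolerance and names are assumptions:

      import numpy as np

      def conjugate_gradient(A, b, x=None, tol=1e-10):
          x = np.zeros(len(b)) if x is None else x
          r = A @ x - b                         # residual
          p = -r                                # initial search direction
          for _ in range(len(b)):
              if np.linalg.norm(r) < tol:
                  break
              Ap = A @ p
              alpha = (r @ r) / (p @ Ap)        # exact line search along p
              x = x + alpha * p
              r_new = r + alpha * Ap
              beta = (r_new @ r_new) / (r @ r)  # makes the new p A-conjugate to p
              p = -r_new + beta * p
              r = r_new
          return x

      # usage: two eigenvalue clusters, so CG converges in about two iterations
      A = np.diag([1.0, 1.0, 1.0, 100.0])
      b = np.ones(4)
      print(conjugate_gradient(A, b))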

Class 11, C&O 466/666

  1. Conjugate Direction Methods cont...
    1. Gauss-Newton method for nonlinear least squares: properties of the GN direction, Theorem 10.1, convergence rate
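
  An assumed sketch of a single Gauss-Newton step for min (1/2)||r(x)||^2: the GN direction solves the linear least squares problem min_p ||J p + r||, equivalently the normal equations (J'J) p = -J'r; names are illustrative:

      import numpy as np

      def gauss_newton_step(residual, jacobian, x):
          r, J = residual(x), jacobian(x)
          # a least squares solve is more stable than forming J'J explicitly
          p, *_ = np.linalg.lstsq(J, -r, rcond=None)
          return x + p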

Class 12, C&O 466/666

  1. Nonlinear Equations
    1. Newton's Method, derivation using linearization, Theorem 11.2 (with proof), convergence rate, inexact Newton method, merit functions; see the sketch after this list
    2. Continuation (homotopy) methods (general outline only)
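
  A minimal sketch of the Newton iteration from item 1 (assumed code, not the course's), tracking the merit function (1/2)||F(x)||^2:

      import numpy as np

      def newton_system(F, J, x, tol=1e-10, max_iter=50):
          for _ in range(max_iter):
              Fx = F(x)
              print("merit:", 0.5 * Fx @ Fx)      # merit function, for monitoring
              if np.linalg.norm(Fx) < tol:
                  break
              x = x + np.linalg.solve(J(x), -Fx)  # linearization: J(x) p = -F(x)
          return x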

Class 13, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12
    1. Definitions: constrained program, feasible set Omega, tangent cone of Omega at x, polar cone (dual cone).
    2. Geometric necessary conditions for optimality (extension of Fermat's theorem) for min f(x) s.t. x in Omega
    3. Geometric characterization of optimality for the convex case, min f(x) s.t. x in Omega, where f is a convex function and Omega is a convex set; both conditions are stated after this list
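
  For reference, the two conditions in symbols (standard statements; T_Omega(x^*) is the tangent cone and \circ denotes the polar):

      \text{necessary: } \nabla f(x^*)^T d \ge 0 \ \ \forall\, d \in T_\Omega(x^*)
      \iff -\nabla f(x^*) \in \big(T_\Omega(x^*)\big)^{\circ};

      \text{convex case: } x^* \text{ optimal} \iff \nabla f(x^*)^T (x - x^*) \ge 0 \ \ \forall\, x \in \Omega.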

Class 14, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12, cont...
    1. Definitions: local vs global minimum, smoothing problems by adding constraints, active constraints

Class 15, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12, cont...
    (with basic convex analysis added)
    1. Basic Hyperplane Separation Theorem (with proof)
    2. Lemma: K is a closed convex cone if and only if K = K^{++} (for the derivation of Farkas' Lemma)
    3. Linearizing Cone, Cone of Gradients, Weakest Constraint Qualification, Karush-Kuhn-Tucker (KKT) conditions; statement below
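
  For reference, the KKT conditions in standard form, for min f(x) s.t. c_i(x) = 0 (i in E) and c_i(x) >= 0 (i in I):

      \nabla f(x^*) = \sum_{i \in E \cup I} \lambda_i^* \nabla c_i(x^*), \qquad
      c_i(x^*) = 0 \ (i \in E), \quad c_i(x^*) \ge 0 \ (i \in I),
      \lambda_i^* \ge 0 \ (i \in I), \qquad \lambda_i^* c_i(x^*) = 0 \ (i \in E \cup I).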

Class 16, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12, cont...
    (with basic convex analysis/duality added)
    1. Basic separation theorem
    2. Lemma: K is a closed convex cone iff K = K^{++}, with proof using the separation theorem
    3. derivation of Farkas' Lemma (using above lemma)
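
  For reference, the classical statement being derived (standard form): exactly one of the two systems has a solution,

      \text{(I)}\ \ Ax = b,\ x \ge 0; \qquad\qquad
      \text{(II)}\ \ A^T y \le 0,\ b^T y > 0.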

Class 17, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12, cont...
    (with basic convex analysis/duality added)
    1. generalized Farkas' Lemma (with respect to a cone ordering), with examples, e.g. semidefinite programming
    2. examples where Farkas' Lemma fails

Class 18, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12, cont...
    (with basic convex analysis/duality added)
    1. Lagrange Multiplier Theorem (with proof) for the generalized convex program
    2. dual of (generalized) convex programs (derivation using game theory)
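
  For reference, the game-theoretic derivation in item 2 can be summarized as follows (standard statements, for the convex program min f(x) s.t. g(x) <= 0):

      L(x, \lambda) = f(x) + \lambda^T g(x), \qquad
      \text{primal: } \min_x \max_{\lambda \ge 0} L(x, \lambda), \qquad
      \text{dual: } \max_{\lambda \ge 0} \min_x L(x, \lambda) =: \max_{\lambda \ge 0} q(\lambda),

  with weak duality q(lambda) <= f(x) for every feasible pair, and equality under a constraint qualification (e.g. Slater's condition).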

Class 19, C&O 466/666

  1. Theory of Constrained Optimization, Chap. 12, cont...
    (with basic convex analysis/duality added)
    1. Examples using KKT conditions
    2. Examples of duals of convex programs
    3. sensitivity interpretation of Lagrange multipliers for convex programs

Class 20, C&O 466/666

  1. Optimality and Duality Continued...
    1. Examples of LP and convex QP duals in finite/infinite dimensions
      • Comparison of examples in Hilbert and Banach spaces: the Lagrange multiplier acts via an inner product when the image of the constraint is an inner-product space; the action is still a bilinear form when the Lagrange multiplier lies in the dual of a normed space. (The dual of C[0,1] is BV[0,1]. The optimal value in Example 3 is attained by a discontinuous function in (22), but the value can be approximated as closely as desired by a continuous function. This does not contradict the fact that C[0,1] is a Banach space.)

Class 21, C&O 466/666

  1. Optimality and Duality Continued...
    1. Examples of duality gaps in convex programs
    2. First and second order optimality conditions for general NLPs
  2. Basics of Linear Programming

Class 22, C&O 466/666

  1. Zoutendijk's Feasible Directions Method, Penalty Function Methods
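
  A schematic Python sketch of the quadratic penalty approach (assumptions throughout, including the use of scipy.optimize.minimize as the inner solver): replace min f(x) s.t. c(x) = 0 by unconstrained minimization of Q(x; mu) = f(x) + (mu/2) ||c(x)||^2 with increasing mu:

      import numpy as np
      from scipy.optimize import minimize   # assumed available as inner solver

      def quadratic_penalty(f, c, x0, mu=1.0, mu_factor=10.0, outer_iters=6):
          x = np.asarray(x0, dtype=float)
          for _ in range(outer_iters):
              Q = lambda z, mu=mu: f(z) + 0.5 * mu * np.sum(np.asarray(c(z))**2)
              x = minimize(Q, x).x          # inner unconstrained minimization
              mu *= mu_factor               # tighten the penalty
          return x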

Class 23, C&O 466/666

  1. Optimality Conditions and Duality for the Trust Region Subproblem
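
  For reference, the characterization in question (standard statement): p^* solves min g'p + (1/2) p'Bp s.t. ||p|| <= Delta if and only if there is a lambda >= 0 with

      (B + \lambda I)\, p^* = -g, \qquad
      \lambda\, (\Delta - \|p^*\|) = 0, \qquad
      B + \lambda I \succeq 0.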

Class 24, C&O 466/666

  1. Primal-Dual Interior-Point Methods for Linear (and Semidefinite) Programming (via the log-barrier problem); central path equations below
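
  For reference, the log-barrier connection in standard form (for the LP min c'x s.t. Ax = b, x >= 0): the barrier problem min c'x - mu * sum_j log x_j s.t. Ax = b has optimality conditions

      Ax = b, \qquad A^T y + s = c, \qquad XSe = \mu e, \qquad (x, s) > 0,

  where X = diag(x), S = diag(s), and e is the all-ones vector; these define the central path, and primal-dual methods apply Newton's method to this system while driving mu to 0.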