Class 2

  1. Overview of Algorithms
    1. Importance of convex functions and sets: for convex problems, every local optimum is a global optimum
    2. line search and trust region methods
      1. line search: steepest descent, Newton's method (scale invariance); see the sketch after this list
    3. quasi-Newton methods, trust region methods (comparisons)
    4. scaling (poor scaling slows steepest descent; Newton's method is invariant under rescaling of the variables)
    5. rates of convergence
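
A minimal sketch (illustrative, not from the notes) contrasting the two search
directions listed under the overview above: on a quadratic f(x) = 0.5 x'Qx with
an ill-conditioned Q, steepest descent (p = -grad f) crawls at a linear rate
governed by the condition number, while the Newton step (p = -H^{-1} grad f) is
scale invariant and solves a quadratic exactly in one step. Q and the starting
point here are made-up values.

  import numpy as np

  # Minimize f(x) = 0.5 x^T Q x with an ill-conditioned Q (kappa = 100).
  Q = np.diag([1.0, 100.0])
  grad = lambda x: Q @ x

  x_sd = np.array([1.0, 1.0])
  for k in range(10):
      g = grad(x_sd)
      alpha = (g @ g) / (g @ Q @ g)   # exact line search step on a quadratic
      x_sd = x_sd - alpha * g         # steepest descent: zigzags slowly

  x_nt = np.array([1.0, 1.0])
  x_nt = x_nt - np.linalg.solve(Q, grad(x_nt))   # one full Newton step

  print("||x|| after 10 steepest descent steps:", np.linalg.norm(x_sd))
  print("||x|| after 1 Newton step:            ", np.linalg.norm(x_nt))
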
  2. Line Search Methods, of the form x_{k+1} = x_k + alpha_k p_k, where p_k is the search direction and alpha_k is the step length
    1. Step Length
      1. Wolfe conditions (geometric interpretations; stated in LaTeX below)
          sufficient decrease - step is not too large
          curvature condition - step is not too small
      2. existence of step lengths satisfying the Wolfe conditions
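
For reference, the two Wolfe conditions written out (LaTeX notation; the
constants 0 < c_1 < c_2 < 1 are the conventional ones, not fixed by the notes):

  % sufficient decrease (Armijo): rules out steps that are too long
  f(x_k + \alpha_k p_k) \le f(x_k) + c_1 \alpha_k \nabla f_k^{\top} p_k
  % curvature: rules out steps that are too short
  \nabla f(x_k + \alpha_k p_k)^{\top} p_k \ge c_2 \nabla f_k^{\top} p_k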
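
And a minimal sketch of one common way to produce an acceptable step, simple
backtracking, which enforces only the sufficient-decrease condition (the test
function, constants, and starting point are illustrative assumptions, not
taken from the class):

  import numpy as np

  def backtracking(f, grad_f, x, p, alpha0=1.0, rho=0.5, c1=1e-4):
      """Shrink alpha until the sufficient-decrease (Armijo) condition holds."""
      alpha, fx, slope = alpha0, f(x), grad_f(x) @ p   # slope < 0 for a descent p
      while f(x + alpha * p) > fx + c1 * alpha * slope:
          alpha *= rho                                 # step too long: shrink it
      return alpha

  # Example: one steepest descent step on f(x) = x1^2 + 10 x2^2.
  f = lambda x: x[0]**2 + 10.0 * x[1]**2
  grad_f = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
  x = np.array([1.0, 1.0])
  print("accepted step length:", backtracking(f, grad_f, x, -grad_f(x)))
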
    2. Convergence of Line Search Methods - Zoutendijk's condition (cos theta geometric interpretation; recapped after this list)
    3. Rate of Convergence
      1. linear convergence rate for steepest descent - dependence on condition number of Hessian
      2. Dennis-Moré condition - for superlinear convergence, the quasi-Newton Hessian approximation must (asymptotically) match the true Hessian along the search directions
      3. with proof! - quadratic convergence of Newton's method near the optimum (see the numerical demo below)
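
A LaTeX recap of the convergence and rate results above (standard statements;
B_k for the quasi-Newton Hessian approximation and lambda_1 <= ... <= lambda_n
for the eigenvalues of Q are conventional symbols, not notation from the notes):

  % global convergence (Zoutendijk), for Wolfe steps and Lipschitz \nabla f:
  \sum_{k \ge 0} \cos^2\theta_k \, \|\nabla f_k\|^2 < \infty,
  \qquad \cos\theta_k = \frac{-\nabla f_k^{\top} p_k}{\|\nabla f_k\| \, \|p_k\|}

  % steepest descent, exact line search, f(x) = \tfrac{1}{2} x^{\top} Q x - b^{\top} x:
  \|x_{k+1} - x^*\|_Q^2 \le \Big( \frac{\lambda_n - \lambda_1}{\lambda_n + \lambda_1} \Big)^2 \|x_k - x^*\|_Q^2

  % Dennis-More: superlinear convergence holds iff
  \lim_{k \to \infty} \frac{\|(B_k - \nabla^2 f(x^*)) \, p_k\|}{\|p_k\|} = 0

  % Newton near x^* (\nabla^2 f Lipschitz, \nabla^2 f(x^*) positive definite):
  \|x_{k+1} - x^*\| \le C \, \|x_k - x^*\|^2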
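
And a tiny numerical check of the quadratic rate: pure Newton on the made-up
one-dimensional function f(x) = exp(x) - 2x, whose minimizer is x* = ln 2
(function and starting point are illustrative assumptions). The printed errors
roughly square at every step.

  import math

  # Newton iteration x_{k+1} = x_k - f'(x_k) / f''(x_k) for f(x) = exp(x) - 2x.
  fprime  = lambda x: math.exp(x) - 2.0   # f'(x)
  fsecond = lambda x: math.exp(x)         # f''(x) > 0 everywhere

  x_star = math.log(2.0)                  # exact minimizer
  x = 1.5                                 # start reasonably close to x*
  for k in range(6):
      print(f"k={k}  |x - x*| = {abs(x - x_star):.3e}")
      x -= fprime(x) / fsecond(x)         # Newton step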