Class 6



  1. Line Search Methods, continued (Chapter 3)
    1. Rate of Convergence
      1. Theorems 3.3, 3.4 (linear convergence rate of steepest descent; the rate depends on the condition number of the Hessian)
      2. Theorem 3.5 (Dennis-Moré condition - a quasi-Newton method converges superlinearly only if its Hessian approximation agrees, asymptotically, with the true Hessian along the search directions)
      3. Theorem 3.7 (with proof! - quadratic convergence of Newton's method near the optimum)
      Reference (for your interest only): Shi, Yixun, Globally convergent algorithms for unconstrained optimization. Comput. Optim. Appl. 16 (2000), no. 3, 295-308.
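The condition-number dependence in Theorems 3.3 and 3.4 is easy to observe numerically. A minimal sketch (the matrix, starting point, and iteration count below are chosen for illustration): for steepest descent with exact line search on the quadratic f(x) = (1/2) x'Ax, the theorem guarantees f(x_{k+1}) <= ((kappa-1)/(kappa+1))^2 f(x_k), where kappa is the condition number of A.

```python
import numpy as np

# Steepest descent with exact line search on f(x) = 0.5 * x' A x.
# Theorem 3.4's bound: f(x_{k+1}) <= ((kappa-1)/(kappa+1))^2 * f(x_k).
A = np.diag([1.0, 10.0])        # illustrative matrix, condition number kappa = 10
kappa = 10.0
bound = ((kappa - 1.0) / (kappa + 1.0)) ** 2

def f(x):
    return 0.5 * x @ A @ x

x = np.array([10.0, 1.0])       # a worst-case-style start that excites zig-zagging
ratios = []
for _ in range(20):
    g = A @ x                          # gradient of the quadratic
    alpha = (g @ g) / (g @ A @ g)      # exact line-search step length
    x_new = x - alpha * g
    ratios.append(f(x_new) / f(x))     # per-step contraction of f
    x = x_new

print("worst per-step ratio:", max(ratios), "theoretical bound:", bound)
```

With this starting point the per-step ratio sits essentially at the bound, illustrating why steepest descent is slow when the Hessian is ill-conditioned: the contraction factor tends to 1 as kappa grows.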
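The quadratic rate in Theorem 3.7 can likewise be checked on a toy problem. A hedged sketch (the objective exp(x) - x is my choice for illustration, not from the notes): near the minimizer x* = 0 the error should roughly square at each Newton step, i.e. e_{k+1} / e_k^2 approaches a constant.

```python
import math

# Newton's method on f(x) = exp(x) - x, whose minimizer is x* = 0
# (f'(x) = exp(x) - 1, f''(x) = exp(x), f''(0) = 1 > 0).
def newton_step(x):
    # x - f'(x)/f''(x) = x - (exp(x) - 1)/exp(x), written stably via expm1
    return x + math.expm1(-x)

x = 1.0
errors = [abs(x)]
for _ in range(5):
    x = newton_step(x)
    errors.append(abs(x))

# Quadratic convergence: e_{k+1} / e_k^2 settles near f'''(0)/(2 f''(0)) = 1/2
for e_k, e_next in zip(errors[2:], errors[3:]):
    print(e_next / e_k ** 2)
```

The printed ratios stabilize near 0.5 while the errors themselves collapse to machine precision in a handful of steps, in contrast with the fixed linear factor of steepest descent.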