Class 6, C&O 466/666



  1. Line Search Methods, continued (Chapter 3)
    1. Example
        minimize g(x) = \sum_{i=1}^{m} \log\left( e^{a_i^T x - b_i} + e^{-a_i^T x + b_i} \right)
        using Newton's method with backtracking line search. See the MATLAB program; a Python sketch of the same iteration follows below.
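
        The MATLAB program itself is not reproduced in these notes. The following is a minimal Python sketch of the same iteration, under illustrative assumptions: the random data A, b, the starting point, and the parameter values c, rho, tol are mine, not taken from the course code.

          import numpy as np

          def g(x, A, b):
              r = A @ x - b
              # log(e^r + e^(-r)); logaddexp avoids overflow for large |r|
              return np.sum(np.logaddexp(r, -r))

          def grad(x, A, b):
              return A.T @ np.tanh(A @ x - b)

          def hess(x, A, b):
              w = 1.0 / np.cosh(A @ x - b) ** 2          # sech^2(a_i^T x - b_i)
              return A.T @ (w[:, None] * A)

          def newton_backtracking(A, b, x0, c=1e-4, rho=0.5, tol=1e-10, maxit=50):
              x = x0.copy()
              for _ in range(maxit):
                  gr = grad(x, A, b)
                  if np.linalg.norm(gr) < tol:
                      break
                  p = np.linalg.solve(hess(x, A, b), -gr)    # Newton direction
                  alpha = 1.0
                  # backtrack until the Armijo sufficient-decrease condition holds
                  while g(x + alpha * p, A, b) > g(x, A, b) + c * alpha * (gr @ p):
                      alpha *= rho
                  x = x + alpha * p
              return x

          rng = np.random.default_rng(0)                   # hypothetical test instance
          A = rng.standard_normal((20, 5))
          b = rng.standard_normal(20)
          print(g(newton_backtracking(A, b, np.zeros(5)), A, b))
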
    2. Rate of Convergence
      1. Theorems 3.3, 3.4 (linear convergence rate of steepest descent - the rate depends on the condition number of the Hessian; see the numerical illustration after the reference below)
      2. Theorem 3.5 (Dennis-Moré condition - a quasi-Newton method converges superlinearly if and only if its Hessian approximations agree asymptotically with the true Hessian along the search directions; the condition is stated below)
      3. Theorem 3.7 (with proof! - quadratic convergence of Newton's method near the optimum; a numerical illustration appears below)
      Reference (for your interest only): Yixun Shi, Globally convergent algorithms for unconstrained optimization, Comput. Optim. Appl. 16 (2000), no. 3, 295-308.
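
      Illustration for Theorems 3.3/3.4: the sketch below runs steepest descent with exact line search on the quadratic f(x) = (1/2) x^T Q x and compares the observed per-step error ratio, measured in the Q-norm, with the bound (kappa-1)/(kappa+1). The matrices and starting point are illustrative; the start is chosen as a worst case for which the bound is tight.

        import numpy as np

        def sd_exact(Q, x0, iters=40):
            # steepest descent with exact line search on f(x) = 0.5 x'Qx
            # (minimizer x* = 0); records the Q-norm errors ||x_k||_Q
            x, errs = x0.copy(), []
            for _ in range(iters):
                gr = Q @ x
                alpha = (gr @ gr) / (gr @ Q @ gr)    # exact minimizer along -gr
                x = x - alpha * gr
                errs.append(np.sqrt(x @ Q @ x))
            return errs

        for kappa in (10.0, 100.0):
            Q = np.diag([1.0, kappa])                # condition number kappa
            x0 = np.array([1.0, 1.0 / kappa])        # worst-case start for this Q
            e = sd_exact(Q, x0)
            print(kappa, e[-1] / e[-2], (kappa - 1.0) / (kappa + 1.0))
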
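      For reference, the Dennis-Moré condition of Theorem 3.5, with B_k the quasi-Newton Hessian approximation and search direction p_k = -B_k^{-1} \nabla f(x_k): assuming x_k \to x^* where \nabla f(x^*) = 0 and \nabla^2 f(x^*) is positive definite, the convergence is superlinear if and only if

          \lim_{k \to \infty} \frac{\| (B_k - \nabla^2 f(x^*)) p_k \|}{\| p_k \|} = 0 .
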
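      The local quadratic rate of Theorem 3.7 is easy to observe numerically: with pure Newton steps started near the minimizer, each error is roughly proportional to the square of the previous one. A tiny illustration (the test function exp(x) - 2x is mine, not from the text; its minimizer is x* = ln 2):

        import math

        x, xstar = 0.0, math.log(2.0)    # minimize f(x) = exp(x) - 2x
        for k in range(6):
            x -= (math.exp(x) - 2.0) / math.exp(x)    # Newton step -f'(x)/f''(x)
            print(k, abs(x - xstar))      # errors: ~0.31, 0.043, 8.9e-4, 4e-7, ...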