Class 6

Line Search Methods, continued (Chapter 3)

Rate of Convergence

Theorems 3.3, 3.4 (linear convergence rate of steepest descent; its
dependence on the condition number of the Hessian)
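The dependence on the condition number can be seen numerically. Below is a minimal sketch (the quadratic Q and starting point are made-up illustration data, not from the notes): steepest descent with exact line search on f(x) = (1/2) x^T Q x, where the error shrinks by roughly a factor (kappa - 1)/(kappa + 1) per step, kappa = cond(Q).

```python
import numpy as np

# Steepest descent with exact line search on the convex quadratic
# f(x) = 0.5 x^T Q x. Per the linear-rate theorems, the error contracts
# by about (kappa - 1)/(kappa + 1) per iteration, kappa = cond(Q).
# Q and x0 below are made-up example data for illustration.
Q = np.diag([1.0, 10.0])           # Hessian, condition number kappa = 10
x = np.array([10.0, 1.0])          # starting point

for _ in range(50):
    g = Q @ x                      # gradient of f at x
    alpha = (g @ g) / (g @ Q @ g)  # exact minimizing step length
    x = x - alpha * g

print(np.linalg.norm(x))           # small: iterates approach x* = 0
```

With kappa = 10 the contraction factor is about 9/11 per step, so convergence is noticeably slow even on this tiny problem; increasing the ratio of the diagonal entries slows it further.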

Theorem 3.5 (Dennis-Moré condition: a quasi-Newton method's Hessian
approximation needs (asymptotic) agreement with the true Hessian along the
search direction for superlinear convergence)
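In symbols, with B_k the quasi-Newton Hessian approximation and p_k the resulting search direction, the Dennis-Moré condition for superlinear convergence reads:

```latex
\lim_{k \to \infty} \frac{\left\| \bigl(B_k - \nabla^2 f(x^*)\bigr)\, p_k \right\|}{\| p_k \|} = 0
```

Note that B_k need not converge to the true Hessian as a matrix; agreement along the search directions p_k is enough.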

Theorem 3.7 (with proof! quadratic convergence of Newton's method near the
optimum)
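Quadratic convergence means the error is roughly squared at each step once the iterate is near the optimum. A minimal sketch (the objective f(x) = exp(x) - 2x and starting point are made-up example data, not from the notes):

```python
import math

# Pure Newton's method minimizing f(x) = exp(x) - 2x, whose unique
# minimizer is x* = ln 2. Near x*, the error is roughly squared at
# every iteration (quadratic convergence). f and x0 are made-up
# example data for illustration.
def fprime(x):  return math.exp(x) - 2.0   # f'(x)
def fsecond(x): return math.exp(x)         # f''(x) > 0 everywhere

x, xstar = 0.0, math.log(2.0)
errors = []
for _ in range(6):
    x = x - fprime(x) / fsecond(x)         # Newton step with unit step length
    errors.append(abs(x - xstar))

print(errors)  # each error is roughly the square of the previous one
```

After only a handful of iterations the error hits machine precision, which is the practical signature of the quadratic rate.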
Reference (for your interest only):
Shi, Yixun. Globally convergent algorithms for unconstrained optimization.
Comput. Optim. Appl. 16 (2000), no. 3, 295-308.