CO466/666
Winter 2006

Continuous Optimization
Instructor Henry Wolkowicz (MC6065, x5589)

  • Time: 1:00-2:20PM TTh
  • Location: RCH206
  • Office Hours: Mon 1-2 and Fri 11-12
  • Final Exam: Mon. Apr 10, 12:30-3:00PM, in RCH 209
  • latest grades


  • Text: Numerical Optimization, by Jorge Nocedal and Stephen Wright, Springer Verlag, 1999 (on 3-hour reserve; call number QA402.5 .N62 1999). Please look at the text and send me any typos/comments; more typos are listed at the authors' website.
  • Further References
  • Marking Scheme: HW 50%; Final 50%
  • HOMEWORK: HW1; HW2; HW3; HW4; HW5; HW6 (optional);
    1. Homework #1 (Unconstrained Optimization)
      Due: Thur Jan12, 1PM.
      Reading
      Problems
      1. Go to the NEOS Solvers web page. Use the unconstrained optimization package NMTR to solve the extended Rosenbrock function (see page 248 in the text) with n=48 variables, for the parameter values alpha=10 and alpha=250. (A sketch of the function itself appears after this problem list.)
        Hand in (or email) a copy of the outputs you get. In each case, point out the solution and the optimal value.
      2. Pages 30-31, Problems 2.1, 2.3, 2.7, 2.9.
      3. Refraction Law: Let x and y be two points in the plane that lie on opposite sides of the horizontal axis. Assume that light travels at speed v on the side of the axis containing x and at speed w on the side containing y, and that light travels between points along paths of minimum total travel time. Find the path that a ray of light follows from x to y. (A hint on setting up the minimization appears after this problem list.)
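      For problem 1, it may help to have the function available locally as a cross-check on the NEOS output. Here is a minimal MATLAB sketch of the extended Rosenbrock function in its usual pairwise form; the name extrosen is my own choice, and the exact definition should be confirmed against page 248 of the text.
        % extrosen.m: extended Rosenbrock function in the usual pairwise
        % form (a sketch; confirm against page 248 of the text).
        function f = extrosen(x, alpha)
        n  = length(x);           % n must be even (n = 48 in problem 1)
        xo = x(1:2:n-1);          % odd-indexed variables
        xe = x(2:2:n);            % even-indexed variables
        f  = sum( alpha*(xe - xo.^2).^2 + (1 - xo).^2 );
      For example, extrosen(ones(48,1), 10) returns 0, the value at the all-ones point.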
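      For problem 3, one possible setup (the coordinates are my own choice): put x = (0,a) with a > 0, put y = (c,-b) with b > 0, and let the ray cross the axis at the point (z,0). The travel time to be minimized over z is then

        T(z) = \frac{\sqrt{a^2 + z^2}}{v} + \frac{\sqrt{(c-z)^2 + b^2}}{w},

      and setting T'(z) = 0 characterizes the optimal crossing point.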
      Comments on Solutions

    2. Homework #2 (Unconstrained Optimization)
      Due: Thurs Jan 26, 1PM
      Reading
      Problems
      1. MATLAB
        1. local MATLAB tutorial (video based)
        2. Matlab appears to be on all the Solaris systems in the undergrad environments, e.g. cpu01.student.math and magnus. You may have to see Lori Suess in MFCF to have your password set properly. The path is:
          /.software/local/.admin/bins/bin/matlab
          The default should now be version matlab-6.5; if not, try the path software/matlab-6.5/bin/matlab, or use "setenv PATH `/bin/showpath matlab-6.5 current`".
          Enter the command matlab to start up the matlab session. Then enter
          help optim
          You will see a list of the optimization functions that are available with the optimization toolkit.
        3. Now try the command
          optdemo
          This will bring up a menu. You can try the four choices. The first choice is a tutorial. This demonstrates several of the optimization functions.
        4. The second choice in the optdemo is the minimization of the banana function, or Rosenbrock's function. This is a classical example of a function with a narrow valley that exhibits "very" slow convergence for steepest-descent-type methods. Try out several methods to minimize this function.
          1. Submit contour and surface plots of the banana function for variable values between -15 and +15. (Use the matlab commands: help plot; or: demo. A plotting sketch also appears after the COMMENTS list below.)
          2. How many function evaluations did the minimization take for: steepest descent; simplex search; Broyden-Fletcher-Goldfarb-Shanno; Davidon-Fletcher-Powell; Levenberg-Marquardt? (Please specify the line search you used.)
        Note: You can modify the matlab programs. You can see what the matlab routines are doing by looking at the matlab m-files. To do this change directory using: cd /software; cd matlab; cd distribution; cd toolbox; cd optim. In particular, there is an m-file called bandemo.m.
      2. Consider the minimization of a quadratic function q(x) with positive definite Hessian matrix Q. Suppose that the function is scaled using the diagonal matrix formed from the inverse square roots of the diagonal elements of Q, i.e. x is replaced by D^{-1/2}x, where D is the diagonal matrix with Q_ii on the diagonal. Show that if the dimension is n=2, then this scaling reduces the condition number of the Hessian and so improves the convergence rate of steepest descent. (This need not be true if n>2. A numerical illustration appears after the COMMENTS list below.)
      3. Pages 62-63, Problems 3.2, 3.3, 3.6, 3.8
      4. COMMENTS
        • Here is a MATLAB code that shows a mesh plot of the extended Rosenbrock function and a plot of the function in the direction of steepest descent at the starting point (-1,-1).
        • Here is a MATLAB file with Cauchy's steepest descent for a quadratic function that illustrates both the worst and best case behaviour;
        • and here is another simple matlab code for Cauchy's steepest descent with a quadratic function. One can easily change the condition number and see the change in behaviour.
        • and here is another simple matlab code for Newton's method with a cubic function. One can change the starting point to see the change in the first iteration.
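      For the plotting item in problem 1, here is a minimal MATLAB sketch; the log scaling of the contour levels is my own choice, to make the narrow valley visible:
        % Contour and surface plots of the banana (Rosenbrock) function
        % on the square [-15,15]^2.
        [X,Y] = meshgrid(-15:0.25:15);
        F = 100*(Y - X.^2).^2 + (1 - X).^2;
        figure; surf(X, Y, F); shading interp;      % surface plot
        figure; contour(X, Y, log(1 + F), 30);      % contours (log-scaled)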
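      For problem 2, a quick numerical check (not a substitute for the proof); the matrix Q below is an arbitrary example of mine:
        % Condition number before and after the diagonal scaling
        % D^{-1/2} Q D^{-1/2} of problem 2, for n = 2.
        Q  = [100 3; 3 2];
        Dm = diag(1./sqrt(diag(Q)));    % D^{-1/2}
        Qs = Dm * Q * Dm;               % scaled Hessian (unit diagonal)
        [cond(Q), cond(Qs)]             % the second entry is smaller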
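      In the same spirit as the steepest-descent files listed above, here is a self-contained sketch (Q, the starting point, and the iteration limit are my own choices); changing the diagonal of Q changes the condition number and, with it, the zig-zag behaviour:
        % Cauchy (exact line search) steepest descent on
        % q(x) = 0.5*x'*Q*x, for a 2x2 positive definite Q.
        Q = diag([1 25]);                 % condition number 25
        x = [5; 1];                       % starting point
        for k = 1:30
            g = Q*x;                      % gradient of q
            if norm(g) < 1e-10, break; end
            t = (g'*g)/(g'*Q*g);          % exact (Cauchy) step length
            x = x - t*g;
        end
        fprintf('%d iterations, ||x|| = %.2e\n', k, norm(x))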
      Comments on Solutions
    3. Homework #3 (BFGS and Newton; Line Search and Trust Region Techniques)
      Due: Tuesday February 14, 2006 (at start of class)
      Reading
      • Chapter 4 in the text (pages 64-97)
      Problems
      1. MATLAB
        Solve problem 3.9 on page 63 of the text, but do not program the BFGS algorithm yourself; rather, use the matlab function for BFGS. You can find it by running optdemo and using the information there, e.g. lines like
        [x,fval,exitflag,output] = fminunc({f,GRAD},x,OPTIONS);
        Output the values to verify that they indeed stay positive. (A hedged calling sketch appears after this problem list.)
        Note: You can modify the matlab programs. You can see what the matlab routines are doing by looking at the matlab m-files. To do this change directory using: cd /software; cd matlab; cd distribution; cd toolbox; cd optim. In particular, there is an m-file called bandemo.m as well as the file fminunc.m
      2. Page 97, Problem 4.1 ( see this MATLAB file for drawing the curve in the problem); Page 99, Problem 4.11.
      3. Derive the algorithm for the Newton step used in the More-Sorensen algorithm for the trust region subproblem, as presented in class on Feb. 3, 2006, i.e. in equation (4.27) on page 82. (A sketch of the unsafeguarded iteration appears after this problem list.)
          The following MATLAB files might be of interest:
        • gentrs.m, generates a random TRS of size n;
        • plotrs.m, finds the optimal lambda using a plot of ||p(lambda)||;
        • tplottrs.m, finds the optimal lambda using a plot of 1/||p(lambda)||;
        • iterN.m finds the optimal lambda using Newton's method (but no safeguarding is included!!!)
        • hardcase.m changes g for the hard case and plots the curves
      4. Get the matlab program with the values for m,n,A,b given in the mat file, data.mat (you must save this 'binary' file). Use the files to
        minimize g(x) = sum_{i=1}^m log( e^{a_i^T x - b_i} + e^{-a_i^T x + b_i} )
        using Newton's method with backtracking. Which line search conditions are being guaranteed in this matlab program? What happens if you change the backtracking to guarantee the strong Wolfe conditions? (A self-contained sketch of such a method appears after this problem list.)
        Compare the optimal value with the least squares solution min ||Ax - b||.
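      For the MATLAB part of problem 1, a hedged sketch of the call (the option names are the older optimset style; the helper rosen, returning both f and its gradient, is my own):
        % Minimize the banana function with the toolbox BFGS routine.
        x0   = [-1.2; 1];                         % classical starting point
        opts = optimset('LargeScale','off', ...   % medium-scale algorithm
                        'HessUpdate','bfgs', ...  % BFGS quasi-Newton update
                        'GradObj','on', 'Display','iter');
        [x, fval, exitflag, output] = fminunc(@rosen, x0, opts);

        % rosen.m: banana function and its gradient.
        function [f, g] = rosen(x)
        f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
        g = [ -400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
               200*(x(2) - x(1)^2) ];
      Note that fminunc does not expose its internal BFGS matrices, so verifying positivity as problem 3.9 asks may require instrumenting the m-file fminunc.m mentioned in the Note above.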
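      For problem 3, here is an unsafeguarded sketch of the Newton iteration on the secular equation phi(lambda) = 1/||p(lambda)|| - 1/Delta, in the spirit of iterN.m above; B, g, Delta, and the starting lambda are my own example data:
        % Newton's method for the trust region subproblem parameter,
        % following the Cholesky-based step of equation (4.27).
        B = diag([-2 1 2 3 4]);        % indefinite model Hessian
        g = ones(5,1); Delta = 1;
        lam = 3;                       % must exceed -min(eig(B)) = 2
        for k = 1:10
            R = chol(B + lam*eye(5));  % B + lam*I = R'*R
            p = -(R \ (R' \ g));       % p(lam) = -(B + lam*I)^{-1} g
            q = R' \ p;                % solve R'*q = p
            lam = lam + (norm(p)/norm(q))^2 * (norm(p) - Delta)/Delta;
            fprintf('k = %2d: lambda = %.8f\n', k, lam);
        end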
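      And for problem 4, a self-contained sketch of Newton's method with simple backtracking; random A and b stand in for the data in data.mat, and there is no guard against overflow in the exponentials:
        % Newton's method with backtracking for
        %   g(x) = sum_i log( e^{a_i'x - b_i} + e^{-a_i'x + b_i} ).
        m = 200; n = 20;
        A = randn(m, n); b = randn(m, 1);
        g = @(x) sum(log(exp(A*x - b) + exp(-(A*x - b))));
        x = zeros(n, 1);
        for k = 1:50
            r   = A*x - b;
            grd = A' * tanh(r);                         % gradient
            H   = A' * (repmat(sech(r).^2, 1, n) .* A); % Hessian A'*D*A
            dx  = -(H \ grd);                           % Newton direction
            if grd'*dx > -1e-12, break; end             % stationary
            t = 1;                                      % backtracking:
            while g(x + t*dx) > g(x) + 1e-4*t*(grd'*dx)
                t = t/2;                                % halve the step
            end
            x = x + t*dx;
        end
        fprintf('g(x) = %.6f after %d iterations\n', g(x), k)
      The least squares comparison is then one line: xls = A \ b.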

    4. Homework #4 (TR, NLLS, Nonlinear Equations, Nonlinear Programming)
      Due: Thursday March 2, 2006 (at start of class)
      Reading
      • Chapters 6,10,11 in the text
      Problems
      1. MATLAB (try to use matlab 6 or 7)
        1. Test out the data fitting demo (medium-scale nonlinear data fitting), one of the demos in the optimization toolkit. The example demonstrates fitting a nonlinear function to a set of data. Provide the plots of the data points; then provide the plots, number of iterations, function evaluations, and norm of the residual of the fit when using the Gauss-Newton method (with mixed polynomial interpolation), and also when using unconstrained optimization. (A hedged data-fitting sketch appears after the problems below.)
          Include the algebraic form of the nonlinear function that is used.
      2. Problems from text:
        1. Page 248, Problem 9.10.
        2. Page 274, Problem 10.4.
        3. Page 311, Problem 11.8.
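      For the MATLAB part, a hedged sketch of nonlinear data fitting with lsqcurvefit; the two-exponential model and the synthetic data here are stand-ins of mine, not the demo's actual function:
        % Fit c(1)*exp(c(2)*t) + c(3)*exp(c(4)*t) to noisy data.
        t = (0:0.1:2)';
        y = 3*exp(-1.3*t) + 0.5*exp(-4*t) + 0.05*randn(size(t));
        model = @(c, t) c(1)*exp(c(2)*t) + c(3)*exp(c(4)*t);
        c0 = [2; -1; 1; -2];                       % starting guess
        [c, resnorm, res, flag, output] = lsqcurvefit(model, c0, t, y);
        fprintf('residual norm %.3e, %d iterations\n', ...
                sqrt(resnorm), output.iterations)
        plot(t, y, 'o', t, model(c, t), '-')       % data and fitted curve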

    5. Homework #5 (Constrained Nonlinear Programming and Convex Analysis)
      Due: Tuesday March 20, 2006 (at start of class)
      Reading
      Problems
      1. Constraint Qualifications
        1. Consider the general NLP with both equality and inequality constraints.
          1. Define the tangent cones and the linearizing cones at a feasible point x.
          2. Show that the tangent cone is a subset of the linearizing cone.
          3. Is the tangent cone always closed? Why?
          4. Is the linearizing cone always closed? Why?
          5. Give an example, in dimension at least three, of a nonconvex feasible set F and a point x in F for which the tangent cone and the linearizing cone are not the same; for this feasible set and point, give the set of linear objective functions for which x is optimal but the KKT optimality conditions fail.
        2. Consider the NLP:
            minimize f(x,y,z) = e^x + e^{-y+z}
            subject to:
              g1(x,y,z) = e^x - 1 <= 0
              g2(x,y,z) = e^{-y} - 1 <= 0
              g3(x,y,z) = (x-1)^2 + y^2 - 1 <= 0
              g4(x,y,z) = x^2 + y^2 + e^{-z} - 1 <= 0
            1. Verify that the NLP is a convex program.
            2. Use the KKT conditions to test for optimality at the origin (0,0,0)^T.
            3. Use the Rockafellar/Pshenichnyi conditions (polar of the tangent cone condition) to test for optimality at (0,0,0)^T.
            4. Is there a contradiction between the two tests for optimality? Find the linearizing cone at (0,0,0)^T. Then find a direction in the linearizing cone that is not a tangent direction, and use it to explain the discrepancy.
      2. MATLAB (try to use matlab 7)
        1. Use the constrained optimization function fmincon in the optimization toolbox to solve the above convex program (problem 1.2 above). Can you get high accuracy in the solution? (Start far from the solution, e.g. at (10,10,100).) A calling sketch appears after the problems below.
      3. Problems from text:
        1. Page 359-360, Problems 12.10, 12.17;
        2. Pages 417-418 Problems 14.1,14.6,14.9.
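      For the MATLAB problem, a sketch of the fmincon call for the convex program in 1.2; the deal trick packages the inequality constraints in the [c,ceq] form fmincon expects, and the option names are the older optimset style:
        % Solve the convex program of problem 1.2 with fmincon.
        fobj = @(v) exp(v(1)) + exp(-v(2) + v(3));
        gcon = @(v) deal([ exp(v(1)) - 1;
                           exp(-v(2)) - 1;
                           (v(1)-1)^2 + v(2)^2 - 1;
                           v(1)^2 + v(2)^2 + exp(-v(3)) - 1 ], []);
        v0   = [10; 10; 100];                      % far from the solution
        opts = optimset('Display','iter', 'TolFun',1e-12, 'TolCon',1e-12);
        [v, fv, flag] = fmincon(fobj, v0, [],[],[],[],[],[], gcon, opts);
      Note the huge initial objective value (about e^90 at v0), which makes the early iterations badly scaled.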
      Comments on Solutions for HW5

    6. Homework #6 (OPTIONAL)
      Do NOT Hand IN.
      • Reading
        • Chapters 15,17,18 in the text
      • Problems:
        • Chapter 15: Problem 15.2
        • Chapter 17: Problems 17.2, 17.3
        • Chapter 18: Problem 18.1; also, describe the Maratos effect and one possible "fix" for SQP methods.




  • Last Modified:  Friday 7 April 2006