C&O466/666
Winter 2005

Continuous Optimization
Course Information






Conduct of Course

  • Instructor:
      Professor Henry Wolkowicz, MC 6065, ext. 5589,
  • Text: Numerical Optimization, J. Nocedal and S.J. Wright, Springer, 1999 (on 3-hour reserve; call number: QA 402.5.N62 1999)
  • Additional References:
    1. Nonlinear Programming, by Dimitri P. Bertsekas, 1999, Athena Scientific.
    2. Practical Optimization, P.E. Gill, W. Murray, M.H. Wright.
    3. Numerical Methods for Unconstrained Optimization and Nonlinear Equations, J.E. Dennis and R.B. Schnabel
    4. Nonlinear Programming: Theory and Algorithms, M.S. Bazaraa, H.D. Sherali, C.M. Shetty
    5. Linear and Nonlinear Programming, D.G. Luenberger, Second Edition
    6. Nonlinear Programming, Peressini, Sullivan, Uhl
    7. Practical Methods of Optimization, R. Fletcher
    8. Mathematical Programming Methods, G. Zoutendijk
    9. Nonlinear Programming, G.P. McCormick
    10. Mathematical programming : theory and algorithms, M. Minoux (QA402.5.M5613)
    11. Optimization by Vector Space Methods, D.G. Luenberger
    12. Convex Analysis, R.T. Rockafellar
    13. Theory of Extremal Problems, A.D. Ioffe and V.M. Tihomirov

  • Course Home Page, C&O 466/666, http://orion.uwaterloo.ca/~hwolkowi/henry/teaching/w05/666.w05/readme.shtml
  • Term Work:
      will consist of homework problems (see the attached schedule of assignments).
  • Final Exam:
      A 3-hour exam; date to be announced.
  • Marking Scheme:
    • Homework............ 50%
    • Final Exam.......... 50%

    COURSE TOPICS OUTLINE - C&O 466/666

    We will choose topics from the text. The complete list is included for now.


    Background Material; Chapters One through Eighteen.




    HOMEWORK LIST - C&O 466/666

    1. Homework #1 (Unconstrained Optimization)
      Due: Thursday January 13, 2005 (at start of class)
      Reading
      Problems
      • Go to the NEOS Solvers web page. Use the unconstrained optimization package CGplus to solve the extended Rosenbrock function (see e.g. page 248 in the text or the sample submission for CGplus) with n=48 variables, parameter values alpha=10 and alpha=250, and method=2 (Polak-Ribiere).
        Then solve the problem using the unconstrained optimization package NMTR.
        Hand in (or email) a copy of the outputs you get. In each case, point out the solution and the optimal value.
      • Pages 30-31, Problems 2.1, 2.3, 2.7, 2.9, 2.15.
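      If you would like to sanity-check the NEOS output locally, the extended Rosenbrock objective and a Polak-Ribière conjugate-gradient run can be sketched in a few lines. The Python/SciPy translation below is only an illustration (SciPy's 'CG' method is a Polak-Ribière variant, roughly analogous to CGplus with method=2); it is not part of the assignment, which requires the NEOS submissions themselves.

```python
import numpy as np
from scipy.optimize import minimize

def ext_rosenbrock(x, alpha=10.0):
    # Extended Rosenbrock: sum over pairs (x_{2i-1}, x_{2i})
    xo, xe = x[0::2], x[1::2]
    return np.sum(alpha * (xe - xo**2)**2 + (1.0 - xo)**2)

def ext_rosenbrock_grad(x, alpha=10.0):
    xo, xe = x[0::2], x[1::2]
    g = np.empty_like(x)
    g[0::2] = -4.0 * alpha * xo * (xe - xo**2) - 2.0 * (1.0 - xo)
    g[1::2] = 2.0 * alpha * (xe - xo**2)
    return g

# n = 48 variables and alpha = 10, as in the assignment; the starting
# point here is an arbitrary choice for illustration.
n = 48
x0 = np.full(n, -1.0)
res = minimize(ext_rosenbrock, x0, jac=ext_rosenbrock_grad, args=(10.0,),
               method='CG', options={'gtol': 1e-8, 'maxiter': 10000})
print(res.fun)          # the minimum value is 0, attained at x = (1, ..., 1)
```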

    2. Homework #2 (Unconstrained Optimization)
      Due: Thursday January 27, 2005 (at start of class)
      Reading Problems
      1. MATLAB
        1. local MATLAB tutorial (video based)
        2. MATLAB appears to be on all the Solaris systems in the undergrad environments, e.g. cpu01.student.math and magnus. You may have to go see Lori Suess in MFCF to set your password properly. The path is:
          /.software/local/.admin/bins/bin/matlab
          You should also have an account on the multiple-processor machine tuxor. The default should now be version matlab-6.5; if not, try the path software/matlab-6.5/bin/matlab or use "setenv PATH `/bin/showpath matlab-6.5 current`".
          Enter the command matlab to start up the matlab session. Then enter
          help optim
          You will see a list of the optimization functions that are available with the optimization toolkit.
        3. Now try the command
          optdemo
          This will bring up a menu. You can try the four choices. The first choice is a tutorial. This demonstrates several of the optimization functions.
        4. The second choice in the optdemo is the minimization of the banana function, or Rosenbrock's function. This is a classical example of a function with a narrow valley that exhibits very slow convergence for steepest-descent-type methods. Try out several methods to minimize this function.
          1. Submit contour and surface plots of the banana function for variable values between -15 and +15 (use the MATLAB commands: help plot; or: demo).
          2. How many function evaluations did the minimization take for: steepest descent; simplex search; Broyden-Fletcher-Goldfarb-Shanno; Davidon-Fletcher-Powell; Levenberg-Marquardt? (Please specify the line search you used.)
        Note: You can modify the matlab programs. You can see what the matlab routines are doing by looking at the matlab m-files. To do this change directory using: cd /software; cd matlab; cd distribution; cd toolbox; cd optim. In particular, there is an m-file called bandemo.m.
      2. Pages 62-63, Problems 3.2, 3.3, 3.6, 3.8
      3. COMMENTS
        • Here is a MATLAB code that shows a mesh plot of the extended Rosenbrock function and a plot of the function in the direction of steepest descent at the starting point (-1,-1).
        • Here is a MATLAB file with Cauchy's steepest descent for a quadratic function that illustrates both the worst and best case behaviour;
        • and here is another simple matlab code for Cauchy's steepest descent with a quadratic function. One can easily change the condition number and see the change in behaviour.
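        The behaviour those demo files illustrate is easy to reproduce. Below is a hypothetical Python version of Cauchy's steepest descent on a quadratic f(x) = (1/2) x^T A x, showing one-step convergence from an eigenvector start and the ((kappa-1)/(kappa+1))^2 worst-case rate; it only mimics the course's MATLAB scripts, which are the files linked above.

```python
import numpy as np

def cauchy_sd(A, x0, iters=30):
    # Steepest descent with exact line search on f(x) = 0.5 * x^T A x
    x = np.array(x0, dtype=float)
    vals = []
    for _ in range(iters):
        g = A @ x                            # gradient of the quadratic
        if g @ g < 1e-30:                    # converged (avoid 0/0 step length)
            break
        alpha = (g @ g) / (g @ (A @ g))      # exact minimizer along -g
        x = x - alpha * g
        vals.append(0.5 * x @ A @ x)
    return vals

A = np.diag([1.0, 100.0])            # condition number kappa = 100
best  = cauchy_sd(A, [1.0, 0.0])     # eigenvector start: exact minimum in one step
worst = cauchy_sd(A, [1.0, 0.01])    # worst-case start (1/l1, 1/l2): zig-zagging
print(best[0])                       # 0.0
print(worst[1] / worst[0])           # ((kappa-1)/(kappa+1))^2, about 0.9608
```

Raising the condition number kappa pushes the per-step contraction factor toward 1, which is exactly the slowdown the banana function exhibits in its narrow valley.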
    3. Homework #3 (BFGS and Newton; Line Search and Trust Region Techniques)
      Due: Tuesday February 15, 2005 (at start of class)
      Reading Problems
      1. MATLAB
        Solve Problem 3.9 on page 63 of the text. Do not program the BFGS algorithm yourself; instead, use the MATLAB function for BFGS, which you can find by running optdemo and using the information there, e.g. lines like
        [x,fval,exitflag,output] = fminunc({f,GRAD},x,OPTIONS);
        Output the values to verify that they indeed stay positive.
        Note: As in Homework #2, you can see what the MATLAB routines are doing by looking at the m-files. To do this, change directory using: cd /software; cd matlab; cd distribution; cd toolbox; cd optim. In particular, there is an m-file called bandemo.m as well as the file fminunc.m
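        The assignment only requires the toolbox routine, but to see why the quantities stay positive, here is a hypothetical Python BFGS loop on the 2-variable Rosenbrock function. It records the curvature products y_k^T s_k and the smallest eigenvalue of the inverse-Hessian approximation H_k; under a Wolfe line search both remain positive. This sketch uses SciPy's line_search, not the toolbox fminunc, and is illustrative only.

```python
import numpy as np
from scipy.optimize import line_search

def rosen(x):      # the classic banana function
    return 100.0*(x[1]-x[0]**2)**2 + (1.0-x[0])**2

def rosen_grad(x):
    return np.array([-400.0*x[0]*(x[1]-x[0]**2) - 2.0*(1.0-x[0]),
                     200.0*(x[1]-x[0]**2)])

def bfgs(x, iters=200, tol=1e-8):
    n = len(x)
    H = np.eye(n)                          # inverse-Hessian approximation H_k
    curvatures, min_eigs = [], []
    for _ in range(iters):
        g = rosen_grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                         # quasi-Newton search direction
        alpha = line_search(rosen, rosen_grad, x, p)[0]   # Wolfe conditions
        if alpha is None:
            break                          # line search failed; stop for simplicity
        s = alpha * p
        y = rosen_grad(x + s) - g
        curvatures.append(y @ s)           # > 0 is guaranteed by the Wolfe conditions
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        H = (I - rho*np.outer(s, y)) @ H @ (I - rho*np.outer(y, s)) + rho*np.outer(s, s)
        min_eigs.append(np.linalg.eigvalsh(H).min())
        x = x + s
    return x, curvatures, min_eigs

x, curv, eigs = bfgs(np.array([-1.2, 1.0]))
print(min(curv) > 0, min(eigs) > 0)        # True True: H_k stays positive definite
```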
      2. Page 97, Problem 4.1 ( see this MATLAB file for drawing the curve in the problem); Page 99, Problem 4.11.
      3. Derive the algorithm for the Newton step used in the Moré-Sorensen algorithm for the trust region subproblem, as presented in class on Feb. 3, 2005, i.e. in equation (4.27) on page 82.
          The following MATLAB files might be of interest:
        • gentrs.m, generates a random TRS of size n;
        • plotrs.m, finds the optimal lambda using a plot of ||p(lambda)||;
        • tplottrs.m, finds the optimal lambda using a plot of 1/||p(lambda)||;
        • iterN.m finds the optimal lambda using Newton's method (but no safeguarding is included!!!)
        • hardcase.m changes g for the hardcase and plots the curves
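        For reference, the Newton iteration of equation (4.27), the heart of the Moré-Sorensen method, fits in a dozen lines. The Python sketch below is an unsafeguarded illustration in the spirit of iterN.m above; it assumes B + lambda*I stays positive definite throughout (so it will not handle the hard case).

```python
import numpy as np

def trs_newton(B, g, Delta, lam=0.0, iters=20):
    # Newton's method on the secular equation phi(lam) = 1/Delta - 1/||p(lam)||,
    # where (B + lam*I) p(lam) = -g.  Each step factors B + lam*I = L L^T and uses
    #   lam <- lam + (||p|| / ||q||)^2 * (||p|| - Delta) / Delta,   q = L^{-1} p,
    # i.e. equation (4.27).  No safeguarding is included.
    n = len(g)
    for _ in range(iters):
        L = np.linalg.cholesky(B + lam * np.eye(n))
        p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        q = np.linalg.solve(L, p)
        pn = np.linalg.norm(p)
        lam += (pn / np.linalg.norm(q))**2 * (pn - Delta) / Delta
    return lam

# Small example: the unconstrained minimizer -B^{-1} g lies outside the region
# Delta = 0.5, so the solution is on the boundary and lam > 0.
B = np.diag([1.0, 2.0])
g = np.array([-1.0, -1.0])
Delta = 0.5
lam = trs_newton(B, g, Delta)
p = np.linalg.solve(B + lam * np.eye(2), -g)
print(lam, np.linalg.norm(p))   # ||p|| equals Delta at the optimal lam
```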

    4. Homework #4 (Constrained Nonlinear Programming and Convex Analysis)
      Due: Tuesday March 8, 2005 (at start of class)
      Reading Problems
      1. Constraint Qualifications
        1. Consider the general NLP with both equality and inequality constraints. Define the tangent cones and the linearizing cones at a feasible point x. Then show that the tangent cone is a subset of the linearizing cone.
        2. Consider the NLP:
            minimize f(x,y,z) = e^x + e^{-y} + z
            subject to:
              g1(x,y,z) = e^x - 1 <= 0
              g2(x,y,z) = e^{-y} - 1 <= 0
              g3(x,y,z) = (x-1)^2 + y^2 - 1 <= 0
              g4(x,y,z) = x^2 + y^2 + e^{-z} - 1 <= 0
            1. Verify that the NLP is a convex program.
            2. Use the KKT conditions to test for optimality at the origin (0,0,0)^T.
            3. Use the Rockafellar/Pshenichnyi conditions (polar of the tangent cone condition) to test for optimality at (0,0,0)^T.
            4. Is there a contradiction between the two tests for optimality? Find the linearizing cone at (0,0,0)^T. Then find a direction in the linearizing cone that is not a tangent direction, and use this to explain the discrepancy.
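            Once the gradients for problem 1.2 are written out, the KKT test at the origin reduces to asking whether -grad f(0) is a nonnegative combination of the active constraint gradients, which can be checked numerically with a nonnegative least-squares solve. The sketch below assumes the objective reads f = e^x + e^{-y} + z; it is an illustration only, not a substitute for the hand derivation the problem asks for.

```python
import numpy as np
from scipy.optimize import nnls

# At the origin all four constraints are active.  Gradients there:
# grad g1 = (1,0,0), grad g2 = (0,-1,0), grad g3 = (-2,0,0), grad g4 = (0,0,-1).
grad_f = np.array([1.0, -1.0, 1.0])       # (e^x, -e^{-y}, 1) evaluated at (0,0,0)
A = np.array([[ 1.0,  0.0, -2.0,  0.0],   # columns are grad g1 .. grad g4
              [ 0.0, -1.0,  0.0,  0.0],
              [ 0.0,  0.0,  0.0, -1.0]])

# KKT stationarity asks for mu >= 0 with grad_f + A @ mu = 0, i.e. the best
# nonnegative solution of A @ mu = -grad_f.
mu, residual = nnls(A, -grad_f)
print(mu, residual)   # residual > 0: no such multipliers exist, so KKT fails here
```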
      2. MATLAB (try to use matlab 7)
        1. Use the constrained optimization function fmincon in the optimization toolbox to solve the above convex program (problem 1.2 above). Can you get high accuracy in the solution?
      3. Problems from text:
        1. Page 359, Problem 12.10
        2. Page 360, Problem 12.17

    5. Homework #5 (Fundamentals of Linear and Nonlinear Programming)
      Due: March 17, 2005
      1. Problems from text:
        1. Chapter 14: Problems 14.1,14.6,14.9,
        2. Chapter 15: Problems 15.1,15.7
        3. Chapter 16: Problems 16.1, 16.2
      2. Use NEOS to solve the mathematical program in problem 16.9, i.e. use one of the Nonlinearly Constrained Optimization Solvers. (Use each of the three suggested starting points.)

    6. Homework #6 (Sample Problems for Final Exam)
      Due: Do NOT hand in.
        Chapter 15: Problem 15.1
        Chapter 17: Problems 17.2, 17.3
        Chapter 18: Problem 18.1; also, describe the Maratos effect and one possible "fix" for SQP methods.



  • Last Modified:  Tuesday 29 March 2005