Continuous Optimization - 769 - Winter 2002

This course provides a rigorous, up-to-date treatment of topics in Continuous Optimization (Nonlinear Programming). This includes a hands-on approach with exposure to existing software packages.


Conduct of Course

  • Instructor:
  • Office Hours:
  • Lectures:
  • Text:
  • Additional References:
    1. Nonlinear Programming, D.P. Bertsekas, Athena Scientific, 1999.
    2. Practical Optimization, P.E. Gill, W. Murray, M.H. Wright.
    3. Numerical Methods for Unconstrained Optimization and Nonlinear Equations, J.E. Dennis and R.B. Schnabel.
    4. Nonlinear Programming: Theory and Algorithms, M.S. Bazaraa, H.D. Sherali, C.M. Shetty.
    5. Linear and Nonlinear Programming, D.G. Luenberger, second edition.
    6. Nonlinear Programming, A.L. Peressini, F.E. Sullivan, J.J. Uhl.
    7. Practical Methods of Optimization, R. Fletcher.
    8. Mathematical Programming Methods, G. Zoutendijk.
    9. Nonlinear Programming, G.P. McCormick.
    10. Mathematical Programming: Theory and Algorithms, M. Minoux (QA402.5.M5613).
    11. Optimization by Vector Space Methods, D.G. Luenberger.
    12. Convex Analysis, R.T. Rockafellar.
    13. Theory of Extremal Problems, A.D. Ioffe and V.M. Tihomirov.

  • Course Home Page, 769 http://orion.uwaterloo.ca/~hwolkowi/henry/teaching/w02/fields.w02/readme.html
  • Term Work:
  • Final Exam:
  • Marking Scheme:

    COURSE TOPICS OUTLINE - 769

    We will choose topics from the text. You can see a complete list if you like.

    HOMEWORK LIST - 769

    1. Homework (Unconstrained Optimization)
      Due: Monday, January 21, 2002
      Reading
      Problems
      • Go to the NEOS Solvers web page. Use the unconstrained optimization package CGplus to solve the extended Rosenbrock function (see e.g. page 248 in the text or the sample submission for CGplus; a MATLAB sketch of one common form appears at the end of this homework) with n=26 variables and method=2 (Polak-Ribiere).
        Then solve the problem using the unconstrained optimization package NMTR.
        Hand in (or email) a copy of the outputs you get. In each case, point out the solution, the optimal value, and any parameter values you chose.
      • Pages 30-31, Problems 2.1, 2.7, 2.9, 2.10.
      Solutions
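      For reference, a minimal MATLAB sketch of one common form of the extended Rosenbrock function (the pairwise-coupled form, n even) and its gradient. This is an illustration only; the NEOS sample submission may use a slightly different scaling.

        function [f,g] = extrosen(x)
        % Extended Rosenbrock: sum over i of 100*(x_{2i}-x_{2i-1}^2)^2 + (1-x_{2i-1})^2
        n = length(x);
        xo = x(1:2:n-1);            % odd-indexed variables x_{2i-1}
        xe = x(2:2:n);              % even-indexed variables x_{2i}
        f = sum(100*(xe - xo.^2).^2 + (1 - xo).^2);
        g = zeros(n,1);
        g(1:2:n-1) = -400*xo.*(xe - xo.^2) - 2*(1 - xo);   % odd components
        g(2:2:n)   =  200*(xe - xo.^2);                    % even components

      For this form the minimizer is x = ones(n,1) with optimal value 0, a useful check on the NEOS output.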
    2. Homework (Unconstrained Optimization)
      Due: Monday, January 28, 2002.
      Reading
      Problems
      1. MATLAB
        1. MATLAB appears to be on all the Fields computers. Enter the command matlab to start up a MATLAB session. Then enter
          help optim
          You will see a list of the optimization functions that are available with the Optimization Toolbox.
        2. Now try the command
          optdemo
          This will bring up a menu. You can try the four choices. The first choice is a tutorial. This demonstrates several of the optimization functions.
        3. The second choice in the optdemo is the minimization of the banana function, or Rosenbrock's function. This is a classical example of a function with a narrow valley that exhibits "very" slow convergence for steepest descent type methods. Try out several methods to minimize this function.
          1. Submit a contour plot of the banana function for variable values between -4 and +4. (A plotting sketch appears at the end of this homework.)
          2. How many function evaluations did the minimization take for: steepest descent; simplex search; Broyden-Fletcher-Goldfarb-Shanno; Davidon-Fletcher-Powell; Levenberg-Marquardt? (Please specify the line search you used.)
        Note: You can modify the matlab programs. You can see what the matlab routines are doing by looking at the matlab m-files. To do this change directory using: cd /software; cd matlab; cd distribution; cd toolbox; cd optim. In particular, there is an m-file called bandemo.m.
      2. Pages 62-63, Problems 3.2, 3.3, 3.5.
      Comments on Solutions to Homework 2, with some nice pictures/plots thanks to John Gatsis: banana function contour; banana function mesh; banana function finer contour; banana function wider mesh; and a MATLAB function for the plots.
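      A minimal sketch for the contour plot requested in problem 1.3.1, assuming the standard banana function f(x1,x2) = 100*(x2 - x1^2)^2 + (1 - x1)^2; log-spaced contour levels make the narrow valley visible.

        [X,Y] = meshgrid(-4:0.05:4, -4:0.05:4);       % grid over [-4,4]^2
        Z = 100*(Y - X.^2).^2 + (1 - X).^2;           % banana function values
        contour(X, Y, Z, logspace(-1,3.5,20));        % log-spaced level curves
        xlabel('x_1'); ylabel('x_2');
        title('Rosenbrock banana function');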
    3. Homework (BFGS and TRS)
      Due: Monday, Feb. 4, 2002.
      Reading
      • Chapter 4 in the text (pages 64-97)
      Problems
      1. MATLAB
        Solve Problem 3.9 on page 63 of the text. Do not program the BFGS algorithm yourself; instead, use the MATLAB function for BFGS. You can find it by running optdemo and using the information there, e.g. lines like
        [x,fval,exitflag,output] = fminunc({f,GRAD},x,OPTIONS);
        Output the values to verify that they indeed stay positive. (A hedged calling sketch appears at the end of this homework.)


      2. Page 97, Problem 4.1
      Comments on Solutions to Homework 3
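      A minimal sketch, assuming the option names of the Optimization Toolbox of this era, of driving the medium-scale BFGS line-search method in fminunc; the objective file name banana and the starting point are illustrative assumptions (banana.m is assumed to return [f,g], as in the sketch at the end of Homework 4).

        opts = optimset('LargeScale','off', ...   % medium-scale line-search method
                        'HessUpdate','bfgs', ...  % BFGS quasi-Newton update
                        'GradObj','on', ...       % objective also returns its gradient
                        'Display','iter');
        x0 = [-1.9; 2];                           % illustrative starting point
        [x,fval,exitflag,output] = fminunc('banana', x0, opts);
        output.iterations                         % number of iterations taken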
    4. Homework (Conjugate Gradient Methods, Practical Newton Methods)
      Due: Monday, Feb. 11, 2002.
      Reading
      • Chapters 5, 6 in the text (pages 100-162)
      Problems
      1. MATLAB
        Use the conjugate gradient algorithm in the MATLAB program cgtrust.m to find the minimum of the banana function from the initial points: (-2,-3), (0,0), (-1.9,2), (-1.9,20).
        (BEWARE: In cgtrust.m, the initial radius of the trust region is set as the norm of the initial guess (x1i,x2i). Since this is zero for the (0,0) case, a divide-by-zero message appears. Quick-fix correction for line 56:
          old version: trrad=norm(x0);
          new version: trrad=norm(x0)+0.0001; % i.e. add a small value
        Thanks to Anton.)

        Provide the number of major iterations and the final point to four decimals. Explain the behaviour.
        You will need the files dirdero.m and README. (You can use the default parameter values to start. A sketch of the objective-file convention appears at the end of this homework.)
      2. Page 133, Problems 5.2, 5.4.
      Comments on Solutions to Homework 4.
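      A minimal sketch of the banana function written in the common convention where the objective m-file returns both the value and the gradient, which cgtrust.m appears to expect; check the README for the exact calling sequence and parameter vector of cgtrust.m.

        function [fout,gout] = banana(x)
        % Banana (Rosenbrock) function and its gradient: [f,g] = banana(x)
        fout = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
        gout = [ -400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1))
                  200*(x(2) - x(1)^2) ];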
    5. Homework (NLLS, Nonlinear Equations, Nonlinear Programming)
      Due: Monday, Feb. 18, 2002.
      Reading
      • Chapters 10, 11, and Sections 12.1, 12.2 in the text (pages 250-329)
      Problems
      1. MATLAB (try to use MATLAB 6)
        1. Test out the data fitting demo. This is one of the command-line demos in the Optimization Toolbox. The example demonstrates fitting a nonlinear function to a set of data. Provide the plots of the data points; then provide the plots, number of iterations, function evaluations, and norm of the residual of the fit when using the Gauss-Newton method (with mixed polynomial interpolation), and also when using sequential quadratic programming. Include the algebraic form of the nonlinear function that is used. (An illustrative fitting sketch appears at the end of this homework.)
      2. Problems from text:
        1. Pages 274-275, Problems 10.1, 10.6.
        2. Page 311, Problems 11.5, 11.7.
        3. Page 358, Problems 12.1, 12.2.
      Files with comments on Solutions to Homework 5.
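      A minimal sketch of a nonlinear least-squares fit with lsqcurvefit, in the style of the demo; the model, data, and starting point below are made-up illustrations, not the demo's own (the demo prints its model when it runs).

        t = (0:0.3:6)';                                  % illustrative abscissae
        y = 2*exp(-1.5*t) + 0.5 + 0.05*randn(size(t));   % illustrative noisy data
        model = inline('b(1)*exp(-b(2)*t) + b(3)','b','t');  % assumed model form
        b0 = [1; 1; 0];                                  % starting guess
        [b,resnorm,res,flag,out] = lsqcurvefit(model, b0, t, y);
        b, resnorm, out.iterations, out.funcCount        % quantities to report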
    6. Homework (Constrained Nonlinear Programming and Convex Analysis)
      Due: Monday, Feb. 25, 2002.
      Reading
      Problems
      1. Constraint Qualifications
        1. Consider the general NLP with both equality and inequality constraints. Define the tangent cone and the linearizing cone at a feasible point x. Then show that the tangent cone is a subset of the linearizing cone.
        2. Consider the NLP:
            minimize f(x,y,z) = e^x + e^{-y} + z
            subject to:
              g1(x,y,z) = e^x - 1 <= 0
              g2(x,y,z) = e^{-y} - 1 <= 0
              g3(x,y,z) = (x-1)^2 + y^2 - 1 <= 0
              g4(x,y,z) = x^2 + y^2 + e^{-z} - 1 <= 0
          1. Verify that the NLP is a convex program.
          2. Use the KKT conditions to test for optimality at the origin (0,0,0)^T.
          3. Use the tangent cone conditions (often called the Pshenichnyi conditions in the literature) to test for optimality at (0,0,0)^T.
          4. Is there a contradiction between the two tests for optimality? Find the linearizing cone at (0,0,0)^T. Then find a direction in the linearizing cone that is not a tangent direction, and use this to explain the discrepancy.
      2. MATLAB (try to use MATLAB 6)
        1. Use the constrained optimization function fmincon in the Optimization Toolbox to solve the above convex program (problem 1.2 above). Can you get high accuracy in the solution? (A calling sketch appears at the end of this homework.)
      3. Problems from text:
        1. Page 359, Problem 12.10
        2. Page 360, Problem 12.17
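      A calling sketch for problem 2.1, assuming the form of the objective and constraints as written above; save it as hw6demo.m and run hw6demo. The starting point is an arbitrary assumption.

        function hw6demo
        % Solve the convex program of problem 1.2 with fmincon
        x0 = [0.5; 0.5; 0.5];                         % arbitrary starting point
        opts = optimset('Display','iter');
        [x,fval] = fmincon(@obj, x0, [],[],[],[],[],[], @cons, opts)

        function f = obj(v)
        f = exp(v(1)) + exp(-v(2)) + v(3);            % objective e^x + e^{-y} + z

        function [c,ceq] = cons(v)
        c = [ exp(v(1)) - 1                           % g1
              exp(-v(2)) - 1                          % g2
              (v(1)-1)^2 + v(2)^2 - 1                 % g3
              v(1)^2 + v(2)^2 + exp(-v(3)) - 1 ];     % g4
        ceq = [];                                     % no equality constraints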
    7. Homework (Constrained Nonlinear Programming, Convex Analysis, and Linear Programming)
      Due: Monday, March 4, 2002.
      Reading
      • Chapter 13 and Section 14.1 in the text (pages 363-404)
      Problems
      1. Page 360, Problems 12.18, 12.20, 12.21.
      2. Solve the above three problems using MATLAB: (i) the Levenberg-Marquardt algorithm in lsqnonlin (a calling sketch appears at the end of this homework); (ii) the trust region algorithm in the directory distribution/toolbox/optim/private (file trust.m); and (iii) the trust region method in fmincon. Discuss the differences, if any.
      3. Page 393, Problem 13.2.
        (Note that the standard form of a linear program is given on page 364.)
        Page 393, Problem 13.3. Then find a dual linear program for the LP in 13.2. (Use the generalized Farkas' Lemma or the min-max argument if you like.)
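      A minimal sketch for item 2(i), selecting the Levenberg-Marquardt algorithm in lsqnonlin with the option names of this toolbox version; resfun is an assumed m-file returning the residual vector of the problem being solved, and the starting point is a placeholder.

        opts = optimset('LevenbergMarquardt','on', ...  % LM rather than Gauss-Newton
                        'LargeScale','off', 'Display','iter');
        x0 = zeros(2,1);                                % placeholder starting point
        [x,resnorm,res,flag,out] = lsqnonlin('resfun', x0, [], [], opts);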
    8. Homework (Linear Programming and Interior-Point Methods)
      Due: Monday, March 18, 2002.
      Reading
      • Chapter 14 (complete) in the text (pages 405-419)
      Problems
      1. Pages 417-418, Problems 14.1, 14.7.
      2. Page 419, Problem 14.12. Decide on reasonable choices for m,n. (If you choose large values, then send the output to me by email.) Program the algorithm in MATLAB as stated in the question. Compare your answers with:
        • First: use the MATLAB program linprog. (Ensure you are using the interior-point default; a calling sketch appears at the end of this homework.)
        • Second: use NEOS - with the PCx LP package.
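      A minimal sketch of the linprog comparison on a random standard-form LP (min c'x subject to Ax = b, x >= 0); the sizes and the data generation are illustrative choices, not those of Problem 14.12. Constructing c from a dual-feasible pair (y,s) guarantees a bounded optimum.

        m = 20; n = 50;                        % illustrative sizes
        A = randn(m,n);
        xfeas = rand(n,1);  b = A*xfeas;       % b chosen so the LP is feasible
        y = randn(m,1);  s = rand(n,1) + 0.1;
        c = A'*y + s;                          % dual feasible => bounded optimum
        [x,fval,flag] = linprog(c, [], [], A, b, zeros(n,1), []);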
    9. Homework (Fundamentals of Algorithms for NLP and Quadratic Programming)
      Due: Monday, March 25, 2002.
      Reading
      • Chapters 15, 16 in the text (pages 420-467 and 481-485)
      Problems
      1. Problems from text:
        1. Page 438, Problems 15.1, 15.3
        2. Pages 486-487, Problems 16.1, 16.2, 16.8 (In Problem 16.1, please replace maximization by minimization, since the objective function in this quadratic program is strictly convex.)
      2. Use the MATLAB command quadprog to solve Problem 16.9 in the text (a calling sketch appears at the end of this homework). Is there different behaviour for the different starting points?
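      A calling sketch for problem 2; H, f, A, b below are placeholder data for a small strictly convex QP, so substitute the data of Problem 16.9 and its starting points.

        H = [2 0; 0 2];  f = [-2; -5];          % placeholder strictly convex QP
        A = [1 2; -1 2; -1 -2];  b = [6; 2; 2]; % placeholder inequality constraints
        x0 = [0; 0];                            % try each starting point here
        [x,fval,flag,out] = quadprog(H, f, A, b, [],[],[],[], x0);
        out.algorithm                           % which method quadprog actually used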
    10. Homework (Penalty and Barrier Methods; Sequential Quadratic Programming)
      Due: Monday, April 15, 2002.
      Reading
      • Chapters 17, 18 in the text (pages 491-575)
      Problems
      1. Problems from text:
        1. Pages 526-527, Problems 17.1, 17.2, 17.3
        2. Page 574, Problem 18.2
          (You can use MATLAB, e.g. you can use the MATLAB QP program quadprog to solve the QP subproblems; a one-iteration sketch appears at the end of this homework.)
        3. Bonus Problem: Solve Problem 18.2 above using the SQQP approach, i.e. replace the QP subproblem with the QQP subproblem (the quadratic-quadratic subproblem) and solve its SDP relaxation to find the search directions. MATLAB has an LMI toolbox that will solve the SDP subproblems. Otherwise, use NEOS to perform a few iterations.
          Compare the performance of the iterations. What can you conclude? Any explanations?
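      A minimal sketch of the QP subproblem inside one SQP iteration, using quadprog as suggested. The tiny test problem here (min x1^2 + x2^2 subject to x1 + x2 >= 1) is an illustration, not Problem 18.2 itself, and a practical code would add a merit function for globalization and a quasi-Newton update of B.

        x = [2; 1];                            % current iterate
        for k = 1:5
            gradf = 2*x;                       % gradient of the objective
            B = 2*eye(2);                      % exact Hessian here; BFGS in general
            cval = 1 - x(1) - x(2);            % constraint value, c(x) <= 0 form
            Jac  = [-1 -1];                    % Jacobian of the constraint
            % QP subproblem: min 0.5*d'*B*d + gradf'*d  s.t.  Jac*d <= -cval
            d = quadprog(B, gradf, Jac, -cval);
            x = x + d;                         % full step (no line search)
        end
        x                                      % converges to (0.5, 0.5)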



                Mail to: hwolkowicz@uwaterloo.ca
                (C) Copyright Henry Wolkowicz, 2002.