CO466/666
Winter 2012

Continuous Optimization

Instructor: Henry Wolkowicz (MC6065, x35589)

  • Time: 12:30-1:20PM, MWF
  • Location: MC 1085
  • Office Hours: Monday 3:30-4:30PM, Thursday 3:30-4:30PM
  • Course Outline


  • Text: Numerical Optimization, 2nd edition (with TOC/outline), by Jorge Nocedal and Stephen Wright, Springer-Verlag, 2006. (Typos/comments on the text!)
  • Further References
  • Marking Scheme: HW 50%; Final 50%
  • HOMEWORK:
    1. Homework #1 (Unconstrained Optimization; Solutions)
      Due: Friday, Jan. 20, 2012 (before start of class).
      Reading
      • Chapters 1,2,3 in the text (pages 1-60)
      Problems
      1. Go to the NEOS Solvers web page. Use the unconstrained optimization package NMTR to solve the extended Rosenbrock function (see page 191 in the text, edition 2) with n=48 variables and parameter values alpha=20 and alpha=350.
        It seems that NMTR no longer has a sample at NEOS. Please use the MATLAB solver instead if NMTR does not work for you; you can use the MATLAB function/command fminunc (a sketch follows this problem list).
        Hand in (or email) a copy of the outputs you get. In each case, point out the solution and the optimal value.
      2. Solve the problems at this link
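
      As a fallback for NMTR in Problem 1, here is a minimal MATLAB sketch using fminunc; the pairwise form of the objective and the starting point below are assumptions, so check page 191 of the text for the exact definition of the extended Rosenbrock function.

        % Minimize one common form of the extended Rosenbrock function:
        %   f(x) = sum_i [ alpha*(x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2 ]
        n = 48;
        alpha = 20;                              % repeat the run with alpha = 350
        f = @(x) sum(alpha*(x(2:2:end) - x(1:2:end-1).^2).^2 ...
                     + (1 - x(1:2:end-1)).^2);
        x0 = -ones(n,1);                         % hypothetical starting point
        opts = optimset('Display','iter','TolFun',1e-10,'TolX',1e-10);
        [xstar, fstar] = fminunc(f, x0, opts);   % expect xstar near ones(n,1), fstar near 0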
    2. Homework #2 (Unconstrained Optimization)
      Due: Monday, Jan. 30, 2012 (by start of class)
      Reading
      • Chapter 4 in the text
      Problems
      1. MATLAB
        1. Enter
          help optim
          You will see a list of the optimization functions that are available with the Optimization Toolbox.
        2. Now try the command
          demo toolbox optim
          Then choose "Minimization of the Banana Function", i.e., run
          bandem
          to test out several unconstrained algorithms on the banana function (Rosenbrock's function). This is a classical example of a function with a narrow valley that exhibits "very" slow convergence for steepest-descent-type methods (a steepest-descent sketch follows this problem list).
          1. How many function evaluations did the minimization take for: steepest descent; simplex search; Broyden-Fletcher-Goldfarb-Shanno; Davidon-Fletcher-Powell; Levenberg-Marquardt?
      2. Link to further problems. (Hint: note that another equivalent characterization of strongly convex functions involves the first-order characterization
        f(y) ≥ f(x) + ∇f(x)^T (y-x) + (m/2)||y-x||^2,
        verified for the quadratic case after this problem list. Also, there was a typo in the question: the function is a strongly convex quadratic function.) Link to solutions (pdf file).
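
      For the hint above, here is a short check of the first-order characterization in the (corrected) quadratic case; the specific form f(x) = (1/2) x^T Q x + c^T x, with the smallest eigenvalue of Q at least m > 0, is an assumption used for illustration. Since ∇f(x) = Qx + c, direct expansion gives
        f(y) - f(x) - ∇f(x)^T (y-x) = (1/2)(y-x)^T Q (y-x) ≥ (m/2)||y-x||^2,
      which is exactly the inequality in the hint.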
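      For the banana-function item, here is a minimal MATLAB sketch (an illustration, not the bandem demo itself) of steepest descent with Armijo backtracking on Rosenbrock's function; the starting point, tolerances, and backtracking constants are assumptions.

        % Steepest descent with backtracking on Rosenbrock's function;
        % illustrates the slow zig-zagging progress along the narrow valley.
        f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
        g = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
                   200*(x(2) - x(1)^2)];
        x = [-1.9; 2];                            % a common starting point (an assumption)
        nf = 0;                                   % rough count of function evaluations
        for k = 1:20000
            d = -g(x);                            % steepest-descent direction
            if norm(d) < 1e-6, break; end
            t = 1; fx = f(x); nf = nf + 1;
            while f(x + t*d) > fx + 1e-4*t*(-(d'*d))   % Armijo sufficient decrease
                t = t/2; nf = nf + 1;
            end
            nf = nf + 1;                          % count the accepted trial as well
            x = x + t*d;
        end
        fprintf('iterations %d, f = %g, about %d function evaluations\n', k, f(x), nf)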
    3. Homework #3 (CG, NLLS)
      Due: Wed, February 8, 2012 (at start of class)
      Reading
      • Chapters 4 (TR, selected parts), 5 (CG), 7 (LS, selected parts), and 10 (GN)
      Problems
      1. Consider the MATLAB file cgtest.m. This file generates a linear system Ax=b where A is positive definite with only 3 distinct eigenvalues. It is then solved using a variant of the CG strategy (see the sketch and demo after this problem list).
        1. Run the file to generate a random problem and verify that you have the solution. Save the data in a mat file and send it in by email.
        2. Bonus 1: Can you prove that you always get convergence to the solution of Ax=b? (Hint: Try it on a two-dimensional example; draw the eigenvectors. Also, consider the convergence of steepest descent when eigenvectors are involved.) And in what order should the eigenvalues be used so that a best approximation is obtained at each iteration?
      2. Do Problem 5.1 in the text to implement Algorithm 5.2.
        Bonus 2: Then try the same with the preconditioned CG Algorithm 5.3. What would a reasonable preconditioner be? The diagonal elements? Or the diagonal together with the super- and subdiagonals?
      3. Link to solutions (pdf file), and links to sample CG and PCG solution files.
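
      For Problem 2, a minimal sketch of unpreconditioned CG in the spirit of Algorithm 5.2; the stopping test and sign conventions are assumptions, not the text's exact statement.

        % Conjugate gradient method for Ax = b, with A symmetric positive definite.
        function x = cg_sketch(A, b, x, tol)
          r = A*x - b;                       % residual (gradient of the model quadratic)
          p = -r;                            % first search direction
          while norm(r) > tol
              Ap    = A*p;
              alpha = (r'*r)/(p'*Ap);        % exact minimizer along p
              x     = x + alpha*p;
              rnew  = r + alpha*Ap;          % updated residual
              beta  = (rnew'*rnew)/(r'*r);   % conjugacy coefficient
              p     = -rnew + beta*p;        % next search direction
              r     = rnew;
          end
        end

      For Bonus 2, a natural first preconditioner for Algorithm 5.3 is the Jacobi choice M = diag(diag(A)); the diagonal plus the super- and subdiagonals is the refinement the question suggests.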
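      And a hypothetical demo of the property behind Problem 1 (assuming the sketch above is saved as cg_sketch.m): when A has only 3 distinct eigenvalues, CG terminates in at most 3 iterations in exact arithmetic. The dimensions and eigenvalues below are made up for illustration.

        n = 48;
        [Q, ~] = qr(randn(n));                      % random orthogonal basis
        d = [ones(16,1); 10*ones(16,1); 100*ones(16,1)];
        A = Q*diag(d)*Q';                           % SPD with 3 distinct eigenvalues
        b = randn(n,1);
        x = cg_sketch(A, b, zeros(n,1), 1e-10);
        disp(norm(A*x - b))                         % small after at most 3 CG steps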

    4. Homework #4 (Nonlinear Programming, Optimality Conditions)
      Due: Friday Feb. 17, 2012 (at start of class)
      Reading
      • Chapter 12 in the text
      Problems
      1. MATLAB
        1. Test out the data fitting demo (medium-scale nonlinear data fitting, fitdemo).
          • This is one of the demos in the Optimization Toolbox. The example demonstrates fitting a nonlinear function to a set of data. Provide the plots of the data points; then provide the plots, number of iterations, function evaluations, and norm of the residual of the fit when using the Gauss-Newton method (with mixed polynomial interpolation) and also when using unconstrained optimization. (See the Least Squares (Model Fitting) Algorithms documentation in MATLAB. A hedged fitting sketch follows this problem list.)
            Include the algebraic form of the nonlinear function that is used.
      2. Link to problems (pdf file)
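
      For Problem 1, a hedged sketch of the kind of fit fitdemo performs; the two-term exponential model, sample times, and synthetic data below are assumptions for illustration, not the demo's actual code.

        % Fit y ~ p(1)*exp(-p(3)*t) + p(2)*exp(-p(4)*t) by nonlinear least squares.
        model = @(p, t) p(1)*exp(-p(3)*t) + p(2)*exp(-p(4)*t);
        t  = (0:0.1:2)';                                      % hypothetical sample times
        y  = model([1.5; 3; 10; 2], t) + 0.05*randn(size(t)); % synthetic noisy data
        p0 = [1; 2; 5; 1];                                    % starting guess
        p  = lsqcurvefit(model, p0, t, y);                    % Optimization Toolbox solver
        plot(t, y, 'o', t, model(p, t), '-')                  % data vs. fitted curve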

    5. Homework #5 (pdf file). Due: Friday, Mar. 9, 2012 (at start of class)




  • Last Modified: Wednesday 29 February 2012