VIRTUAL HANDOUT for

Nonlinear Programming - C&O 367
Winter 2005

This course provides an introductory treatment of topics in Nonlinear Programming, with a hands-on approach and exposure to existing software packages.

Conduct of Course

  • Instructor:
  • Teaching Assistant:
  • Office Hours:
  • Lectures:
  • Text:

  • Home Page, C&O 367, http://orion.math.uwaterloo.ca/~hwolkowi/henry/teaching/w05/367.w05/readme.html
  • Term Work:
  • Midterm Exam:
  • Final Exam:
  • Marking Scheme:

    COURSE TOPICS OUTLINE - C&O 367
    (with HOMEWORK LIST)

    Lectures start Tuesday Jan 4, 2005.

    Chapter Outlines


    START HOMEWORK

    (WARNING/REMINDER: Copying assignments is contrary to
    University policy. You must work on your assignments on your own.)

    1. Homework #1 (Revision); due Tuesday January 11, 2005 (homework pdf file)
    2. Convex Sets and Functions
      • HOMEWORK:
        • due: Thursday, January 20, 2005 (at start of class)
        • Reading: Text Chapter 1 - Convex Analysis
        • References (FYI):
          • Eigenvalues are an important tool in optimization. Eigenvalue Show is a MATLAB file, eigshow.m, that provides an eigenvalue demonstration that is both entertaining and informative. Other such MATLAB files are available, and a local online tutorial on MATLAB is also linked here. (A small MATLAB sketch showing how eigenvalues relate to convexity appears at the end of this item.)
          • Introduction to Convex Optimization, course notes by Stephen Boyd, including Lagrangian duality, LP duality, interior-point methods, and semidefinite programming; see also Part 3 in the notes by Lieven Vandenberghe
          • Open Problem from Mathematics Web at Elsevier:
              Suppose that f maps the interval [0,1] to R^n, is a continuous function, and maps every subinterval to a convex set.
              Is f([0,1]) a line segment?
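        FYI, here is a minimal MATLAB sketch (my own illustration, not part of the course materials) of how eigenvalues enter convexity: a quadratic f(x) = (1/2)x'Qx + b'x is convex exactly when the symmetric matrix Q is positive semidefinite, which eig can check. The matrix Q below is made-up example data.

          % Convexity check for a quadratic f(x) = 0.5*x'*Q*x + b'*x:
          % f is convex iff the symmetric Hessian Q is positive semidefinite,
          % i.e. all eigenvalues of Q are nonnegative.
          Q = [4 1; 1 3];            % example Hessian (made-up data)
          lambda = eig(Q)            % eigenvalues of Q
          if all(lambda >= 0)
              disp('Q is positive semidefinite, so f is convex')
          else
              disp('Q has a negative eigenvalue, so f is not convex')
          end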
    3. Optimality Conditions
      • HOMEWORK:
        • due: Thursday, Feb. 3, 2005 (at start of class)
            Exercises: 2.2, 2.3, 2.5, 2.6, 2.7, 2.8, 2.9
        • Reading: Text Chapter 2, Optimality Conditions
        • FYI: MATLAB
          1. local MATLAB tutorial (video based)
          2. Matlab appears to be on all the Solaris systems in the undergrad environments, e.g. cpu01.student.math and magnus. You may have to see Lori Suess in MFCF to have your password set properly. The path is
            /.software/local/.admin/bins/bin/matlab
            You should also have an account on tuxor (the multiple-processor machine). The default should now be version matlab-6.5; if not, try the path software/matlab-6.5/bin/matlab or use "setenv PATH `/bin/showpath matlab-6.5 current`".
            Enter the command matlab to start up the matlab session. Then enter
            help optim
            You will see a list of the optimization functions that are available with the Optimization Toolbox.
          3. Now try the command
            optdemo
            This brings up a menu with four choices; the first choice is a tutorial that demonstrates several of the optimization functions.
          4. The second choice in optdemo is the minimization of the banana function (Rosenbrock's function). This is a classical example of a function with a narrow curved valley, on which steepest-descent-type methods converge very slowly. Try out several methods to minimize this function. (A small steepest-descent sketch illustrating this slow convergence follows this list.)
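        To see the slow convergence for yourself outside of optdemo, here is a minimal MATLAB sketch (my own, not from the text or the toolbox demos) of steepest descent with a simple backtracking (Armijo) line search on Rosenbrock's function, started at the classical point (-1.2, 1); even after a few thousand iterations it is usually still creeping along the valley toward the minimizer (1, 1).

          % Steepest descent with backtracking on Rosenbrock's banana function.
          f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
          g = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));   % gradient of f
                     200*(x(2) - x(1)^2)];
          x = [-1.2; 1];                       % classical starting point
          for k = 1:5000
              d = -g(x);                       % steepest descent direction
              if norm(d) < 1e-6, break, end
              t = 1;                           % backtracking (Armijo) line search
              while f(x + t*d) > f(x) - 1e-4*t*(d'*d)
                  t = t/2;
              end
              x = x + t*d;
          end
          fprintf('iterations: %d,  x = (%.4f, %.4f)\n', k, x(1), x(2))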
    4. Optimality Conditions cont...
      • HOMEWORK:
        • due: Thursday, Feb. 17, 2005 (Because of the midterm, this assignment may be handed in late, up to noon on Friday, Feb. 18, 2005.)
            Exercises: 2.11, 2.12, 2.15, (and 2.13 as a bonus question!)
        • The midterm covers chapters 1 and 2 in the notes. The emphasis will be on the material covered in the assignments and in the lectures. (Note that several results were covered differently in the lectures, e.g. the Pshenichnyi condition; a different hyperplane separation theorem; a weakest constraint qualification.)
    5. Duality
      • HOMEWORK:
        • due: Tuesday, Mar. 8, 2005
            Exercises: 3.1, 3.2, 3.3, 3.4
        • Reading: Text Chapter 3, Duality, Sections 3.1 to 3.4.
    6. Duality
      • HOMEWORK:
        • due: Thursday, Mar. 17, 2005
          • Use three iterations of Zoutendijk's feasible direction method to approximately solve the problem:
            min  -x1 - x2
            s.t. x1^2 + x2^2 - 1 ≤ 0
                 -x1 + x2^2 ≤ 0
            Use 0 as the starting point. Note that the search direction subproblem should use |d_i| ≤ 1 for all i, i.e. a bound on the infinity-norm (not on the 1-norm). (A MATLAB sketch of one direction-finding subproblem appears at the end of this item.)
          • Exercises: 4.1, 4.2, 4.4, 4.6, 4.7, 4.12, 4.15, 4.19
        • Reading: Text Chapter 4, Algorithms for Unconstrained Optimization, Sections 4.1 to 4.5.
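        FYI, the direction-finding subproblem of Zoutendijk's method is a linear program, so each iteration can be set up with linprog from the Optimization Toolbox. Below is a minimal sketch (my own, not part of the assignment solution) of the very first subproblem at the starting point x = (0,0), where only the constraint -x1 + x2^2 ≤ 0 is active.

          % Direction-finding LP of Zoutendijk's method at x = (0,0):
          %   minimize z  subject to  grad_f'*d <= z,  grad_g'*d <= z (active g only),
          %   and the infinity-norm bound  -1 <= d_i <= 1.
          gradf = [-1; -1];              % gradient of the objective -x1 - x2
          gradg = [-1;  0];              % gradient of -x1 + x2^2 at (0,0), which is active there
          c  = [0; 0; 1];                % variables [d1; d2; z]; minimize z
          A  = [gradf', -1;              % grad_f'*d - z <= 0
                gradg', -1];             % grad_g'*d - z <= 0
          b  = [0; 0];
          lb = [-1; -1; -Inf];           % |d_i| <= 1, z free
          ub = [ 1;  1;  Inf];
          sol = linprog(c, A, b, [], [], lb, ub);
          d = sol(1:2), z = sol(3)       % z < 0 means d is an improving feasible direction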
    7. Unconstrained and Constrained Optimization
      • HOMEWORK:
        • due: Thursday, Mar. 31, 2005
          • Go to the NEOS Solvers web page. Use the unconstrained optimization package CGplus to solve the extended Rosenbrock function
            f(x) = Σ_{i=1}^{n/2} [ α (x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2 ]
            (see the sample submission for CGplus) with n = 48 variables, parameter values α = 10 and α = 250, and method = 2 (Polak-Ribière).
            Then solve the problem using the unconstrained optimization package NMTR.
            Hand in a copy of the outputs you receive by email, and also send an email copy (e.g. a tar file) to the instructor. In each case, carefully point out the solution and the optimal value. (A small MATLAB sketch of the extended Rosenbrock function, useful for checking the reported optimal value, appears at the end of this item.)
          • Let θ be a fixed parameter, 0 ≤ θ ≤ π/2, and consider the linear program:
            maximize   (cos θ) x1 + (sin θ) x2
            subject to             x1 ≤ 1
                                   x2 ≤ 1
                                   x1, x2 ≥ 0

            Compute an explicit formula for the central path (for the primal and dual variables and the primal and dual slack variables), and evaluate the limit of x_μ first as μ goes to 0 and then as μ goes to infinity. (Problem reference: Linear Programming: Foundations and Extensions by Robert Vanderbei, page 266.)
          • BONUS Exercises: 5.1, 5.2, 5.4.
        • Reading: Text Chapter 5, Section 5.1, The Reduced Gradient Method
          and
          Chapter 6, The Interior Point Approach to Nonlinear Optimization, Sections 6.1, 6.2, 6.3.1, 6.3.2
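        For checking the CGplus and NMTR output, here is a minimal MATLAB sketch (my own; the file name extrosen.m is just illustrative) of the extended Rosenbrock objective from the first exercise above. Evaluating it at the solution a solver reports should reproduce the reported optimal value; the global minimizer is x = (1, ..., 1) with value 0.

          % extrosen.m -- extended Rosenbrock function
          %   f(x) = sum_{i=1}^{n/2} [ alpha*(x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2 ]
          function f = extrosen(x, alpha)
              xo = x(1:2:end-1);                 % odd-indexed components x_{2i-1}
              xe = x(2:2:end);                   % even-indexed components x_{2i}
              f  = sum(alpha*(xe - xo.^2).^2 + (1 - xo).^2);
          end

          % Example use: extrosen(ones(48,1), 10) returns 0, the optimal value.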


    This page is maintained by Henry Wolkowicz.

