1.1 These definition questions were done well, with only minor mistakes here and there.

1.2 Some students still seem confused about the difference between constrained and unconstrained optimization, and were unsure of the difference between the necessary and the sufficient optimality conditions in this question.

2.1 Most students attempted to show that the closest point was unique, but few explained clearly why f(x) is strictly convex. No one attempted to show the existence of such a closest point.

2.2 Many students either could not recall this theorem or gave the version from the textbook instead of the one from class.

2.3 This question was not done well, especially by those who could not recall the hyperplane separation theorem given in class.

2.4 Some students were able to find the closest point to w, but only a couple were then able to write down a separating hyperplane.

2.5 Many students did not attempt this question; those who did gave only a vague, non-algebraic argument.

3.1 No problems here.

3.2 Some students said the active set was {g1(x), g2(x)}; it should contain only the indices, i.e. {1, 2}.

3.3 Most students found the gradients of f(x), g1(x), and g2(x) and used them to check the KKT necessary conditions, but could not explain why the conditions fail: some concluded that xbar = (1,14)^T was not optimal, while others gave no explanation at all. Only a couple of students realized that the feasible set contains only the point (1,14)^T, but they did not explain that this means the problem is not Slater regular, so the KKT conditions need not hold at the optimum.

3.4 Again, most students were able to verify that the KKT necessary conditions do hold. Some also showed that the new problem is a convex optimization problem (by showing that the new g1(x) is convex). However, no one explicitly stated that for a convex optimization problem the KKT conditions are also sufficient, which is what allows one to conclude that xbar = (1,14)^T is a global minimum.
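On 2.1, the intended existence-and-uniqueness argument can be sketched as follows. This is a sketch under the standard setup, where the closest point to w in a nonempty closed convex set C in R^n is found by minimizing f(x) = ||x - w||^2; the exact problem data are not restated in this feedback.

```latex
\textbf{Existence.} Fix any $x_0 \in C$ and set
$B = \{\, x \in \mathbb{R}^n : \|x - w\| \le \|x_0 - w\| \,\}$. Then
\[
  \inf_{x \in C} \|x - w\|^2 \;=\; \inf_{x \in C \cap B} \|x - w\|^2 ,
\]
and $C \cap B$ is nonempty, closed, and bounded, hence compact, so the
continuous function $f(x) = \|x - w\|^2$ attains its infimum on it
(Weierstrass).

\textbf{Uniqueness.} $f$ has Hessian $\nabla^2 f(x) = 2I \succ 0$, so $f$ is
strictly convex, and a strictly convex function has at most one minimizer
over a convex set: if $x_1 \ne x_2$ were both minimizers, convexity of $C$
gives $\tfrac{x_1 + x_2}{2} \in C$, while strict convexity gives
\[
  f\!\left(\tfrac{x_1 + x_2}{2}\right)
  \;<\; \tfrac{1}{2} f(x_1) + \tfrac{1}{2} f(x_2)
  \;=\; \min_{x \in C} f(x),
\]
a contradiction.
```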
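On 3.3, the Slater-regularity point can be made precise. The following is the general definition, not the specific constraint functions from the exam:

```latex
The problem $\min f(x)$ subject to $g_i(x) \le 0$, $i = 1, \dots, m$, is
\emph{Slater regular} if there is a strictly feasible point, i.e.\ some
$\hat{x}$ with
\[
  g_i(\hat{x}) < 0 \quad \text{for all } i
\]
(for affine $g_i$, feasibility $g_i(\hat{x}) \le 0$ suffices). If the
feasible set is the single point $\bar{x} = (1, 14)^T$, no strictly feasible
$\hat{x}$ can exist, so Slater regularity fails, and the KKT conditions are
then not guaranteed to hold at $\bar{x}$, even though $\bar{x}$ is trivially
optimal.
```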
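On 3.4, the missing final step is the standard sufficiency theorem for convex problems, which can be stated and justified briefly:

```latex
If $f$ and all $g_i$ are convex and $(\bar{x}, \bar{\lambda})$ satisfies the
KKT conditions
\[
  \nabla f(\bar{x}) + \sum_{i=1}^{m} \bar{\lambda}_i \nabla g_i(\bar{x}) = 0,
  \qquad \bar{\lambda}_i \ge 0,
  \qquad \bar{\lambda}_i \, g_i(\bar{x}) = 0,
\]
then $\bar{x}$ is a global minimizer. Indeed, the Lagrangian
$L(x, \bar{\lambda}) = f(x) + \sum_i \bar{\lambda}_i g_i(x)$ is convex in $x$
and stationary at $\bar{x}$, so $\bar{x}$ minimizes it; hence, for every
feasible $x$ (using complementary slackness and $\bar{\lambda}_i g_i(x) \le 0$),
\[
  f(\bar{x}) = L(\bar{x}, \bar{\lambda}) \le L(x, \bar{\lambda}) \le f(x).
\]
Invoking this theorem is what allows the conclusion that
$\bar{x} = (1, 14)^T$ is a global minimum.
```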