Convex Optimization
This is the webpage for a 2010 course at the Weizmann Institute.
February 22nd through March 10th: Mondays and Wednesdays 10:00-12:00, Ziskind 1
March 15th through 24th: Mondays 9:00-11:00 at Ziskind 286, Wednesdays 9:00-11:00 at Ziskind 1
Final exam: April 14th 10am
Lecturer: Nati Srebro, TTI-Chicago.
Course Description
The course will cover techniques in unconstrained and constrained
convex optimization and a practical introduction to convex duality.
The course will focus on (1) formulating and understanding convex
optimization problems and studying their properties; (2) presenting
and understanding optimization approaches; and (3) understanding the
dual problem. Limited theoretical analysis of convergence properties
of methods will be presented. Examples will be mostly from data
fitting and machine learning.
Specific Topics:
- Formalization of optimization problems
- Smooth unconstrained optimization: Gradient Descent, Conjugate Gradient Descent, Newton and Quasi-Newton methods; Line Search methods
- Standard formulations of constrained optimization: Linear, Quadratic and Semi-Definite Programming
- KKT optimality conditions
- Lagrangian duality, constraint qualification, weak and strong duality
- Fenchel conjugacy and its relationship to Lagrangian duality
- Equality constrained Newton method
- Log Barrier (Central Path) methods, and Primal-Dual optimization methods
Text Books
The required textbook for the class is Boyd and Vandenberghe,
Convex Optimization. The book is available online. About 80% of the
material covered in the class can be found in this book.
Supplemental recommended books:
In particular, additional material on unconstrained optimization
techniques, not covered by Boyd and Vandenberghe, can be found in the
first two of the above three books.
Requirements and Grading:
There will be three homework assignments, counting toward 40%
of the grade (13% each). Assignments must be typed (not
handwritten) and submitted electronically in PDF.
The remaining 60% of the grade will be based on a final exam.
Students will also be expected to complete reading assignments from
Boyd and Vandenberghe that supplement, and provide background for,
topics discussed in class. The material from these reading assignments
will be covered by the problem sets and the exam, and will be assumed
in lectures.
Lectures and Required Reading:
- Monday, February 22nd
- Convex and Non-Convex Optimization Problems
- Feasibility, Optimality, Sub-optimality, Local Optimality
- Unconstrained Optimization: Optimality Condition
- Descent Direction, Gradient Descent, Line Search
Sections 1.1-1.4, 4.1-4.2, 9.1-9.3
Required Reading: Sections 2.1-2.3, 2.5, 3.1-3.2
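The descent-method material of this first lecture can be sketched in a few lines. Below is a minimal gradient descent with backtracking (Armijo) line search; the quadratic test function and the parameter values alpha, beta, and the tolerance are illustrative assumptions, not taken from the course.

```python
import numpy as np

def backtracking(f, g, x, dx, alpha=0.3, beta=0.8):
    """Shrink t until the sufficient-decrease (Armijo) condition holds."""
    t = 1.0
    while f(x + t * dx) > f(x) + alpha * t * g @ dx:
        t *= beta
    return t

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # small gradient => near-stationary
            break
        dx = -g                       # steepest-descent direction
        x = x + backtracking(f, g, x, dx) * dx
    return x

# Illustrative strictly convex quadratic: the minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = gradient_descent(f, grad, np.zeros(2))
```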
- Wednesday, February 24th
Assignment 1 out.
- Analysis of Gradient Descent with Exact and Backtracking line
search.
- Newton's Method
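A minimal sketch of damped Newton's method with backtracking, stopping on the Newton decrement as in Section 9.5 of Boyd and Vandenberghe; the test function and parameter values are illustrative assumptions.

```python
import numpy as np

def newton(f, grad, hess, x0, tol=1e-10, max_iter=50):
    """Damped Newton's method: stop when half the squared
    Newton decrement, lambda^2/2, falls below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        dx = np.linalg.solve(H, -g)   # Newton step
        if -g @ dx / 2.0 < tol:       # lambda^2 = -g'dx = g'H^{-1}g
            break
        t = 1.0                       # backtracking with alpha=0.25, beta=0.5
        while f(x + t * dx) > f(x) + 0.25 * t * g @ dx:
            t *= 0.5
        x = x + t * dx
    return x

# Illustrative smooth, strictly convex test function
f = lambda x: np.exp(x[0]) + x[0] ** 2 + x[1] ** 2
grad = lambda x: np.array([np.exp(x[0]) + 2 * x[0], 2 * x[1]])
hess = lambda x: np.array([[np.exp(x[0]) + 2.0, 0.0], [0.0, 2.0]])
x_star = newton(f, grad, hess, np.array([1.0, 1.0]))
```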
- Monday, March 1st
Optional enrichment problem set 1 ½ available.
- Self-concordant analysis (Section 9.6).
- Conjugate Gradient Descent and Quasi-Newton Methods (Bertsekas Sections 1.6-1.7; optional enrichment problem set).
- Summary of unconstrained optimization.
- Constrained optimization: formulation and optimality condition (Section 4.2.3).
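The conjugate gradient material above can be sketched for the linear case: CG for Ax = b with A symmetric positive definite, which is the same as minimizing the quadratic ½ x'Ax − b'x, converging in at most n steps in exact arithmetic. The test matrix is an illustrative assumption.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10):
    """Linear conjugate gradient for SPD A: successive search
    directions are A-conjugate, so at most n iterations are needed."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x                         # residual = negative gradient
    d = r.copy()                          # first direction: steepest descent
    for _ in range(n):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)  # conjugacy-preserving update
        d = r_new + beta * d
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = conjugate_gradient(A, b)
```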
- Wednesday, March 3rd
- Linear Programming (Section 4.3)
- The Lagrangian, the dual problem, weak duality, Slater's condition and strong duality (Sections 5.1, 5.2, 5.4, 5.5.1)
- The dual of a non-convex problem; The dual of a Linear Programming problem.
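As a worked instance of the Lagrangian dual, the dual of an inequality-form LP (following Section 5.2 of Boyd and Vandenberghe) is again an LP:

```latex
% Inequality-form LP and its Lagrangian (with multiplier \lambda \ge 0):
\min_x\ c^T x \quad \text{s.t.}\ Ax \le b,
\qquad
L(x,\lambda) = c^T x + \lambda^T (Ax - b)
% Minimizing L over x gives g(\lambda) = -b^T\lambda when
% A^T\lambda + c = 0, and -\infty otherwise, so the dual problem is:
\max_\lambda\ -b^T \lambda \quad \text{s.t.}\ A^T \lambda + c = 0,\ \ \lambda \ge 0
```

Since the primal LP is convex with affine constraints, strong duality holds whenever the primal is feasible.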
- Monday, March 8th
- Geometric interpretation of duality and Slater's condition (Section 5.3)
- Using duality to solve equality constrained quadratic programming.
- Complementary slackness (5.5.2).
- KKT conditions (5.5.3).
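Spelled out for the standard-form problem min f₀(x) subject to fᵢ(x) ≤ 0 and hⱼ(x) = 0, the KKT conditions of Section 5.5.3 read:

```latex
\begin{align*}
f_i(x^\star) \le 0,\ \ h_j(x^\star) = 0 \quad & \text{(primal feasibility)} \\
\lambda_i^\star \ge 0 \quad & \text{(dual feasibility)} \\
\lambda_i^\star f_i(x^\star) = 0 \quad & \text{(complementary slackness)} \\
\nabla f_0(x^\star) + \sum_i \lambda_i^\star \nabla f_i(x^\star)
  + \sum_j \nu_j^\star \nabla h_j(x^\star) = 0 \quad & \text{(stationarity)}
\end{align*}
```

For a convex problem satisfying Slater's condition, these conditions are both necessary and sufficient for optimality.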
- Wednesday, March 10th
Assignment 2 Out
Reading before lecture: Sections 3.3, 2.4.1, 2.6.1, 2.6.2
- Representing the Dual in terms of the Fenchel Conjugate of the objective (3.3, 5.1.6, 5.7.1).
- Minimum volume covering ellipsoid and the log-det function.
- Generalized inequality constraints: Positive Semi-Definite constraints and Semi-Definite Programming (4.6).
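The Fenchel conjugate used above is defined in Section 3.3; a standard worked example is the conjugate of a positive definite quadratic:

```latex
% Definition of the Fenchel (convex) conjugate:
f^*(y) = \sup_{x \in \operatorname{dom} f} \big( y^T x - f(x) \big)
% Example: for f(x) = \tfrac12 x^T Q x with Q \succ 0, the supremum
% is attained where y = Qx, i.e. x = Q^{-1} y, giving
f^*(y) = \tfrac12\, y^T Q^{-1} y
```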
- Monday, March 15th
Assignment 1 Due
- Examples of problems with semi-definite constraints; Semi-Definite Programming (4.6).
- Duality and optimality conditions for semi-definite constraints (5.9).
- Wednesday, March 17th
- Norm-constrained matrix factorization as an example of an SDP and its dual.
- Equality constrained Newton's method (10.1-10.2).
- Log-barrier problems and the central path interior point method (11.1, 11.2, 11.3.1).
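The barrier method of Sections 11.2-11.3 can be sketched for an inequality-form LP: repeatedly minimize t·c'x − Σ log(bᵢ − aᵢ'x) by Newton's method, then increase t. The box-constrained example, the parameters t0 and mu, and the tolerances below are illustrative assumptions.

```python
import numpy as np

def barrier_lp(c, A, b, x0, t0=1.0, mu=10.0, tol=1e-8):
    """Barrier-method sketch for min c'x s.t. Ax <= b. The starting
    point x0 must be strictly feasible; the duality gap after each
    centering step is bounded by m/t, which drives the outer loop."""
    x = np.asarray(x0, dtype=float)
    t, m = t0, len(b)
    while m / t > tol:
        phi = lambda z, t=t: (np.inf if np.any(b - A @ z <= 0)
                              else t * c @ z - np.sum(np.log(b - A @ z)))
        for _ in range(50):                       # centering by damped Newton
            s = b - A @ x                         # slacks (strictly positive)
            g = t * c + A.T @ (1.0 / s)
            H = A.T @ np.diag(1.0 / s ** 2) @ A
            dx = np.linalg.solve(H, -g)
            if -g @ dx / 2.0 < 1e-10:             # Newton decrement test
                break
            step = 1.0                            # backtrack: feasibility + Armijo
            while phi(x + step * dx) > phi(x) + 0.25 * step * g @ dx:
                step *= 0.5
            x = x + step * dx
        t *= mu
    return x

# Illustrative LP: min x1 + x2 over the box 0 <= x <= 1 (optimum at 0)
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([0.0, 0.0, 1.0, 1.0])
c = np.array([1.0, 1.0])
x_star = barrier_lp(c, A, b, np.array([0.5, 0.5]))
```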
- Monday, March 22nd
- Analysis of the log-barrier central path method (11.5).
- Feasibility problems and Phase I methods (11.4.1, 11.4.3).
- Reducing Optimization to feasibility.
- Log-barrier method for semi-definite constraints (11.6).
- Log-barrier and weakened KKT conditions (10.3.4).
- Primal-Dual interior point methods, including infeasible starts (11.7).
- Wednesday, March 24th
Assignment 3 Out
- The simplex method as an example of an active set method (Further reading: Nocedal and Wright Chapter 13; Spielman and Teng's paper on smoothed analysis of the simplex method).
- First order methods: subgradients and separation oracles, center-of-mass method, oracle lower bound, the Ellipsoid method (Chapters 1-3 of Nemirovski's lecture notes on "Efficient Methods in Convex Programming")
- Subgradient descent, smoothed subgradient descent, bundle methods, and stochastic subgradient descent, for very large scale problems (Chapters 7,8,10 and 4 of Nemirovski's lecture notes on "Efficient Methods in Convex Programming", as well as Chapter 5 of Nemirovski's lecture notes on "Modern Convex Optimization")
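The subgradient method above can be sketched in a few lines: since subgradient steps need not decrease the objective, the method uses diminishing step sizes and tracks the best iterate. The step rule 1/√k and the ℓ₁ test problem are illustrative assumptions.

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, steps=5000):
    """Subgradient method with diminishing steps a_k = 1/sqrt(k),
    returning the best point seen (the iterates themselves may not descend)."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, steps + 1):
        g = subgrad(x)
        x = x - g / np.sqrt(k)
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Illustrative nonsmooth objective: f(x) = ||x - u||_1, minimized at u
u = np.array([1.0, -2.0])
f = lambda x: np.sum(np.abs(x - u))
subgrad = lambda x: np.sign(x - u)     # a subgradient of the l1 distance
x_best, f_best = subgradient_descent(f, subgrad, np.zeros(2))
```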
Assignments:
- Assignment 1: Convexity and Unconstrained Optimization (Note corrected definition of Strong Convexity in Problem 3; "½" instead of "2" in Problem 6(a); and additional hint in Problem 4(c))
- Optional Enrichment Assignment 1: Conjugate Gradient Descent and Quasi-Newton Methods.
- Assignment 2: Duality (Updated March 24th: correction to 1(b))
- Assignment 3: Constrained Optimization
Last modified: Wed Mar 24 16:40:05 Jerusalem Standard Time 2010