MATH-329 / 5 credits

Teacher: Boumal Nicolas

Language: English


Summary

This course introduces students to continuous, nonlinear optimization. We study the theory of optimization with continuous variables (with full proofs), and we analyze and implement important algorithms to solve constrained and unconstrained problems.

Content

Unconstrained optimization of differentiable functions

  • Necessary optimality conditions
  • The role of Lipschitz assumptions
  • Gradient descent and Newton's method
  • Convexity
  • The trust-region method, CG, truncated CG
  • Nonlinear least-squares
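As a taste of the algorithms listed above, here is a minimal sketch of gradient descent with the fixed step size 1/L, under the assumption (studied in the course) that the gradient is L-Lipschitz. The function names and the quadratic test problem are illustrative choices, not course material.

```python
import numpy as np

def gradient_descent(grad, x0, L, max_iters=1000, tol=1e-8):
    """Gradient descent with fixed step 1/L, assuming grad is L-Lipschitz.

    Stops when the gradient norm drops below tol (an approximate
    first-order critical point).
    """
    x = x0
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - g / L  # descent step of size 1/L along -grad(x)
    return x

# Example: minimize f(x) = 0.5 x'Ax - b'x with A positive definite.
# Its gradient Ax - b is Lipschitz with constant L = ||A||_2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.norm(A, 2)  # spectral norm
x_star = gradient_descent(lambda x: A @ x - b, np.zeros(2), L)
```

For this convex quadratic, the iterates converge to the unique minimizer, which solves the linear system Ax = b.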

Constrained optimization of differentiable functions

  • Necessary optimality conditions, cones
  • Convexity
  • Projected gradient descent
  • Notions of duality
  • The quadratic penalty method
  • The augmented Lagrangian method
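Among the constrained methods above, projected gradient descent is the simplest to sketch: take a gradient step, then project back onto the (closed, convex) feasible set. The helper names and the unit-ball example below are illustrative assumptions, not taken from the course.

```python
import numpy as np

def projected_gradient(grad, proj, x0, step, max_iters=500):
    """Projected gradient descent for min f(x) over a closed convex set.

    Each iteration takes a gradient step and projects the result back
    onto the feasible set via proj.
    """
    x = proj(x0)
    for _ in range(max_iters):
        x = proj(x - step * grad(x))
    return x

# Example: minimize f(x) = ||x - c||^2 over the unit ball {x : ||x|| <= 1}.
c = np.array([2.0, 0.0])
grad = lambda x: 2.0 * (x - c)
proj = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto the ball
x_star = projected_gradient(grad, proj, np.zeros(2), step=0.25)
```

Here the solution is simply the projection of c onto the ball, namely (1, 0); the step size 0.25 satisfies the usual requirement step <= 1/L with L = 2 for this gradient.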

Related topics and extensions may be included in lectures or through exercises / homework.

Note: precise contents may change during the semester, and from year to year.


Learning Prerequisites

Required courses

Students are expected to be comfortable with linear algebra, analysis and mathematical proofs. Lectures, homework and the final exam are proof-heavy.

Students are expected to be (or become) comfortable writing code in Matlab. They may be allowed to write some of their work in Python or Julia upon request. Homework requires a substantial amount of coding. We will not teach Matlab, but many online resources are available to help you.


Learning Outcomes

By the end of the course, the student must be able to:

  • Recognize and formulate a mathematical optimization problem.
  • Analyze and implement the gradient descent method, Newton's method, the trust-region method and the augmented Lagrangian method, among others.
  • Establish and discuss local and global convergence guarantees for iterative algorithms.
  • Exploit elementary notions of convexity and duality in optimization.
  • Apply the general theory to particular cases.
  • Prove some of the most important theorems studied in class.

Teaching methods

Lectures + exercise sessions + extensive homework (in groups)

Expected student activities

Students are expected to attend lectures and participate actively in class and exercises. Exercises include both theoretical work and programming assignments. Students also complete homework assignments that include theoretical and numerical work. The homework assignments require a substantial amount of work throughout the semester, and accordingly account for a substantial part of the final grade. They are done in groups.

Assessment methods

Final exam (40%) + homework (60%) -- the split may change depending on how many TAs are available for the course, and will be fixed during the first week.


The overall grade (computed as above) is rounded up or down to the closest quarter of a point: up if the (individual) exam grade is a passing grade, down if not.

Supervision

Office hours No
Assistants Yes

Resources

Bibliography

Book "Numerical Optimization", J. Nocedal and S. Wright, Springer 2006: https://link.springer.com/book/10.1007/978-0-387-40065-5

Library resources

Notes/Handbook

Lecture notes provided by the lecturer: https://www.nicolasboumal.net/papers/MATH329-Lecture_notes_Boumal_2023.htm


In the programs

  • Semester: Fall
  • Exam form: Written (winter session)
  • Subject examined: Continuous optimization
  • Courses: 2 Hour(s) per week x 14 weeks
  • Exercises: 2 Hour(s) per week x 14 weeks
  • Type: optional

Reference week

Wednesday, 10h - 12h: Lecture MAB111

Wednesday, 13h - 15h: Exercise, TP MAA112
