Optimal Control for Dynamic Systems
EE-736 / 3 credits
Lecturer(s): Faulwasser Timm, Jiang Yuning
Language: English
Remark: Next offered in Spring 2026
Frequency
Every 2 years
Summary
This doctoral course provides an introduction to optimal control covering fundamental theory, numerical implementation and problem formulation for applications.
Content
- Recap of finite dimensional optimization and numerical methods for optimization
- Fundamentals of calculus of variations and optimization in function spaces
- Closed-loop and open-loop optimal control
- Calculus of variations and optimal control
- Pontryagin's Maximum Principle
- Numerical optimal control
- Singular problems and minimum time control
- Dissipativity and optimal control
- Hamilton-Jacobi-Bellman equations
- Sampled-data predictive control
- Research outlook
- Exercises: pen-and-paper and programming exercises, adapted to the individual background of the students
Learning Outcomes
By the end of the course, the student must be able to:
- Solve control problems arising in their research projects by means of optimal control approaches.
Assessment methods
Oral presentation.
Resources
Bibliography
- LIBERZON, Daniel. Calculus of Variations and Optimal Control Theory: A Concise Introduction. Princeton University Press, 2011.
In the study plans
- Number of places: 30
- Exam form: Oral presentation (free session)
- Subject examined: Optimal Control for Dynamic Systems
- Lectures: 32 hour(s)
- Exercises: 12 hour(s)
- Type: optional
- Number of places: 30
- Exam form: Oral presentation (free session)
- Subject examined: Optimal Control for Dynamic Systems
- Lectures: 32 hour(s)
- Exercises: 12 hour(s)
- Type: mandatory