# Statistical physics for optimization & learning

PHYS-642 / **4 credits**

**Teacher(s):** Krzakala Florent Gérard, Zdeborová Lenka, Loureiro Bruno, Saglietti Luca

**Language:** English

**Remark:** Next time: Spring 2021

## Frequency

Every 2 years

## Summary

This course covers the statistical physics approach to computer science problems ranging from graph theory and constraint satisfaction to inference and machine learning. In particular, it covers the replica and cavity methods, message-passing algorithms, and the analysis of the related phase transitions.

## Content

Interest in the methods and concepts of statistical physics is rapidly growing in fields as diverse as theoretical computer science, probability theory, machine learning, discrete mathematics, optimization, and signal processing. In the last decades, in particular, there has been an increasing convergence of interests and methods between theoretical physics and computer science: much theoretical and applied work has relied on message-passing algorithms and their connection to the statistical physics of glasses and spin glasses.

This course will cover this rich and active interdisciplinary research landscape. Specifically, we will review the statistical physics approach to problems ranging from graph theory (percolation, community detection) to discrete optimization and constraint satisfaction (satisfiability, coloring, bisection) and to inference and machine learning problems (learning in neural networks, clustering of data and of networks, compressed sensing or sparse linear regression, low-rank matrix and tensor factorization, etc.).

We will present theoretical methods of analysis (replica, cavity, ...) and algorithms (message passing, gradient descent, spectral methods, etc.), discuss concrete applications, highlight rigorous justifications, and present the connection to the physics of glassy and disordered systems.
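To make the message-passing idea concrete, here is a minimal sketch (not taken from the course materials) of sum-product belief propagation on a small Ising chain, checked against brute-force marginalization. The coupling `J` and field `h` are illustrative values chosen here, not parameters from the lecture; on a tree (and a chain is a tree), BP is exact, which the comparison confirms.

```python
import itertools
import numpy as np

J, h = 0.5, 0.2          # illustrative uniform coupling and external field
spins = [-1, 1]

def brute_force_marginal(n=3):
    """Exact marginal P(s_0 = +1) by summing over all 2^n configurations."""
    Z, p_up = 0.0, 0.0
    for config in itertools.product(spins, repeat=n):
        w = np.exp(J * sum(config[i] * config[i + 1] for i in range(n - 1))
                   + h * sum(config))
        Z += w
        if config[0] == 1:
            p_up += w
    return p_up / Z

def bp_marginal(n=3):
    """Sum-product on a chain: pass messages from the far end toward spin 0."""
    msg = np.ones(2)                 # uniform message entering the last spin
    for _ in range(n - 1):
        # new[s] = sum over the sender's spin s' of exp(J s s' + h s') * msg[s']
        new = np.zeros(2)
        for i, s in enumerate(spins):
            new[i] = sum(np.exp(J * s + h * sp) * msg[j] if False else
                         np.exp(J * s * sp + h * sp) * msg[j]
                         for j, sp in enumerate(spins))
        msg = new / new.sum()        # normalize for numerical stability
    belief = np.array([np.exp(h * s) for s in spins]) * msg
    belief /= belief.sum()
    return belief[1]                 # probability that spin 0 is +1

# BP is exact on trees, so the two computations agree to machine precision.
print(abs(bp_marginal() - brute_force_marginal()) < 1e-10)
```

On loopy graphs (the setting of most problems in the course), the same update is iterated to a fixed point and is only approximate; the cavity method analyzes exactly when and how well it works.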

The course is designed to be accessible to graduate students and researchers from all natural science, engineering, and mathematics disciplines with a knowledge of basic concepts in probability and analysis. Advanced training in any of the above fields is not required.

## Note

**Website of the lecture: https://sphinxteam.github.io/EPFLDoctoralLecture2021/**

Mainly a theory course, with exercises on the analytical methods and the use of the related algorithms.

Evaluation is based on homework assignments given throughout the semester.

## Keywords

Statistical physics, replica method, cavity method, neural networks, theory of machine learning, combinatorial optimization, community detection, graphical models, message-passing algorithms.

## Learning Prerequisites

## Required courses

Basic probability and/or statistical physics

## Learning Outcomes

By the end of the course, the student must be able to:

- Study a range of problems in computer science and learning, and derive formulas and algorithms for their solution, using techniques from statistical physics.

## In the programs

**Exam form:** During the semester (session free)
**Subject examined:** Statistical physics for optimization & learning
**Lecture:** 28 hour(s)
**Exercises:** 28 hour(s)

