MATH-685 / 2 credits
Remark: Fall semester, Thursdays from 14:15 to 16:15. The course will be mathematically rigorous; the goal is to make the material as comprehensible as possible.
Only this year
This course gives a brief overview of how to prove consistency results in nonparametric regression. In particular, we will focus on least-squares regression estimators. Connections to the empirical risk minimization (ERM) problem will be discussed from time to time.
The objective of this course is to introduce students to some of the mathematical notions widely used in nonparametric regression and to demystify concepts such as empirical process theory, metric entropy, and concentration inequalities. After the course, students should understand why and how these concepts are used in proving consistency and deriving rates of convergence in nonparametric regression. They should also be able to apply and adapt these methods to their own problems.
1. Introduction to nonparametric regression. The class of least-squares estimators.
2. Empirical risk minimization (ERM) and its connection to least-squares. Some related problems: classification, density estimation.
3. The bias-variance decomposition.
4. Tools from probability: basic and advanced tail bounds.
5. Metric entropy.
6. How to show consistency.
7. How to derive the rate of convergence.
8. Some advanced tools from empirical process theory.
Throughout the course, particular focus will be given to the class of least-squares regression estimators. Connections to the general ERM problem will also be made from time to time.
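As a compact illustration of the setting the course studies (the notation below is a standard sketch, not taken from the course materials): given i.i.d. data with regression function m(x) = E[Y | X = x], the least-squares estimator over a function class F minimizes the empirical squared error, and its risk splits into an estimation term and an approximation term, the decomposition behind the bias-variance topic above.

```latex
% Data: (X_1, Y_1), \dots, (X_n, Y_n) i.i.d.; regression function m(x) = \mathbb{E}[Y \mid X = x].
% Least-squares estimator over a function class \mathcal{F}:
\hat m_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}} \ \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - f(X_i) \bigr)^2 .
% The L^2(\mu) risk decomposes as
\int |\hat m_n - m|^2 \, d\mu
  \;=\;
  \underbrace{\int |\hat m_n - m|^2 \, d\mu \;-\; \inf_{f \in \mathcal{F}} \int |f - m|^2 \, d\mu}_{\text{estimation error (variance-type term)}}
  \;+\;
  \underbrace{\inf_{f \in \mathcal{F}} \int |f - m|^2 \, d\mu}_{\text{approximation error (bias-type term)}} .
```

Bounding the estimation error is where the tail bounds, metric entropy, and empirical process tools from the course content come in.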
Basic knowledge of probability and statistics is required. Familiarity with asymptotic theory will be assumed; familiarity with the basics of regression is recommended.
In the programs
- Number of places: 20
- Exam form: Oral presentation (session free)
- Subject examined: Learning Theory of Nonparametric Regression
- Lecture: 24 Hour(s)