MGT-448 / 4 credits

Teacher: Kiyavash Negar

Language: English


Summary

This course aims to provide graduate students with a thorough grounding in the methods, theory, mathematics, and algorithms needed to conduct research and build applications in machine learning. The course covers topics from machine learning, classical statistics, and data mining.

Content

List of topics:

  • General Introduction
  • Supervised Learning, Discriminative Algorithms:
    Supervised Learning Concept, Linear Regression, Maximum Likelihood, Normal Equation, Gradient Descent, Stochastic Gradient, SVRG (see the regression sketch after this list)
    Linear Classification, Logistic Regression, Newton's Method
  • Generative Algorithms:
    Multivariate Normal, Linear Discriminant Analysis
    Naive Bayes, Laplacian Smoothing
    Multiclass Classification, K-NN
    Multi-class Fisher Discriminant Analysis, Multinomial Regression
  • Support Vector Machines and Kernel Methods:
    Intuition, Geometric Margins, Optimal Margin Classifier
    Lagrangian Duality, Soft Margin, Loss Functions, Stochastic Subgradient Method, Kernels, SMO Algorithm, Coordinate Gradient Descent
    Kernel PCA, Kernel Logistic Regression, Kernel Ridge Regression, Multiclass SVM
  • Unsupervised Learning:
    PCA, Mixture Models, Bayesian Graphical Models
    Power Method, Oja's Algorithm, EM Algorithm, Variational Inference, Matrix Factorization/Completion (see the PCA sketch after this list)
  • Regularization and Model Selection:
    Cross Validation, Hill Climbing, Bayesian Optimization, Bayesian Regression, Bayesian Logistic Regression
    Forward and Backward Regression, Lasso, Elastic Net, Proximal Gradient, Prox-SVRG
    Coordinate Proximal Gradient, Pathwise Coordinate Descent
  • Decision Tree and Random Forest:
    Entropy, Tree Building
    Bagging Features, Bagging Samples, Random Forest, AdaBoost, Gradient Tree Boosting
  • Neural Networks:
    Concept, Deep Neural Networks, Backpropagation, Convolutional Neural Networks
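
As a minimal illustration of the regression topics above (not part of the official course material), the following Python/NumPy sketch fits a linear model to synthetic data with both the normal equation and batch gradient descent; the data, variable names, and constants are invented for the example.

    # Illustrative sketch only: linear regression via the normal equation
    # and via batch gradient descent on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: y = X @ w_true + noise, with an intercept column
    n, d = 200, 3
    X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, d))])
    w_true = np.array([1.0, 2.0, -0.5, 0.3])
    y = X @ w_true + 0.1 * rng.normal(size=n)

    # Normal equation: solve X^T X w = X^T y (avoid explicit inversion)
    w_ne = np.linalg.solve(X.T @ X, X.T @ y)

    # Batch gradient descent on the least-squares objective (1/2n)||Xw - y||^2
    w_gd = np.zeros(d + 1)
    step = 0.1
    for _ in range(2000):
        grad = X.T @ (X @ w_gd - y) / n
        w_gd -= step * grad

    print("normal equation :", np.round(w_ne, 3))
    print("gradient descent:", np.round(w_gd, 3))

Both estimates recover essentially the same weights; the normal equation solves the problem in closed form, while gradient descent (and the stochastic variants listed above) scales to settings where forming and solving X^T X is impractical.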
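
Likewise, a small sketch (illustrative only; the data and names are invented) of the power method listed under Unsupervised Learning: the leading principal component of centered data is obtained by repeatedly multiplying a vector by the sample covariance matrix and renormalizing.

    # Illustrative sketch only: leading principal component via power iteration.
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic 2-D data with most of its variance along a known direction
    n = 500
    u = np.array([3.0, 1.0]) / np.hypot(3.0, 1.0)   # target direction
    u_perp = np.array([-u[1], u[0]])
    Z = rng.normal(size=(n, 2)) * np.array([3.0, 0.3])
    X = Z @ np.vstack([u, u_perp])
    X -= X.mean(axis=0)                              # center the data

    C = X.T @ X / n                                  # sample covariance

    # Power iteration: repeatedly apply C and renormalize
    v = rng.normal(size=2)
    for _ in range(100):
        v = C @ v
        v /= np.linalg.norm(v)

    # Compare with the top eigenvector from a dense eigendecomposition
    # (eigenvectors are defined only up to sign)
    print("power method:", np.round(v, 3))
    print("eigh        :", np.round(np.linalg.eigh(C)[1][:, -1], 3))

Oja's algorithm, also listed above, performs essentially the same update in streaming fashion, one sample at a time.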

Keywords

Supervised and unsupervised learning, Model selection, Generative models.

Learning Prerequisites

Required courses

A course in basic probability theory.

Recommended courses

Courses in linear algebra and statistics.

Important concepts to start the course

Students should be familiar with basic concepts of probability theory, calculus and linear algebra.

Learning Outcomes

By the end of the course, the student must be able to:

  • Formulate supervised and unsupervised learning problems and apply them to data.
  • Understand and apply generative models.
  • Understand and train basic neural networks and apply them to data.

Transversal skills

  • Assess one's own level of skill acquisition, and plan one's ongoing learning goals.

Teaching methods

 

Classical formal teaching interlaced with practical exercices.

Expected student activities

Active participation in exercise sessions is essential.

Assessment methods

30% Homework

20% Midterm project

50% Final project

Supervision

Office hours: Yes
Assistants: Yes
Forum: No

Resources

Moodle Link

In the programs

  • Semester: Fall
  • Exam form: Written (winter session)
  • Subject examined: Statistical inference and machine learning
  • Lecture: 2 Hour(s) per week x 14 weeks
  • Exercises: 2 Hour(s) per week x 14 weeks
  • Type: optional

