Statistical Machine Learning


Statistical Machine Learning - BE4M33SSU

Credits: 6
Semesters: Winter
Completion: Assessment + Examination
Language of teaching: English
Extent of teaching: 2P+2C (2 hours of lectures and 2 hours of labs per week)
Annotation
The aim of statistical machine learning is to develop systems (models and algorithms) for learning to solve tasks given a set of examples and some prior knowledge about the task. This includes typical tasks in speech and image recognition. The course has the following two main objectives:
1. to present fundamental learning concepts such as risk minimisation, maximum likelihood estimation and Bayesian learning including their theoretical aspects,
2. to consider important state-of-the-art models for classification and regression and to show how they can be learned by those concepts.
Study targets
The aim of statistical machine learning is to develop systems (models and algorithms) for learning to solve tasks given a set of examples and some prior knowledge about the task.
Course outlines
The course will cover the following topics:
- Empirical risk minimization, consistency, bounds
- Maximum Likelihood estimators and their properties
- Unsupervised learning, EM algorithm, mixture models
- Bayesian learning
- Deep (convolutional) networks
- Supervised learning for deep networks
- Hidden Markov models
- Structured output SVMs
- Ensemble learning, random forests
Exercises outlines
Labs will be dedicated to practical implementations of selected methods discussed in the lectures, as well as to seminar classes with task-oriented assignments.
Literature
1. M. Mohri, A. Rostamizadeh and A. Talwalkar, Foundations of Machine Learning, MIT Press, 2012
2. K.P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012
3. T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning, Springer, 2010
4. I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016
Requirements
Prerequisites of the course are:
- foundations of probability theory and statistics, comparable in scope to the course "Probability, statistics and information theory" (A0B01PSI),
- knowledge of the foundations of statistical decision theory, canonical and advanced classifiers, and the basics of machine learning, comparable in scope to the course "Pattern Recognition and Machine Learning" (AE4B33RPZ).