Winter Term 2019-20 / Neural Information Processing



Course title

Machine Learning I + Exercises

Course content / topics

Machine Learning is concerned with developing and studying algorithms that learn structure from data. It provides both theoretical concepts for understanding how sensory systems can infer structure from empirical observations and practical tools for data analysis; both sides are stressed equally in this course. The course introduces concepts and algorithms in machine learning with a focus on algorithms that have a statistical (and often Bayesian) interpretation. Besides theory, the course also introduces students to the practical side of machine learning through worked examples and programming exercises.

We will cover only supervised algorithms, which learn an association between inputs and desired outputs (unsupervised algorithms are covered in the Machine Learning II course). This includes linear and nonlinear regression algorithms as well as linear discriminants, logistic regression, nonlinear classification algorithms, and neural networks.
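To give a flavour of the supervised setting described above, here is a minimal sketch of linear regression by least squares on hypothetical toy data (the data and weights are illustrative, not course material; NumPy is assumed):

```python
import numpy as np

# Toy supervised learning: fit y = w0 + w1*x by least squares.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
y = 0.5 + 2.0 * x + rng.normal(scale=0.1, size=50)  # true w0=0.5, w1=2.0

# Design matrix with a bias column; solve the least-squares problem.
X = np.column_stack([np.ones_like(x), x])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

print(w)  # estimated [w0, w1], close to [0.5, 2.0]
```

Logistic regression and neural networks replace the linear map with a nonlinear model, but keep the same input-output learning structure.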

Learning goals

After this course, students should be able to derive some basic machine learning techniques using, e.g., maximum likelihood or maximum a posteriori estimation. Students also learn how to use software tools to analyse small data sets.
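As an illustration of the kind of derivation meant here: for i.i.d. Gaussian data, maximising the log-likelihood in closed form gives the sample mean and the biased sample variance as maximum-likelihood estimates. A hedged sketch checking this numerically on toy data (the numbers are illustrative, not from the course):

```python
import numpy as np

# Maximising sum_i log N(x_i | mu, sigma^2) over mu and sigma^2 yields
# mu_ML = sample mean, sigma2_ML = biased sample variance (divide by N).
rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=2.0, size=10_000)

mu_ml = x.mean()
sigma2_ml = ((x - mu_ml) ** 2).mean()  # N in the denominator, not N-1

print(mu_ml, sigma2_ml)  # close to the true values 3.0 and 4.0
```

The same recipe (write down the likelihood, take the log, set the gradient to zero) underlies many of the estimators derived in the course.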

Prerequisites

Students should have a basic knowledge of linear algebra and probability theory. Some exercise sheets will involve programming; the preferred languages are Python or R. Matlab is acceptable but discouraged.

Suggested reading

Bishop, Pattern Recognition and Machine Learning, 2006.


Hastie, Tibshirani & Friedman, The Elements of Statistical Learning, 2009.

Kuhn & Johnson, Applied Predictive Modeling, 2013.

Day, time & location

Tue, 2-4 pm, GTC Lecture Hall