
Winter Term 2017-18 / Neural Information Processing



Course title

Machine Learning I + Exercises

Lecturer
Dijkstra
Credits
5.0
Course content / topics

Objective
The scientific discipline of “machine learning” is concerned with developing and studying algorithms that can learn structure from data. It thus provides both important practical tools for data analysis and theoretical concepts for understanding how sensory systems can infer structure from empirical observations. This course will provide an introduction to important topics and algorithms in machine learning. A particular focus will be on algorithms that have a clear statistical (and often Bayesian) interpretation.

We will cover both supervised algorithms (i.e., algorithms that try to learn an association between inputs and desired outputs) and unsupervised algorithms (which try to build up an internal model from the inputs alone). The “supervised” learning component of the course will include various linear and nonlinear regression algorithms as well as linear discriminants, logistic regression, and nonlinear classification algorithms. The “unsupervised” learning component will include fundamental concepts and algorithms of dimensionality reduction, blind source separation, and clustering.
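
To make the distinction concrete, here is a minimal sketch in MATLAB/Octave (the language used on the exercise sheets). The toy data, the noise level, and all variable names are assumptions made here for illustration; this is not official course material.

  % Supervised vs. unsupervised learning on assumed toy data.
  rng(0);                                 % fix the random seed for reproducibility
  X = randn(100, 2);                      % 100 inputs, each with 2 features
  y = X * [2; -1] + 0.1 * randn(100, 1);  % noisy desired outputs (supervised setting)

  % Supervised: learn the association between inputs X and outputs y.
  w_hat = X \ y;                          % least-squares linear regression

  % Unsupervised: model the inputs X alone, without any outputs.
  Xc = X - mean(X, 1);                    % center the data (implicit expansion, R2016b+/Octave)
  [~, ~, V] = svd(Xc, 'econ');            % principal component analysis via the SVD
  pc1 = V(:, 1);                          % direction of maximal variance in X

In the supervised case the desired outputs y drive the fit; in the unsupervised case only the structure of X itself is modelled.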


Learning targets
In this course, students will learn about important topics and techniques in machine learning, with a particular focus on probabilistic models. The course will cover supervised learning (linear regression algorithms, linear discriminants, logistic regression, nonlinear classification algorithms) and unsupervised learning (principal component analysis and several of its generalizations, k-means, mixtures of Gaussians, Expectation-Maximization).
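
As a rough illustration of one of the listed unsupervised methods, the following MATLAB/Octave sketch runs plain k-means on assumed two-cluster toy data; the choice K = 2, the fixed number of iterations, and the lack of empty-cluster handling are simplifications made here, not part of the course material.

  % Minimal k-means sketch on assumed toy data.
  rng(1);
  X = [randn(50, 2); randn(50, 2) + 4];   % two well-separated toy clusters
  K = 2;
  mu = X(randperm(size(X, 1), K), :);     % initialize centers at random data points

  for iter = 1:20
      % Assignment step: assign each point to its nearest center.
      D = zeros(size(X, 1), K);
      for k = 1:K
          D(:, k) = sum((X - mu(k, :)).^2, 2);   % squared Euclidean distances
      end
      [~, z] = min(D, [], 2);
      % Update step: move each center to the mean of its assigned points.
      for k = 1:K
          mu(k, :) = mean(X(z == k, :), 1);
      end
  end

The alternating assignment and update steps loosely correspond to the E- and M-steps of the Expectation-Maximization view of clustering (mixture of Gaussians) covered in the course.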

Prerequisites
Students should have a basic knowledge of linear algebra and probability theory. The exercise sheets will involve some MATLAB programming, so basic familiarity with MATLAB would be advantageous.
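
As a non-authoritative indication of what “basic familiarity” is taken to mean here, code along the following lines gives a rough idea of that level; the specific snippets are assumptions for illustration, not taken from the exercise sheets.

  % Assumed examples of elementary MATLAB usage.
  A = randn(3, 3);              % random 3x3 matrix
  b = [1; 2; 3];                % column vector
  x = A \ b;                    % solve the linear system A*x = b
  t = linspace(0, 2*pi, 100);   % 100 points between 0 and 2*pi
  plot(t, sin(t));              % elementary plotting
  xlabel('t'); ylabel('sin(t)');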

Suggested reading
Christopher M. Bishop (2007). Pattern Recognition and Machine Learning. Springer.
Trevor Hastie, Robert Tibshirani, Jerome Friedman (2009). The Elements of Statistical Learning. Springer.

Day, time & location

Tue, 2-4 pm, GTC Lecture Hall