Machine Learning
This award-winning introductory Machine Learning lecture teaches the foundations of, and concepts behind, a wide range of common machine learning models. It uses a combination of engaging lectures, challenging mathematical exercises, practically oriented programming tasks, and insightful tutorials. The lecture received the TeachInf 2020 award.
The Machine Learning lecture for WS21/22 will again be held online. We will upload videos of the lectures and tutorials and provide pointers to other reference materials. Additionally, we will offer live online Q&A sessions (every Wednesday from 12 to 2 pm).
Important: The first session, in which we will discuss organizational topics, will be held live on Wednesday, October 20th, at 1 pm (not 12 pm!).
A link to the Zoom call is available on the Piazza forum and in Moodle.
Please do not send any questions about organizational matters via e-mail. Use the Q&A session and, after that, the Piazza forum.
If you have problems accessing the Moodle course, contact jan.schuchardt [at] in.tum.de.
Tentative list of topics
- Introduction
- What is machine learning?
- Typical tasks in ML
- k-Nearest neighbors
- kNN for classification and regression
- Distance functions
- Curse of dimensionality
- Decision trees
- Constructing & pruning decision trees
- Basics of information theory
- Probabilistic inference
- Parameter estimation
- Maximum likelihood principle
- Maximum a posteriori
- Full Bayesian approach
- Linear regression
- Linear basis function models
- Overfitting
- Bias-variance tradeoff
- Model selection
- Regularization
- Linear classification
- Perceptron algorithm
- Generative / discriminative models for classification
- Linear discriminant analysis
- Logistic regression
- Optimization
- Gradient-based methods
- Convex optimization
- Stochastic gradient descent
- Deep learning
- Feedforward neural networks
- Backpropagation
- Structured data: CNNs, RNNs
- Training strategies
- Frameworks
- Advanced architectures
- Support vector machines
- Maximum margin classification
- Soft-margin SVM
- Kernel methods
- Kernel trick
- Kernelized linear regression
- Dimensionality reduction
- Principal component analysis
- Singular value decomposition
- Probabilistic PCA
- Matrix factorization
- Autoencoders
- Clustering
- k-means
- Gaussian mixture models
- EM algorithm
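As a small taste of the material, the k-nearest-neighbors classifier from the topic list above can be sketched in a few lines of NumPy. This is a minimal illustration with made-up toy data, not course code:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 2-D dataset: class 0 clustered near the origin, class 1 near (5, 5)
X = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.3],
              [5.0, 5.1], [4.8, 5.0], [5.2, 4.9]])
y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X, y, np.array([0.5, 0.5])))  # → 0 (near the origin cluster)
print(knn_predict(X, y, np.array([4.5, 5.5])))  # → 1 (near the (5, 5) cluster)
```

The same distance-and-vote idea extends to kNN regression (average the neighbors' targets instead of voting), and the choice of distance function and of k connects directly to the "Distance functions" and "Curse of dimensionality" topics above.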
Literature
- Pattern Recognition and Machine Learning. Christopher Bishop. Springer-Verlag New York. 2006.
- Machine Learning: A Probabilistic Perspective. Kevin Murphy. MIT Press. 2012.
Prerequisites
- Good understanding of Linear Algebra, Analysis, Probability and Statistics.
- Programming experience (preferably in Python).