Theoretical advances in deep learning (IN2107, IN4409)
Course Criteria and Registration
The pre-course meeting will be held on Jun 27, 2024, 16:00 online. You can find the slides here.
Deadline for deregistration: 01.11.2024
Content
Neural networks, particularly deep networks, have achieved unprecedented popularity over the past decade. While the empirical success of neural networks has reached new heights, one of the major achievements in recent years has been new theoretical studies on the statistical performance of neural networks. This seminar will look at the following important topics on neural networks from a mathematical perspective:
- Generalization error for neural networks and related concepts from learning theory
- Optimization and convergence rates for neural networks
- Sample complexity and hardness results
- Connections between deep learning and other learning approaches (e.g., kernel methods)
- Robustness of neural networks
Several recent papers from top machine learning conferences will be discussed during the seminar.
Previous Knowledge Expected
- Machine learning (IN2064)
- Introduction to deep learning (IN2346)
Objective
Upon completion of this module, the students will:
- have acquired knowledge on the current trends in deep learning theory.
- be familiar with recent theoretical works from top machine learning conferences.
- be able to apply mathematical tools to analyze the performance of neural networks.
Details
Each student will be assigned a research paper. The student will have to submit a 3-4 page report/review on the paper (submission deadline in the middle of the semester). Additionally, the presentations will be held together as a block seminar. The slides have to be submitted before the presentation. The final grade will be based on the presentation (60%) and the written report (40%).
Further Information
TBA