Spring 2019 / DM873
Deep Learning

General Information

Machine learning has become part of our everyday lives, from simple product recommendations to personal electronic assistants to self-driving cars. More recently, driven by potent hardware and cheap computational power, “Deep Learning” has become a popular and powerful tool for learning from complex, large-scale data.

In this course, we will discuss the fundamentals of deep learning and its application to a variety of fields. We will learn about the power, but also the limitations, of deep neural networks. By the end of the course, students will be thoroughly familiar with the subject and able to apply the learned techniques to a broad range of fields.

Mainly, the following topics will be covered:

  • feedforward neural networks
  • recurrent neural networks
  • convolutional neural networks
  • backpropagation algorithm
  • regularization
  • factor analysis
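As a small preview of the first two topics, the sketch below runs a forward pass through a tiny feedforward network in NumPy; the layer sizes, weights, and input data are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    # subtract the row-wise max for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# a tiny two-layer feedforward network: 4 inputs -> 8 hidden units -> 3 classes
W1 = rng.normal(scale=0.1, size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 3))
b2 = np.zeros(3)

def forward(X):
    h = relu(X @ W1 + b1)        # hidden layer with ReLU activation
    return softmax(h @ W2 + b2)  # output layer: class probabilities

X = rng.normal(size=(5, 4))      # a batch of 5 made-up examples
probs = forward(X)               # shape (5, 3); each row sums to 1
```

Training such a network (adjusting W1, b1, W2, b2 from data) is exactly what the backpropagation lectures cover.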


# Date Content Slides Comments
1 Mon, 11.02.2019 Introduction here
2 Tue, 12.02.2019 Linear Algebra & Statistics here Chapters 2 & 3
3 Tue, 19.02.2019 Basics of Statistical Learning here Chapter 5
4 Mon, 25.02.2019 OLS & Logistic Regression here Chapter 3 & Chapter 4.3 in ISL
5 Tue, 26.02.2019 Feed Forward Networks here Chapter 6
6 Tue, 05.03.2019 Gradient Based Learning here
7 Mon, 11.03.2019 Back Propagation here
8 Tue, 12.03.2019 CNN here
9 Tue, 19.03.2019 Regularization here
10 Mon, 25.03.2019 Recurrent Neural Networks (updated 26.03.19) here
11 Tue, 26.03.2019 Optimization for Neural Networks here
12 Tue, 02.04.2019 Continuation from last week
13 Mon, 08.04.2019 Autoencoders & PhD Student Talks here
14 Tue, 09.04.2019 PhD Student Talks
15 Tue, 23.04.2019 Generative Models here
16 Tue, 14.05.2019 Recap Session


# Date Questions Download Solutions
1 Mon, 18.02.2019 CANCELLED
2 Wed, 27.02.2019 Regression & Introduction to KERAS Questions
3 Wed, 13.03.2019 Feed Forward Loops Questions, Keras Intro
4 Mon, 18.03.2019 CNN Questions
5 Mon, 01.04.2019 RNN Questions
6 Wed, 10.04.2019 Presentation of Project
7 Tue, 30.04.2019 Project Part 1
8 Tue, 21.05.2019 Project Part 2

PhD Student Talks

Monday, 8th of April

  • Golizheh Mehrooz:
    The One Hundred Layers Tiramisu: Fully Convolutional Dense Nets for Semantic Segmentation
  • Niclas Andersen:

Tuesday, 9th of April

  • Dominika Roszkowska:
    A parameter-efficient deep learning approach to predict conversion from mild cognitive impairment to Alzheimer’s Disease
  • Anne Hartebrodt:
    Differential Privacy (in deep learning)
  • Tobias Frisch:
    Privacy-Preserving/Collaborative Deep Learning
  • Philipp Weber:
    AlphaStar - Lamarckian evolution meets multi agent reinforcement learning


Aim of the Project

In this assignment you will take a look at images of butterflies and the butterfly families they belong to. Butterflies from different families look different, but even butterflies from the same family can vary considerably in appearance: this is a case of high intra-class variance. You will create networks that differentiate between these butterflies and classify the family a given butterfly belongs to, which is not an easy task.
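As one possible starting point, a baseline classifier could be built in Keras, the library introduced in the exercises. In the sketch below the input size and the number of butterfly families are invented placeholders, and the layer choices are just one reasonable default; consult the project description for the actual data format:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_butterfly_classifier(input_shape=(128, 128, 3), n_families=10):
    """A small CNN baseline; all sizes here are illustrative placeholders."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dropout(0.5),  # regularization against overfitting
        layers.Dense(128, activation="relu"),
        layers.Dense(n_families, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_butterfly_classifier()
```

Given the high intra-class variance, expect to iterate on depth, regularization, and data augmentation rather than settle for a first attempt like this one.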

Further Information

You will find all necessary information in the project description. It is fundamentally important that you attend the exercises to discuss your progress and receive feedback on your timely hand-ins.


  • Task 1:
    27th of April
  • Task 2:
    17th of May
  • Task 3 & the final deadline:
    Tuesday the 4th of June 2019 at 23:59 for all 3 tasks.


Procedure of the oral exam

The exam will last about 15-20 minutes. At the beginning, one topic from the list below will be drawn randomly. For each topic the examinee should be prepared to give a short presentation of 5 minutes. It is allowed to bring one page of hand-written notes (DIN A4 or US Letter, one-sided) for each of the topics. The examinee will have 2 minutes to briefly study the notes for the drawn topic before the presentation. The notes may be consulted during the presentation if needed, but doing so will negatively influence the evaluation of the examinee's performance. During the presentation, only the blackboard can be used (you cannot use overhead transparencies, for instance).

After the short presentation, additional questions will be asked, both about the presentation's topic and about other topics in the curriculum.

Below is the list of possible topics, each with some suggested content. The listed content is only a suggestion: it is not necessarily complete, nor must everything be covered in the short presentation. It is the responsibility of the examinee to gather and select the relevant information for each topic from the course material. On the course website you can find suggested readings for each of these topics.

Topics for the Oral Exam:

  1. Feed-Forward Networks
    • Function Principle
    • Output Units
    • Hidden Units
    • ...
  2. Backpropagation
    • Function Principle
    • Computational Graphs
    • Backpropagation through time
    • ...
  3. Regularization
    • Over/Underfitting & Model Capacity
    • Parameter Penalties
    • Bagging
    • Dropout
    • ...
  4. Convolutional Neural Networks
    • Function Principle
    • Pooling
    • Initialization of the kernels
    • ...
  5. Recurrent Neural Networks
    • Function Principle
    • Problems with long term memory
    • Long Short Term Memory
    • ...
  6. Optimization for Neural Networks
    • Parameter Initialization
    • Adaptive Learning
    • Batch Normalization
    • Pre-training
    • ...
  7. Autoencoders and GANs
    • Autoencoders
    • Variational Autoencoders
    • GANs
    • ...