Deep Learning 2019/2020 (QHD 1920 - Video & Folien)
Prof. Dr. Andreas Maier
Deep Learning (DL) has attracted much interest in a wide range of applications such as image recognition, speech recognition and artificial intelligence, both from academia and industry. This lecture introduces the core elements of neural networks and deep learning. It comprises:

- (multilayer) perceptron, backpropagation, fully connected neural networks
- loss functions and optimization strategies
- convolutional neural networks (CNNs)
- activation functions
- regularization strategies
- common practices for training and evaluating neural networks
- visualization of networks and results
- common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
- recurrent neural networks (RNN, TBPTT, LSTM, GRU)
- deep reinforcement learning
- unsupervised learning (autoencoder, RBM, DBM, VAE)
- generative adversarial networks (GANs)
- weakly supervised learning
- applications of deep learning (segmentation, object detection, speech recognition, ...)

The accompanying exercises will provide a deeper understanding of the workings and architecture of neural networks.
https://itunes.video.uni-erlangen.de/course/itunesu/849/QHD/COMBINED
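To give a flavour of the first topics on the list above (multilayer perceptron, backpropagation, fully connected networks, loss functions, optimization), here is a minimal illustrative sketch, not taken from the course materials: a small fully connected network trained with backpropagation and gradient descent on the XOR problem in plain NumPy. The network size, learning rate, and number of steps are arbitrary choices for illustration.

# Minimal sketch (assumption: illustrative only, not part of the lecture code):
# a 2-4-1 fully connected network with sigmoid activations,
# trained with backpropagation and gradient descent on XOR.
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases of the two fully connected layers
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: linear layer -> sigmoid -> linear layer -> sigmoid
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Mean squared error loss (the lecture also covers other loss functions)
    loss = np.mean((out - y) ** 2)

    # Backpropagation: apply the chain rule layer by layer
    d_out = 2.0 * (out - y) / len(X) * out * (1 - out)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
print("predictions:", out.ravel().round(3))

The same forward/backward structure carries over to the deeper architectures covered later in the lecture; in practice, frameworks compute the gradients automatically instead of writing them out by hand as done here.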