PORTALE DELLA DIDATTICA




Advanced Deep Learning (didattica di eccellenza)

01UMNRV

A.A. 2020/21

Course Language

English

Degree programme(s)

Doctorate Research in Electrical, Electronics and Communications Engineering - Torino

Course structure
Teaching Hours
Lectures 30
Lecturers
Teacher | Status | SSD | h.Lec | h.Ex | h.Lab | h.Tut | Years teaching
Pasero Eros Gian Alessandro | Associate Professor | IINF-01/A | 2 | 0 | 0 | 0 | 2
Co-lectures

Context
SSD CFU Activities Area context
*** N/A ***    
PERIOD: - March (Prof. Giansalvo Cirrincione)

The course is organized in three parts. The first reviews the concepts explained in the previous course on deep learning, so that this course can be attended even by students with no prerequisites in neural networks. The second covers sequence modeling, which is essential whenever time must be taken into account, as in natural language processing and forecasting. The final part describes more speculative ideas in deep learning.
This course is the continuation of the previous course on deep learning, which dealt with convolutional neural networks (CNNs). However, it is a self-contained course and does not require any prior knowledge of neural networks. First, the analysis of sequences is detailed, and for this purpose recurrent neural networks (RNNs) are introduced. Next comes the study of generative adversarial networks (GANs), currently among the best-known neural algorithms. The third part is dedicated to deep reinforcement learning (DRL), whose applications are well known: AlphaZero and AlphaGo for chess and Go, self-driving cars (Tesla), and so on. The last lesson is dedicated to graph neural networks, which operate on structured relational data (graphs).
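To illustrate the sequence-modeling idea behind the RNN part of the course (a hidden state carried across time steps by a recurrence), here is a minimal sketch of a scalar vanilla RNN. This is not course material; all weights and function names are invented for the example.

```python
import math

def rnn_step(x, h, w_xh, w_hh, b):
    """One vanilla RNN step: h' = tanh(w_xh*x + w_hh*h + b), scalar case."""
    return math.tanh(w_xh * x + w_hh * h + b)

def run_rnn(xs, w_xh=0.5, w_hh=0.8, b=0.0):
    """Process a sequence, carrying the hidden state across time steps."""
    h = 0.0
    hidden_states = []
    for x in xs:
        h = rnn_step(x, h, w_xh, w_hh, b)
        hidden_states.append(h)
    return hidden_states

# An input at t=0 still influences later hidden states via the recurrence,
# which is exactly what makes RNNs suited to time series and language.
hs = run_rnn([1.0, 0.0, 0.0])
```

In a real network the scalars become weight matrices and vectors, but the recurrence structure is unchanged.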
Online synchronous mode
Oral presentation
P.D.2-2 - March
CALENDAR:
1. Tuesday, 9 March, 9:30-12:30 - Introduction to neural networks
   i. Definitions ii. Multilayer Perceptron iii. Gradient-based methods iv. Generalization v. Regularization vi. Data preprocessing vii. Batch normalization viii. Transfer learning
2. Wednesday, 10 March, 14:30-17:30 - Computational graphs and backpropagation
   i. Autograd ii. Tensors
3. Tuesday, 16 March, 9:30-12:30 - Convolutional neural networks 1
   i. Basic ideas ii. 1-d CNN
4. Wednesday, 17 March, 14:30-17:30 - Convolutional neural networks 2
   i. LeNet-5 ii. AlexNet iii. ZFNet iv. VGGNet v. GoogLeNet vi. ResNet vii. Network in Network viii. FractalNet ix. SqueezeNet
5. Tuesday, 23 March, 9:30-12:30 - Recurrent neural networks 1
   i. Vanilla RNN ii. RNN computational graphs iii. Language model iv. Interpreting cells v. RNN movies vi. Backpropagation through time
6. Wednesday, 24 March, 14:30-17:30 - Recurrent neural networks 2
   i. LSTM units ii. GRU units iii. Deep RNN iv. Bidirectional RNN v. Image captioning vi. Blood pressure prediction
7. Tuesday, 30 March, 15:00-18:00 - Transformers 1
   i. Neural machine translation ii. Attention iii. Self-attention iv. Transformer encoder
8. Wednesday, 31 March, 9:00-12:00 - Transformers 2
   i. Transformer decoder ii. Influenza time series forecasting iii. BERT
9. Monday, 12 April, 15:00-18:00 - Generative Adversarial Networks 1
   i. Basic ideas ii. Mathematical theory iii. Wasserstein GAN
10. Tuesday, 13 April, 9:30-12:30 - Generative Adversarial Networks 2
   i. DCGAN ii. Semi-supervised GAN iii. BERT GAN iv. Conditional GAN v. CycleGAN vi. Application to medicine
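The 10 March lecture covers computational graphs, backpropagation, and autograd. As a hedged sketch of that idea (not the course's actual code), reverse-mode automatic differentiation on scalars can be implemented in a few lines: each operation records its parents and local derivatives, and `backward` sweeps the graph in reverse topological order accumulating gradients.

```python
class Scalar:
    """Minimal reverse-mode autograd node (illustrative only)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # sequence of (parent_node, local_gradient)

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Scalar(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Scalar(self.value * other.value,
                      [(self, other.value), (other, self.value)])

    def backward(self):
        """Backpropagate d(self)/d(node) through the computational graph."""
        self.grad = 1.0
        # Topological order via depth-first traversal of the graph.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for parent, _ in node._parents:
                    visit(parent)
                order.append(node)
        visit(self)
        # Chain rule: each node passes its gradient back to its parents.
        for node in reversed(order):
            for parent, local in node._parents:
                parent.grad += local * node.grad

x = Scalar(3.0)
y = Scalar(4.0)
z = x * y + x   # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

Frameworks such as PyTorch apply the same recipe to tensors rather than scalars, which is the "Autograd / Tensors" pairing in the lecture outline.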