Master of science-level of the Bologna process in Ict For Smart Societies (Ict Per La Societa' Del Futuro) - Torino
Master of science-level of the Bologna process in Ingegneria Civile - Torino
Master of science-level of the Bologna process in Nanotechnologies For Icts (Nanotecnologie Per Le Ict) - Torino/Grenoble/Losanna
Master of science-level of the Bologna process in Physics Of Complex Systems (Fisica Dei Sistemi Complessi) - Torino/Trieste/Parigi
Master of science-level of the Bologna process in Civil Engineering - Torino
The medium of instruction is English.
The course introduces statistical and machine learning methods for processing data and signals in order to extract information from them. Such methods are based on models that are typically learned from training data. In particular, the course covers statistical methods, where the data are modelled as a set of measured (or acquired) samples of a random process, and neural networks, where the model is a suitable combination of basic units (neurons) and the model parameters are found through numerical optimization.
The first part of the course deals with statistical learning in the maximum likelihood and Bayesian sense. The techniques presented during the course are applied to classification problems in several domains of ICT, including classification of continuous as well as discrete data, and dimensionality reduction using principal component analysis and kernel methods. In the second part of the course, feedforward and convolutional neural networks are introduced. Finally, statistical tracking filters are presented. Throughout the course, the concepts above are illustrated with practical examples taken from ICT applications. Half of the course takes place in the LAIB laboratories, where students implement and assess the methods discussed during the lectures. Most of the activities are performed using Matlab and Python, whereas the TensorFlow framework (also based on Python) is employed for neural network experiments. This course is designed jointly with the course “ICT for health”, with the objective of providing students with a coordinated “machine learning” approach that can be applied to several ICT problems; in particular, this course deals primarily with machine learning basics, classification and neural networks, while the “ICT for health” course addresses regression and clustering topics.
The medium of instruction is English.
The course introduces statistical and machine learning methods for processing data and signals in order to extract information from them. Such methods are based on models that are typically learned from training data. In particular, the course covers statistical methods, where the data are modelled as a set of measured (or acquired) samples of a random process, and neural networks, where the model is a suitable combination of basic units (neurons) and the model parameters are found through numerical optimization.
The first part of the course deals with statistical learning in the maximum likelihood and Bayesian sense. The techniques presented during the course are applied to classification problems in several domains of ICT, including classification of continuous as well as discrete data, and dimensionality reduction using principal component analysis and kernel methods. In the second part of the course, feedforward and convolutional neural networks are introduced. Finally, statistical tracking filters are presented. Throughout the course, the concepts above are illustrated with practical examples taken from ICT applications. A large part of the course takes place in the LAIB laboratories, where students implement and assess the methods discussed during the lectures. The lab activities are performed using the Python language, possibly within cloud-based environments such as Google Colab. This course is designed jointly with the course “ICT for health”, with the objective of providing students with a coordinated “machine learning” approach that can be applied to several ICT problems; in particular, this course deals primarily with machine learning basics, classification and neural networks, while the “ICT for health” course addresses regression and clustering topics.
1. Knowledge of the basics of statistical learning.
2. Knowledge of the theory and methods of classification for continuous and discrete data using maximum likelihood and Bayesian methods.
3. Knowledge of dimensionality reduction techniques and kernel methods.
4. Knowledge of neural networks: training/testing and applications.
5. Knowledge of tracking filters.
6. Ability to use the Matlab and Python programming languages, as well as the TensorFlow framework.
1. Knowledge of the basics of statistical learning.
2. Knowledge of the theory and methods of classification for continuous and discrete data using maximum likelihood and Bayesian methods.
3. Knowledge of dimensionality reduction techniques and kernel methods.
4. Knowledge of neural networks: training/testing and applications.
5. Knowledge of tracking filters.
6. Ability to use the Python programming language, as well as the TensorFlow framework.
The student must know the following concepts:
1. Random variables/processes and probability density function, mean and variance
2. Linear algebra
3. Programming in the Matlab language
At the beginning of the course, random variables and multivariate distributions are briefly reviewed.
The student must know the following concepts:
1. Random variables/processes and probability density function, mean and variance
2. Linear algebra
3. Programming in the Python language
At the beginning of the course, random variables and multivariate distributions are briefly reviewed.
Discrete-time random signals and processes, probability distributions, model fitting (1.2CFU), including:
• Introduction to machine learning; supervised vs. unsupervised learning, regression vs. classification, k-NN classifier (see the sketch after this list), the curse of dimensionality, overfitting.
• Review of probability, continuous vs. discrete random variables, Bayes’ rule, independence, common probability distributions, Gaussian multivariate distribution.
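As a minimal illustration of the k-NN classifier listed above, the following NumPy sketch classifies points by majority vote among their nearest neighbours; the synthetic data are purely illustrative and not the material used in the labs.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest training points (Euclidean distance)."""
    y_pred = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances to all training samples
        nn = np.argsort(d2)[:k]                         # indices of the k closest samples
        labels, counts = np.unique(y_train[nn], return_counts=True)
        y_pred.append(labels[np.argmax(counts)])        # majority vote
    return np.array(y_pred)

# Usage on synthetic 2-D data drawn from two Gaussian clusters
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal([0, 0], 1.0, (50, 2)), rng.normal([3, 3], 1.0, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
print(knn_predict(X_train, y_train, np.array([[0.5, 0.5], [2.5, 3.0]]), k=5))
```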
Foundations of estimation theory (1.2CFU), including:
• Maximum likelihood estimation (see the sketch after this list)
• Bayesian and MAP estimation
• Application to Gaussian processes
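A minimal sketch of maximum likelihood estimation for the mean and variance of a scalar Gaussian (the ML estimates are the sample mean and the sample variance normalised by N); the true parameter values below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=0.5, size=1000)    # i.i.d. samples from N(2, 0.25)

mu_ml = np.mean(x)                               # ML estimate of the mean
var_ml = np.mean((x - mu_ml) ** 2)               # ML estimate of the variance (divide by N, not N-1)
print(mu_ml, var_ml)                             # should be close to 2 and 0.25
```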
Classification and latent linear models (1.2CFU), including:
• Hypothesis testing
• Bayesian decision theory and Bayesian classifiers (see the sketch after this list)
• Performance assessment of classifiers
• Principal component analysis
• Kernel methods and support vector machines
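A possible sketch of a Bayesian classifier with Gaussian class-conditional densities, applying the MAP decision rule; it uses NumPy and SciPy on synthetic data, which is only one way of implementing the ideas listed above.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_gaussian_bayes(X, y):
    """Fit one multivariate Gaussian per class, together with the class priors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False), len(Xc) / len(X))
    return params

def predict_gaussian_bayes(params, X):
    """Assign each sample to the class with the largest posterior (MAP decision rule)."""
    log_post = np.column_stack([
        np.log(prior) + multivariate_normal(mean=mu, cov=cov).logpdf(X)
        for mu, cov, prior in params.values()
    ])
    classes = np.array(list(params.keys()))
    return classes[np.argmax(log_post, axis=1)]

# Synthetic two-class example
rng = np.random.default_rng(2)
X = np.vstack([rng.normal([0, 0], 1.0, (100, 2)), rng.normal([2, 2], 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
params = fit_gaussian_bayes(X, y)
print(np.mean(predict_gaussian_bayes(params, X) == y))   # training accuracy
```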
Feedforward and convolutional deep neural networks (1.8CFU), including:
• Perceptrons and architecture of a neural network
• Backpropagation algorithm
• Loss functions, overfitting and regularization
• Convolutional networks and deep learning
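A sketch of a small feedforward classifier trained with backpropagation in Keras/TensorFlow; the layer sizes, optimizer, and random data below are placeholder assumptions, not the configuration used in the labs.

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 1000 samples, 20 features, 3 classes
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 20)).astype("float32")
y = rng.integers(0, 3, size=1000)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),     # hidden layer
    tf.keras.layers.Dense(3, activation="softmax"),   # class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))
```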
Tracking filters (0.6CFU), including:
• Kalman filter
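A compact sketch of a linear Kalman filter tracking a 1-D position with a constant-velocity model; the noise covariances and measurements are illustrative assumptions.

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # only the position is measured
Q = 0.01 * np.eye(2)                       # process-noise covariance (assumed)
R = np.array([[0.5]])                      # measurement-noise covariance (assumed)

def kalman_step(x, P, z):
    # Prediction
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measurement z
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

x, P = np.array([[0.0], [0.0]]), np.eye(2)  # initial state estimate and covariance
for z in ([[1.1]], [[2.0]], [[2.9]]):
    x, P = kalman_step(x, P, np.array(z))
print(x.ravel())                            # estimated position and velocity
```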
Discrete-time random signals and processes, probability distributions, model fitting (1.2CFU), including:
• Introduction to machine learning; supervised vs. unsupervised learning, regression vs. classification, k-NN classifier, the curse of dimensionality, overfitting.
• Review of probability, continuous vs. discrete random variables, Bayes’ rule, independence, common probability distributions, Gaussian multivariate distribution.
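As a small illustration of the multivariate Gaussian distribution listed above, the following sketch fits a mean vector and covariance matrix to synthetic samples and evaluates the resulting density (NumPy and SciPy; all numerical values are made up).

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)
X = rng.multivariate_normal(mean=[1.0, -1.0],
                            cov=[[1.0, 0.3], [0.3, 0.5]], size=500)

mu_hat = X.mean(axis=0)                 # sample mean vector
Sigma_hat = np.cov(X, rowvar=False)     # sample covariance matrix

# Evaluate the fitted density at a few points
density = multivariate_normal(mean=mu_hat, cov=Sigma_hat)
print(mu_hat, density.pdf([[1.0, -1.0], [0.0, 0.0]]))
```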
Foundations of estimation theory (1.2CFU), including:
• Maximum likelihood estimation
• Bayesian and MAP estimation (see the sketch after this list)
• Application to Gaussian processes
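A minimal sketch of MAP estimation of a Gaussian mean with known variance and a Gaussian prior on the mean; all numerical values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
sigma2 = 1.0                  # known observation variance
mu0, tau2 = 0.0, 4.0          # prior on the mean: mu ~ N(mu0, tau2)
x = rng.normal(loc=1.5, scale=np.sqrt(sigma2), size=50)

n, x_bar = x.size, x.mean()
# The posterior of mu is Gaussian, so the MAP estimate equals the posterior mean:
# a precision-weighted combination of the prior mean and the sample mean.
mu_map = (mu0 / tau2 + n * x_bar / sigma2) / (1.0 / tau2 + n / sigma2)
print(x_bar, mu_map)          # the MAP estimate is pulled slightly towards the prior mean
```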
Classification and latent linear models (0.6CFU), including:
• Hypothesis testing
• Bayesian decision theory and Bayesian classifiers
• Performance assessment of classifiers
• Principal component analysis and kernel methods
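A possible NumPy sketch of principal component analysis for dimensionality reduction, computed from the SVD of the centred data matrix; the data are synthetic and the lab exercises may be organised differently.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project the rows of X onto the first n_components principal directions."""
    Xc = X - X.mean(axis=0)                               # centre the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components].T                               # principal directions as columns
    explained = (S[:n_components] ** 2) / np.sum(S ** 2)  # fraction of variance kept
    return Xc @ W, W, explained

# Example: reduce synthetic 5-dimensional data to 2 components
rng = np.random.default_rng(6)
X = rng.normal(size=(200, 5))
Z, W, expl = pca_reduce(X, 2)
print(Z.shape, expl)
```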
Feedforward and convolutional deep neural networks (2.4CFU), including:
• Perceptrons and architecture of a neural network
• Backpropagation algorithm
• Loss functions, overfitting and regularization
• Convolutional networks and deep learning
• Neural networks for classification, segmentation and object detection
• Neural networks quantization
• Transformers and generative adversarial networks
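A small convolutional classifier in Keras/TensorFlow, sketching the kind of network listed above; the input size, filter counts and random data are placeholder assumptions.

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 256 grayscale 28x28 images, 10 classes
rng = np.random.default_rng(7)
X = rng.random((256, 28, 28, 1)).astype("float32")
y = rng.integers(0, 10, size=256)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
model.summary()
```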
Tracking filters (0.6CFU), including:
• Kalman filter
Half of the course takes place in the LAIB laboratories, where students implement and assess the methods discussed during the lectures. Most of the activities are performed using Matlab, whereas Python and TensorFlow are employed for neural network experiments.
A large part of the course takes place in the LAIB laboratories, where students implement and assess the methods discussed during the lectures. The lab activities are performed using the Python language, possibly within cloud-based environments such as Google Colab.
The following texts will be used during the course:
Regarding statistical models and machine learning, the reference book is:
K.P. Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press, 2012.
Regarding neural networks, the reference book is the online book “Neural networks and deep learning” (2017), available at http://neuralnetworksanddeeplearning.com
The following texts will be used during the course:
Regarding statistical models and machine learning, the reference book is:
K.P. Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press, 2012.
Regarding neural networks, there is no reference book. The online book “Neural networks and deep learning” (2017), available at http://neuralnetworksanddeeplearning.com, can be used as a reference for some parts of the course.
Lecture slides; Textbook;
Exam: Compulsory oral exam; Group essay;
The exam aims at assessing the knowledge and understanding of the topics described during the course, and the ability to critically discuss such topics.
The exam is oral. Each student must bring the report describing the experiments carried out during the computer labs, including the obtained results. The exam typically starts from a discussion of one or more experiments done during the computer labs, and can extend to span the whole set of topics presented during the course. Considerable emphasis is placed on the theory underlying each experiment.
Possible topics addressed during the oral examination are:
- a description of the considered system (or algorithm) and of its simulation model,
- the theoretical background of the methods used in the Matlab-based experiments,
- a discussion of the main parameters adopted in the experiments,
- a clear presentation and discussion of the obtained results,
- a discussion of the methods used to evaluate the performance of the algorithms,
- a comparison with theoretical results (when applicable)
The oral presentation is evaluated based on its correctness, the level of knowledge that the student has acquired on the topic, and the ability to apply the acquired know-how, to communicate the technical material clearly and with accurate terminology, and to correctly analyze, interpret, and comment on the obtained results. During the oral examination, students may not consult any material other than the reports of the computer labs. The score of the oral examination is up to “30 e lode”, and the student passes the exam if they obtain at least 18/30.
Exam: Compulsory oral exam; Group essay;
The exam aims at assessing the knowledge and understanding of the topics described during the course, and the ability to critically discuss such topics.
The exam is oral and is also based on the report describing the experiments carried out during the computer labs, including the obtained results. The exam typically starts from a discussion of one or more experiments done during the computer labs, and can extend to span the whole set of topics presented during the course. Considerable emphasis is placed on the theory underlying each experiment.
Possible topics addressed during the oral examination are:
- a description of the considered system (or algorithm) and of its simulation model,
- the theoretical background of the methods used in the lab experiments,
- a discussion of the main parameters adopted in the experiments,
- a clear presentation and discussion of the obtained results,
- a discussion of the methods used to evaluate the performance of the algorithms,
- a comparison with theoretical results (when applicable)
The oral presentation is evaluated based on its correctness, the level of knowledge that the student has acquired on the topic, and the ability to apply the acquired know-how, to communicate the technical material clearly and with accurate terminology, and to correctly analyze, interpret, and comment on the obtained results. During the oral examination, students may not consult any material other than the reports of the computer labs. The score of the oral examination is up to “30 e lode”, and the student passes the exam if they obtain at least 18/30.
In addition to the message sent by the online system, students with disabilities or Specific Learning Disorders (SLD) are invited to directly inform the professor in charge of the course about the special arrangements for the exam that have been agreed with the Special Needs Unit. The professor has to be informed at least one week before the beginning of the examination session in order to provide students with the most suitable arrangements for each specific type of exam.