Politecnico di Torino
Academic Year 2017/18
04JTZBH Statistical Signal Processing
Master of Science in ICT for Smart Societies (ICT per la Società del Futuro) - Torino
Presentation
The course introduces statistical and machine learning methods for processing signals and extracting information from them. The main goal is to describe techniques and methods designed to extract information from a finite data set, which can be modelled as a sequence of measured (or acquired) samples of a random signal. Such methods are based either on a signal model or on a model learned from training data.
The course initially reviews the foundations of discrete-time random signals. The first part of the course deals with parameter estimation theory in both the maximum likelihood and the Bayesian sense. The second part extends these notions to hypothesis testing and classification. Next, feedforward and convolutional neural networks are introduced. Finally, statistical tracking filters are presented. Throughout the course, these concepts are illustrated with practical examples taken from ICT applications. Half of the course takes place in the LAIB laboratories, where students implement and assess the methods discussed during the lectures. Most of the activities are performed using Matlab and Python, whereas the TensorFlow framework (also based on Python) is employed for the neural network experiments. This course is designed jointly with the course "ICT for health", with the objective of providing students with a coordinated "machine learning" approach that can be applied to several ICT problems; in particular, this course deals primarily with machine learning basics, classification and neural networks, while the "ICT for health" course addresses regression and clustering.
Expected learning outcomes
1. Knowledge of the foundations of discrete-time random signals
2. Knowledge of estimation theory and methods
3. Knowledge of classification theory and methods
4. Knowledge of neural networks: training/testing and applications
5. Knowledge of tracking filters
6. Ability to use the Matlab and Python programming frameworks, as well as the TensorFlow environment.
Prerequisites / Assumed knowledge
The student must know the following concepts of probability theory and signal processing:
1. Random variables/processes and probability density functions, mean and variance
2. Linear time-invariant (LTI) systems
3. Linear algebra
4. Programming in the Matlab language
At the beginning of the course, random variables and processes are briefly reviewed.
Program
Discrete-time random signals and processes, probability distributions, model fitting (1.2 CFU), including:
- Introduction to machine learning: supervised vs. unsupervised learning, regression vs. classification, the k-NN classifier, the curse of dimensionality, overfitting
- Review of probability: continuous vs. discrete random variables, Bayes' rule, independence, common probability distributions, the multivariate Gaussian distribution, transformations of random variables
Foundations of estimation theory (1.2 CFU), including:
- Maximum likelihood estimation
- Bayesian and MAP estimation
- Application to Gaussian processes
Classification and latent linear models (1.2 CFU), including:
- Hypothesis testing
- Bayesian decision theory and Bayesian classifiers
- Performance assessment of a classifier
- Principal component analysis
- Basics of sparse models
Feedforward and convolutional deep neural networks (1.8 CFU), including:
- Perceptrons and the architecture of a neural network
- The backpropagation algorithm
- Loss functions, overfitting and regularization
- Convolutional networks and deep learning
Tracking filters (0.6 CFU), including:
- Kalman filtering
- Basics of particle filtering
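As an informal illustration of the maximum likelihood estimation topic listed above (this sketch is not course material; the function name and sample sizes are assumptions for the example), the ML estimators of the mean and variance of a Gaussian from i.i.d. samples are the sample mean and the 1/n-normalized sample variance:

```python
import random

def gaussian_ml_estimate(samples):
    """ML estimates of (mean, variance) for i.i.d. Gaussian samples."""
    n = len(samples)
    mu_hat = sum(samples) / n  # ML estimate of the mean (sample mean)
    # ML estimate of the variance: note the 1/n normalization (biased),
    # as opposed to the 1/(n-1) unbiased sample variance.
    var_hat = sum((x - mu_hat) ** 2 for x in samples) / n
    return mu_hat, var_hat

random.seed(0)
# Draw 100000 samples from N(mu=5, sigma=2), i.e. variance 4.
data = [random.gauss(5.0, 2.0) for _ in range(100000)]
mu_hat, var_hat = gaussian_ml_estimate(data)
print(mu_hat, var_hat)  # close to 5.0 and 4.0
```

With this many samples the estimates land very close to the true parameters, which illustrates the consistency of the ML estimator discussed in the estimation theory module.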
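Similarly, as an informal taste of the Kalman filtering topic (again not course material; the scalar model, noise levels, and function names are assumptions chosen for simplicity), a one-dimensional filter tracking a constant state observed in Gaussian noise can be sketched as:

```python
import random

def kalman_step(x_est, p_est, z, q=1e-5, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.

    Model (assumed for this example): x_k = x_{k-1} + process noise (var q),
    z_k = x_k + measurement noise (var r).
    """
    # Predict: the constant-state model leaves x unchanged; covariance grows by q.
    p_pred = p_est + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p_pred / (p_pred + r)          # Kalman gain in [0, 1]
    x_new = x_est + k * (z - x_est)    # corrected state estimate
    p_new = (1.0 - k) * p_pred         # corrected error covariance
    return x_new, p_new

random.seed(1)
true_x = 3.0
x_est, p_est = 0.0, 1.0  # rough initial guess with large uncertainty
for _ in range(500):
    z = true_x + random.gauss(0.0, 0.5)  # noisy measurement (variance 0.25)
    x_est, p_est = kalman_step(x_est, p_est, z)
print(x_est)  # converges toward the true value 3.0
```

The gain starts large (trusting the measurements while the initial guess is poor) and shrinks as the covariance settles, which is the qualitative behaviour studied in the tracking filters module.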
Course organization
Half of the course takes place in the LAIB laboratories, where students implement and assess the methods discussed during the lectures. Most of the activities are performed using Matlab, whereas Python and TensorFlow are employed for the neural network experiments.
Required or recommended texts: readings, lecture notes, other teaching material
The following texts will be used during the course:
- For statistical models and machine learning, the reference book is: K.P. Murphy, "Machine Learning: A Probabilistic Perspective", MIT Press, 2012.
- Parts of the material may also be found in: Steven M. Kay, "Fundamentals of Statistical Signal Processing: Estimation Theory", Prentice Hall, 1993.
- For neural networks, the reference is the online book "Neural Networks and Deep Learning" (2017), available at: http://neuralnetworksanddeeplearning.com
Exam criteria, rules and procedures
The exam is oral. It consists of a discussion of the experiments carried out during the computer labs. Each student has to prepare a report containing a description of the experiments, including the obtained results. The report will be used during the oral examination to discuss in detail the experiments selected by the professor.
Possible topics addressed during the oral examination are:
- a description of the considered system (or algorithm) and of its simulation model,
- the theoretical background of the methods used in the Matlab-based experiments,
- a discussion of the main parameters adopted in the experiments,
- a clear presentation and discussion of the obtained results,
- a discussion of the methods used to evaluate the performance of the algorithms,
- a comparison with theoretical results (when applicable).
The oral presentation is evaluated based on its correctness, the level of knowledge the student has acquired on the topic, and the ability to apply the acquired know-how, to communicate the technical material clearly and with accurate terminology, and to correctly analyze, interpret, and comment on the obtained results. During the oral examination, students may not consult any material other than the reports of the computer labs. The score of the oral examination is up to "30 e lode".
Lecture timetable
Exam pass statistics