
Geometric Learning, Time-Variant Data analysis, and Anomaly Detection

01HZSSM, 01HZSWS

Academic Year 2025/26

Course Language

English

Degree programme(s)

Master of science-level of the Bologna process in Data Science And Engineering - Torino

Course structure
Teaching                    Hours
Lectures                    40
Classroom practice          20
Laboratory practice         20

Lecturers
Teacher               Status               SSD        h.Lec  h.Ex  h.Lab  h.Tut  Years teaching
Vaccarino Francesco   Associate Professor  MATH-02/B  40     20    0      0      1

Context
SSD         CFU   Activities                      Area context
MAT/03      4     C - Related or supplementary    Related or supplementary learning activities
SECS-S/01   4     C - Related or supplementary    Related or supplementary learning activities
This advanced course explores the convergence of geometric learning, graph neural networks (GNNs), and time-variant data analysis, with a strong emphasis on anomaly detection in complex and dynamic systems. The course begins with the foundations of geometric machine learning, focusing on the representation of data in non-Euclidean domains such as graphs and manifolds. It then progresses to cutting-edge developments in graph neural networks, including architectures adapted to temporal and heterogeneous graph data. The course also introduces techniques for time-series modeling, both classical and modern (e.g., machine learning and deep learning approaches), and culminates in a comprehensive module on anomaly detection, covering both statistical and learning-based methods. Throughout the course, students will engage with real-world datasets and hands-on programming exercises, gaining both theoretical insights and practical expertise.
By the end of the course, students will be able to:

Knowledge and Understanding
• Grasp the mathematical underpinnings of geometric learning and its applications to manifolds, symmetry groups, and invariant representations.
• Understand the architecture and functioning of graph neural networks, including:
  • Graph Convolutional Networks (GCN)
  • Graph Attention Networks (GAT)
  • Graph Isomorphism Networks (GIN)
  • Temporal GNNs (T-GNN, EvolveGCN, TGAT)
• Comprehend the principles of time-series modeling, including:
  • Classical models: ARIMA, VAR, Kalman filters
  • Machine learning approaches: Random Forests, Gradient Boosting
  • Deep learning methods: RNNs, LSTMs, TCNs
• Master the theory and methods of anomaly detection, including:
  • Statistical tests and control charts
  • Distance- and density-based approaches
  • Autoencoders and hybrid deep learning models
  • Methods specific to temporal and graph data

Practical and Computational Skills
• Implement GNNs and time-series models using PyTorch Geometric, DGL, scikit-learn, statsmodels, and TensorFlow/Keras (a minimal sketch follows this list).
• Conduct feature engineering and data preprocessing for time-variant and graph-structured data.
• Evaluate model performance using metrics such as ROC-AUC, F1-score, precision-recall curves, and time-aware diagnostics.
• Develop anomaly detection pipelines for real-time monitoring and post-hoc analysis.

Application and Critical Thinking
• Apply the learned techniques to domains such as:
  • Cybersecurity (e.g., intrusion detection)
  • Finance (e.g., fraud detection, volatility shifts)
  • Healthcare (e.g., patient monitoring)
  • Industrial systems (e.g., predictive maintenance)
• Critically evaluate and compare different modeling choices for real-world problems involving non-Euclidean, temporal, and dynamic data.
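To illustrate the kind of implementation skill targeted above, the following is a minimal sketch of a two-layer GCN for node classification in PyTorch Geometric. The choice of the Cora benchmark dataset, the local data path, and the hyperparameters are illustrative assumptions, not part of the official course material.

```python
# Minimal sketch (assumption: torch and torch_geometric are installed).
# Two-layer GCN for node classification on the Cora citation graph.
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

dataset = Planetoid(root="data/Cora", name="Cora")  # hypothetical local path
data = dataset[0]                                    # single graph: x, edge_index, y, masks

class GCN(torch.nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, hidden)
        self.conv2 = GCNConv(hidden, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))   # message passing + non-linearity
        x = F.dropout(x, p=0.5, training=self.training)
        return self.conv2(x, edge_index)        # class logits per node

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

model.eval()
pred = model(data.x, data.edge_index).argmax(dim=1)
acc = (pred[data.test_mask] == data.y[data.test_mask]).float().mean()
print(f"Test accuracy: {acc:.3f}")
```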
Course Prerequisites
• Linear Algebra, Probability and Statistics
• Programming in Python
• Fundamentals of Machine Learning
• Basic knowledge of graph theory and differential geometry (beneficial but not strictly required)
These prerequisites are typically covered in the first-year coursework of the Master's in Data Science and Engineering.
Module 1: Introduction to Geometric Learning
• Symmetries and group actions in data
• Invariant and equivariant representations
• Manifold learning: Isomap, LLE, Diffusion Maps
• Gauge theories and local geometries

Module 2: Graph Neural Networks (GNNs)
• Message-passing neural networks (MPNNs)
• GCN, GAT, GIN
• Heterogeneous and dynamic graphs
• Temporal GNNs: T-GCN, TGAT, EvolveGCN
• Applications: node classification, link prediction, graph classification

Module 3: Time-Variant Data and Time Series Modeling
• Nature and challenges of temporal data
• Classical time series analysis: stationarity, autocorrelation, seasonality
• Forecasting with ARIMA, VAR, Kalman filters
• Machine learning and deep learning for time series: RNN, LSTM, GRU, Transformer

Module 4: Anomaly Detection in Time and Graph Data
• Definitions and types of anomalies: point, contextual, collective
• Statistical methods: z-scores, PCA, control charts
• Machine learning-based methods: Isolation Forest, One-Class SVM (a minimal sketch follows this list)
• Deep learning-based methods: Autoencoders, LSTM-AE, GANs
• Graph-based anomaly detection: subgraph detection, spectral methods, GAD models
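As a flavour of the hands-on sessions for Modules 3 and 4, the sketch below detects point anomalies in a synthetic univariate time series with scikit-learn's Isolation Forest and scores the result with ROC-AUC. The synthetic signal, the window length, and the contamination rate are illustrative assumptions only.

```python
# Minimal sketch (assumption: numpy and scikit-learn are installed).
# Point-anomaly detection on a synthetic time series with Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic signal: smooth seasonality + noise, with a few injected spikes.
t = np.arange(1000)
series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
labels = np.zeros(t.size, dtype=int)
anomaly_idx = rng.choice(t.size, size=10, replace=False)
series[anomaly_idx] += 3.0          # injected point anomalies
labels[anomaly_idx] = 1

# Feature engineering: sliding windows turn the series into tabular samples.
window = 20
X = np.lib.stride_tricks.sliding_window_view(series, window)
y = labels[window - 1:]             # label each window by its last point

# Unsupervised detector; score_samples returns higher values for "normal"
# windows, so we negate it to obtain an anomaly score.
model = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
model.fit(X)
anomaly_score = -model.score_samples(X)

print("ROC-AUC:", roc_auc_score(y, anomaly_score))
```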
• Class materials, datasets, and code notebooks will be made available on the course GitHub repository.
• Guest lectures and seminars by experts in AI and complex systems may be included.
• The course can be extended to include a mini-research project for PhD students.
• 60 hours of lectures
• 20 hours of hands-on sessions, including coding labs, case studies, and project work
• Continuous assessment through assignments and a final project or oral exam
Geometric Learning: Bronstein et al., Geometric Deep Learning (2024)
Graph Neural Networks: Hamilton, Graph Representation Learning (2020)
Time Series Analysis: Hyndman & Athanasopoulos, Forecasting: Principles and Practice (open access)
Anomaly Detection: Aggarwal, Outlier Analysis (2017)
Visual/No-code Tools: Berthold et al., KNIME: The Konstanz Information Miner
Lecture slides; lecture notes; textbook; exercises; exercises with solutions; lab exercises; lab exercises with solutions; multimedia materials.
Exam: Individual essay;
Mini-project report describing the workflow, rationale, and findings (0-16 points). Oral discussion of the conclusions, with questions on the theoretical aspects of the methodologies applied in the project (0-16 points). Weight: 50% mini-project / 50% oral exam. Cum laude is awarded to students with a total score higher than 30 points.
In addition to the message sent by the online system, students with disabilities or Specific Learning Disorders (SLD) are invited to directly inform the professor in charge of the course about the special arrangements for the exam that have been agreed with the Special Needs Unit. The professor has to be informed at least one week before the beginning of the examination session in order to provide students with the most suitable arrangements for each specific type of exam.