
Computational linear algebra for large scale problems

01TWYSM

A.A. 2021/22

Course Language

English

Course degree

Master of science-level of the Bologna process in Data Science And Engineering - Torino

Course structure
Teaching Hours
Lectures 50
Classroom practice 15
Laboratory practice 15
Teachers
Teacher | Status | SSD | Lecture hours | Exercise hours | Lab hours | Tutoring hours | Years teaching
Teaching assistant

Context
SSD | CFU | Activities | Area context
MAT/03 | 2 | C - Related or supplementary | Related or supplementary educational activities
MAT/08 | 6 | C - Related or supplementary | Related or supplementary educational activities
2019/20
This course aims at presenting the mathematical and numerical foundations of several methods applied in Data Science. The analysis of large-scale data sets requires specific algebraic tools in order to extract the most relevant information from the data. This problem is tackled by applying several mathematical tools; this course is designed to present them and to explain the pros and cons of their application to realistic data sets.
Acquire a deep knowledge of the following topics:
• Basic linear algebra tools.
• Approximation of data and functions.
• Dense and sparse matrices, matrix operations.
• Eigenvalue and eigenvector computations: numerical methods and common tools for large-scale matrices. Stability and conditioning.
• Vector rotations, orthogonalization, projections.
• Iterative solution of large-scale linear systems.
• Computation and theoretical properties of the Singular Value Decomposition (SVD); the Lanczos method.
• Randomized SVD.
• Non-negative matrix factorization.
• Generalized inverse matrices and the Moore–Penrose inverse.
• Dimensionality reduction and Principal Component Analysis (PCA).
• Common C/C++ and Python numerical libraries.
Ability to:
- Apply the above topics to practical problems.
- Choose and use common software for numerical linear algebra in Data Science, correctly analyzing the feasibility of a given approach on a given computer architecture.
- Analyze the conditioning of a problem, the stability of a numerical approach, the reliability of numerical solutions, and scalability issues on modern computer architectures.
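As a minimal sketch of two of the topics above (SVD and the Moore–Penrose inverse) using NumPy, one of the Python numerical libraries the course refers to; the data matrix here is a made-up example, not course material:

```python
import numpy as np

# Hypothetical small, full-column-rank data matrix for illustration.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Thin SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Moore-Penrose pseudoinverse assembled from the SVD factors:
# A^+ = V @ diag(1/s) @ U^T (valid here since all singular values are nonzero)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

# It matches NumPy's built-in pseudoinverse.
assert np.allclose(A_pinv, np.linalg.pinv(A))
```

Building the pseudoinverse from the SVD makes its theoretical definition concrete, and the singular values `s` also expose the conditioning of the problem.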
Basic knowledge of linear algebra and calculus is a prerequisite, as well as basic coding ability and computer literacy.
1. Basic linear algebra tools: vector spaces, bases, linear operators, matrices, eigenvalues and eigenvectors, norms.
2. Approximation of data and functions: global and piecewise interpolation, least-squares approximation, numerical tools.
3. Dense and sparse matrices, matrix operations on several computer architectures, and performance analysis on CPUs and GPUs.
4. Eigenvalue and eigenvector computations: numerical methods and common tools for large-scale matrices. Stability and conditioning.
5. Vector rotations, orthogonalization, projections: Gram-Schmidt, Givens, and Householder methods and QR factorization.
6. Iterative solution of large-scale linear systems: applicability, convergence, computational cost and memory requirements, preconditioning.
7. Computation and theoretical properties of the Singular Value Decomposition (SVD).
8. Generalized inverse matrices and the Moore–Penrose inverse.
9. Dimensionality reduction and Principal Component Analysis (PCA).
10. Common C/C++ and Python numerical libraries.
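Point 5 of the program can be illustrated with a short NumPy sketch of classical Gram-Schmidt QR factorization (the function name and test matrix are illustrative, not from the course; in practice Householder reflections are preferred for numerical stability, which is exactly the kind of trade-off the program covers):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR factorization of a full-column-rank matrix.
    Numerically fragile for ill-conditioned A; shown only as a teaching sketch."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            # Project the j-th column onto the already-built orthonormal basis.
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # random well-conditioned test matrix
Q, R = gram_schmidt_qr(A)

assert np.allclose(Q @ R, A)            # factorization reproduces A
assert np.allclose(Q.T @ Q, np.eye(3))  # columns of Q are orthonormal
```

Comparing this with `np.linalg.qr` on nearly rank-deficient matrices shows the loss of orthogonality that motivates the Givens and Householder approaches in the program.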
Theoretical lectures and practice classes. Theoretical lectures are devoted to the presentation of the topics, with definitions, properties, and introductory examples. The practice classes are devoted to training the students’ ability to solve problems and exercises and to perform computations and simulations with common tools.
Slides presented during lessons will be made available through the Portale della Didattica. Other material will be suggested in class and, if possible, made available through the Portale della Didattica.
Suggested textbooks:
Linear Algebra and Learning from Data, G. Strang, Cambridge University Press, 2019, ISBN: 9780692196380.
Iterative Methods for Sparse Linear Systems, Y. Saad, Society for Industrial and Applied Mathematics, Philadelphia, PA, USA, 2003, ISBN: 0898715342.
Exam: Compulsory oral exam;
Exam: oral exam with discussion of the homework assigned during the course. Three homework assignments (HW1, HW2, HW3) will be assigned to the students during the course. HW1 and HW2 consist of exercises aimed at evaluating the students’ use of the methods presented. HW3 is an application of the methods learned to a problem chosen by the student. The oral test then consists of two parts: a) a discussion of the submitted HW1, HW2, and HW3 reports, aimed at testing the depth of the students’ understanding of the subjects and their ability to explain, defend, reflect on, critically evaluate, and possibly improve their work, proving the real acquisition of the abilities listed in the expected learning outcomes section; b) a presentation of a topic studied in the course, covering both theoretical aspects and possibly their implementation and applications, proving the real acquisition of the knowledge listed in the expected learning outcomes section. Grading: the maximum grade for HW1, HW2, and HW3, upon the discussion detailed in point (a) above, is 14 points. The maximum grade for part (b) of the oral test is 18 points. The final course grade is obtained by summing the grades of parts (a) and (b) of the oral test.


© Politecnico di Torino
Corso Duca degli Abruzzi, 24 - 10129 Torino, ITALY