Area Engineering
Transformers for Embedded Machine Learning
Reference persons MARIO ROBERTO CASU
Research Groups VLSILAB (VLSI theory, design and applications)
Description This MSc thesis aims to explore the application of Transformer models in embedded machine learning environments. The motivation behind this research stems from the growing prevalence of embedded systems in everyday devices, such as smartphones and IoT sensors, and the need for more efficient and powerful machine learning models to operate within these resource-constrained environments.
Transformers, which have shown remarkable success in natural language processing and other domains, offer a promising alternative to traditional machine learning models used in embedded systems. This thesis will evaluate the feasibility and performance of Transformer models on embedded hardware, focusing on aspects such as speed, accuracy, and resource utilization. Additionally, the research will involve developing and testing optimization techniques to adapt Transformers for embedded applications, ensuring they can operate effectively within the limited computational power and memory available.
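One family of optimization techniques the thesis could investigate is post-training quantization, which stores model weights in low-precision integers to cut memory and bandwidth on embedded targets. The sketch below is illustrative only (not from the proposal): it applies symmetric per-tensor int8 quantization to a hypothetical 64x64 attention projection matrix and measures the memory saving and reconstruction error.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w ~= scale * q."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 tensor from int8 codes."""
    return q.astype(np.float32) * scale

# Hypothetical weight matrix standing in for one attention projection.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=(64, 64)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

mem_fp32 = w.nbytes  # 4 bytes per weight
mem_int8 = q.nbytes  # 1 byte per weight: 4x smaller
max_err = np.abs(w - w_hat).max()  # bounded by ~scale/2 per weight
```

Quantization is only one axis of the design space; pruning, knowledge distillation, and operator fusion trade off accuracy, latency, and memory in different ways, and the thesis would evaluate such trade-offs on real embedded platforms.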
The study will also identify and evaluate potential use cases for Transformers in embedded systems, such as real-time language translation, anomaly detection, and image recognition. By conducting a comprehensive literature review and designing experiments to test the models on various embedded hardware platforms, this research aims to provide valuable insights into the practical deployment of Transformers in embedded machine learning applications.
Ultimately, the findings of this thesis could pave the way for more advanced and efficient machine learning applications in embedded systems, contributing to the broader field of embedded machine learning and enhancing the capabilities of everyday devices.
Deadline 26/07/2025
SUBMIT YOUR APPLICATION