PORTALE DELLA DIDATTICA


Geospatial Foundation Model at the edge

keywords EARTH OBSERVATION, EDGE COMPUTING, FOUNDATION MODELS

Reference persons ALESSIO BURRELLO, DANIELE JAHIER PAGLIARI

External reference persons Matteo Risso

Research Groups ELECTRONIC DESIGN AUTOMATION - EDA

Thesis type EXPERIMENTAL

Description Foundation Models (FoMos) are nowadays the go-to approach for developing AI-based systems. A FoMo is a deep learning model with broad capabilities, obtained through pre-training on a large and heterogeneous corpus of data; it can then be efficiently fine-tuned to a specific downstream task.
Recently, the FoMo paradigm has also been applied successfully to Geospatial AI for Earth observation, addressing tasks such as urban planning, disaster management, and environmental monitoring.
Nonetheless, FoMos are typically large and over-parametrized, which hinders their execution on edge devices (e.g., on board a satellite) and forces costly, slow communication with a cloud server where inference is performed.
The candidate will explore training-free optimization techniques, such as pruning and/or quantization, to compress the FoMo and enable its execution on edge devices.
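As a purely illustrative starting point, the sketch below shows what training-free optimization can look like in PyTorch. It is a minimal sketch under stated assumptions: a small placeholder network stands in for the actual Geospatial FoMo, and the 50% sparsity and int8 settings are arbitrary choices, not values prescribed by the thesis.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Placeholder backbone: in practice, load the pre-trained (or fine-tuned)
# Geospatial FoMo from its checkpoint instead.
model = nn.Sequential(
    nn.Linear(768, 3072),
    nn.GELU(),
    nn.Linear(3072, 768),
)
model.eval()

# 1) Training-free magnitude pruning: zero out the 50% smallest weights
#    (L1 criterion) of every Linear layer, then make the masks permanent.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")

# 2) Post-training dynamic quantization: int8 weights for Linear layers,
#    shrinking the model and speeding up CPU inference, with no retraining.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Sanity check: the optimized model still runs on a dummy input.
with torch.no_grad():
    print(quantized(torch.randn(1, 768)).shape)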
First, the candidate will study the literature to become familiar with both Geospatial FoMos and training-free optimization techniques.
Then, the candidate will study how to apply such optimizations to the considered FoMos, exploring both the direct optimization of the plain FoMo and the optimization of a FoMo already fine-tuned on downstream tasks (e.g., classification, semantic segmentation).
Optionally, the candidate will explore the possibility of performing hardware-aware optimization, i.e., exploiting the characteristics of the target hardware to derive better-specialized models.
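One simple ingredient of any hardware-aware study is measuring how the optimizations affect inference latency on (a proxy of) the target device. The loop below is only a sketch; using CPU wall-clock time as a stand-in for the actual edge hardware is an assumption made for illustration.

import time
import torch
import torch.nn as nn

def measure_latency_ms(model: nn.Module, example: torch.Tensor, iters: int = 50) -> float:
    # Average forward-pass latency in milliseconds on the current device.
    model.eval()
    with torch.no_grad():
        for _ in range(5):  # warm-up runs to stabilize caches and allocations
            model(example)
        start = time.perf_counter()
        for _ in range(iters):
            model(example)
    return (time.perf_counter() - start) / iters * 1e3

# Hypothetical usage, comparing the baseline and optimized models from the
# previous sketch:
# print(measure_latency_ms(model, torch.randn(1, 768)))
# print(measure_latency_ms(quantized, torch.randn(1, 768)))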

Useful Readings:
[1] https://arxiv.org/abs/2212.14532
[2] https://arxiv.org/abs/2302.04089

Required skills Proficiency in Python is required, together with familiarity with deep learning and PyTorch.


Deadline 31/10/2025