Deep models deployed on edge devices frequently encounter distribution shifts in sensed data. This issue primarily arises from sensor drift or varying environmental conditions and context, and leads to a discrepancy between the real and the training data distributions. Model re-training using backpropagation is expensive: besides derivative computation, the memory overhead scales linearly with model depth and batch size. Machine learning frameworks for resource-constrained IoT devices therefore optimize model inference and usually do not support on-device model updates. Unfortunately, updating even a small fraction of model parameters still requires a fully-featured backward pass. In this project, we aim to support on-device model adaptation by leveraging bio-inspired learning theory. Interested? Please contact us for more details!
Student Target Groups:
- Students in ICE
- Students in Computer Science
- Students in Software Engineering
Thesis Type:
- Master Project / Master Thesis
Goal and Tasks:
The goal of this project is to extend the TensorFlow Lite Micro framework by adding support for runtime fine-tuning of models in response to distribution shifts, utilizing the principles of Hebbian learning theory. The project includes the following tasks:
- In-depth understanding of standard model retraining using backpropagation;
- Familiarize yourself with TFL Micro and how it can be extended to support model updates using backpropagation-free methods;
- Implement the training algorithm and analyze its performance;
- Summarize the results in a written report.
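To give a flavor of the backpropagation-free direction the project explores, the following is a minimal sketch of a classic Hebbian weight update for a single linear layer. The function name `hebbian_update`, the toy dimensions, and the learning rate are illustrative assumptions, not part of TensorFlow Lite Micro or a prescribed project design.

```python
import numpy as np

def hebbian_update(W, x, eta=0.01):
    """Hypothetical helper: apply Hebb's rule dW = eta * y * x^T.

    Only a forward pass is needed to compute the activation y,
    so no gradients or backward pass are required.
    """
    y = W @ x                          # layer activation (forward pass only)
    return W + eta * np.outer(y, x)    # strengthen co-active input/output pairs

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8)) * 0.1      # toy layer: 8 inputs -> 4 outputs
x = rng.normal(size=8)                 # one sensed input sample

W_new = hebbian_update(W, x)
print(W_new.shape)                     # shape is unchanged by the update
```

Because the update depends only on locally available activations, it avoids the depth- and batch-size-dependent memory overhead of backpropagation, which is what makes this family of methods attractive on constrained devices.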
Recommended Prior Knowledge:
- Eager to learn about deep neural networks and on-device adaptation on IoT devices;
- Programming skills in Python;
- Prior experience with machine learning frameworks (e.g., TensorFlow, PyTorch).
Start:
Contact: