Mitigating Catastrophic Forgetting in Continual Learning: A Data-centric Approach

This thesis project addresses the challenge of catastrophic forgetting in continual learning, with a focus on developing strategies to mitigate its impact. The primary objective is to prepare a dataset suitable for continual learning and to implement and evaluate various continual learning strategies on it. The project offers a motivated student the opportunity to contribute to continual learning research while gaining hands-on experience in mitigating catastrophic forgetting.

By applying continual learning, we can cope with unbalanced, small, and low-quality datasets, data that is not independent and identically distributed, and data domains that change over time, without needing all classes to be available during initial model training. This makes the approach a strong fit for industrial applications.


Thesis Type:

  • Student projects/thesis

Goal and Tasks:

  • For the dataset preparation, the tasks include curating and preprocessing the dataset to ensure its suitability for continual learning, for example by splitting it into a sequence of incremental tasks (see the first sketch after this list), along with exploring strategies to maintain data relevance over time.
  • In the continual learning implementation phase, diverse strategies such as elastic weight consolidation (EWC), rehearsal, and synaptic consolidation will be employed to train the model on the prepared dataset (a minimal EWC sketch follows this list).
  • The analysis of catastrophic forgetting involves a thorough examination of the trained model's performance over time. This includes computing quantitative metrics such as average accuracy and average forgetting across tasks (see the metric sketch after this list) to identify instances and patterns of catastrophic forgetting, as well as investigating the contributing factors.
  • Mitigation strategies will be developed in the final phase, including the creation of an algorithm capable of continually classifying data, even in the presence of new classes or evolving data distributions. Proposed strategies to mitigate catastrophic forgetting will be implemented based on the results of the analysis.
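As an illustration of the dataset-preparation step, the following minimal sketch splits a labelled dataset into a sequence of class-incremental tasks. The labels array, the number of classes per task, and the function name split_into_tasks are illustrative assumptions, not part of the project specification.

    # Minimal sketch (assumption: labels are integer class IDs in a NumPy array;
    # the concrete dataset and class-to-task assignment are placeholders).
    import numpy as np

    def split_into_tasks(labels, classes_per_task):
        """Group sample indices into class-incremental tasks."""
        classes = np.unique(labels)
        tasks = []
        for start in range(0, len(classes), classes_per_task):
            task_classes = classes[start:start + classes_per_task]
            # Indices of all samples whose label belongs to this task's classes.
            idx = np.flatnonzero(np.isin(labels, task_classes))
            tasks.append({"classes": task_classes.tolist(), "indices": idx})
        return tasks

    # Example: 10 classes split into 5 sequential tasks of 2 classes each.
    labels = np.random.randint(0, 10, size=1000)
    tasks = split_into_tasks(labels, classes_per_task=2)
    print([t["classes"] for t in tasks])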
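For the implementation phase, the sketch below outlines elastic weight consolidation (EWC) in PyTorch: a diagonal (empirical) Fisher information estimate is computed after training on a task, and a quadratic penalty anchors the parameters that were important for it. The model, the data loader, and the regularization strength lam are placeholder assumptions and would need to be adapted to the prepared dataset.

    # Minimal EWC sketch (assumptions: `model` is an nn.Module classifier and
    # `loader` yields (inputs, targets) batches; hyperparameters are illustrative).
    import torch
    import torch.nn.functional as F

    def estimate_fisher(model, loader, device="cpu"):
        """Diagonal empirical Fisher information, approximated by squared gradients."""
        fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        model.eval()
        for inputs, targets in loader:
            model.zero_grad()
            loss = F.cross_entropy(model(inputs.to(device)), targets.to(device))
            loss.backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    fisher[n] += p.grad.detach() ** 2
        # Average over batches to keep the scale roughly independent of dataset size.
        return {n: f / max(len(loader), 1) for n, f in fisher.items()}

    def ewc_penalty(model, fisher, old_params, lam=100.0):
        """Quadratic penalty that anchors parameters important for previous tasks."""
        penalty = 0.0
        for n, p in model.named_parameters():
            penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
        return 0.5 * lam * penalty

    # After finishing a task, snapshot the parameters and the Fisher estimate:
    #   old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
    #   fisher = estimate_fisher(model, old_task_loader)
    # While training on the next task, add the penalty to the task loss:
    #   loss = task_loss + ewc_penalty(model, fisher, old_params)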
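For the analysis phase, catastrophic forgetting is commonly quantified from an accuracy matrix acc[i][j], the accuracy on task j measured after training on task i. The sketch below computes the usual average-accuracy and average-forgetting scores; the example matrix is purely illustrative.

    # Minimal sketch of standard continual learning metrics (assumption:
    # acc[i][j] is the accuracy on task j right after training on task i).
    import numpy as np

    def average_accuracy(acc):
        acc = np.asarray(acc, dtype=float)
        return acc[-1].mean()  # mean accuracy over all tasks after the final one

    def average_forgetting(acc):
        acc = np.asarray(acc, dtype=float)
        # For each earlier task: drop from its best past accuracy to its final accuracy.
        best_past = acc[:-1, :-1].max(axis=0)
        return (best_past - acc[-1, :-1]).mean()

    acc = [[0.95, 0.00, 0.00],
           [0.80, 0.93, 0.00],
           [0.70, 0.85, 0.94]]
    print(average_accuracy(acc), average_forgetting(acc))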

Recommended Prior Knowledge:

  • Solid background in, or strong interest in, machine learning.
  • Proficiency in programming languages such as Python.
  • Solid understanding of data preprocessing techniques and model evaluation.
  • Interest in exploring, implementing, and adapting state-of-the-art continual learning strategies.

Start:

  • a.s.a.p.

Duration:

  • 6-12 months

Contact: