Quentin Ferdinand, PhD

Mitigating catastrophic forgetting via feature transfer and knowledge consolidation for deep class-incremental learning

Context

  • This thesis is part of the Machine Learning and Control project.

  • Funding was provided by Naval Group.

  • Supervisors: Quentin Oliveau, Gilles Le Chenadec, Panagiotis Papadakis, and Benoît Clément

Defense (link)

  • November 22nd, 2023 at ENSTA Bretagne

  • A video recording of the defense is available here.

  • The dissertation is available here (link) and the slides are here (link).

Board of Examiners

  • Céline Hudelot, Professor, CentraleSupélec, reviewer (report)

  • Sébastien Lefèvre, Professor, Université Bretagne Sud, reviewer (report)

  • Panagiotis Papadakis, Associate Professor (HDR), IMT Atlantique

  • Benoît Clément, Professor, ENSTA Bretagne, IRL CROSSING, Adelaide, thesis director

  • Gilles Le Chenadec, Lecturer-Researcher, ENSTA Bretagne, Lab-STICC, Brest

  • Quentin Oliveau, PhD Engineer, Naval Group

Abstract

Deep learning methods are designed for closed-set recognition, where a predefined set of known classes is assumed. However, real-world scenarios are better described by open-set recognition, where unknown or novel classes may be encountered during testing. Class-incremental learning addresses this problem by continuously improving models through the incorporation of new categories over time. Unfortunately, deep neural networks trained in this manner suffer from catastrophic forgetting, resulting in significant performance degradation on previously encountered classes. While various methods alleviate this issue by rectifying the classification layer of deep incremental models, an equally important aspect lies in the degradation of feature quality, which can impede downstream classification performance. This thesis investigates diverse approaches to enhance feature quality, facilitating adaptation to new classes while preserving discriminative features for past classes during incremental training. Specifically, two methods are proposed and rigorously evaluated on widely established benchmarks, attaining performance comparable or superior to the state of the art. The first investigates the use of contrastive methods during incremental learning to improve the features extracted by incremental models, while the second uses an expansion and compression scheme to greatly reduce the forgetting that occurs at the feature level.
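
As a rough illustration of the first approach, the sketch below combines a standard classification loss with knowledge distillation from the previous model and a supervised contrastive term on the extracted features. This is a minimal PyTorch-style sketch under assumed conventions (a model exposing encoder and classifier attributes, old classes occupying the first logits); the names and loss weights are illustrative, not the thesis implementation.

    # Minimal sketch (assumed names): joint contrastive + incremental training step.
    import torch
    import torch.nn.functional as F

    def supcon_loss(z, labels, tau=0.1):
        # Supervised contrastive loss on L2-normalised embeddings:
        # pull samples with the same label together, push others apart.
        z = F.normalize(z, dim=1)
        sim = (z @ z.t()) / tau                              # pairwise similarities
        self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
        pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
        logits = sim.masked_fill(self_mask, float("-inf"))   # exclude self-pairs
        log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
        per_row = log_prob.masked_fill(~pos_mask, 0.0).sum(1) / pos_mask.sum(1).clamp(min=1)
        return -per_row.mean()

    def incremental_step(model, old_model, x, y, lambda_kd=1.0, lambda_con=0.5, T=2.0):
        # One optimisation step mixing classification, distillation and contrastive terms.
        feats = model.encoder(x)                             # assumed module layout
        logits = model.classifier(feats)
        loss = F.cross_entropy(logits, y)                    # learn current classes
        if old_model is not None:                            # distil past knowledge
            with torch.no_grad():
                old_logits = old_model.classifier(old_model.encoder(x))
            n_old = old_logits.size(1)                       # old classes come first (assumption)
            loss = loss + lambda_kd * T * T * F.kl_div(
                F.log_softmax(logits[:, :n_old] / T, dim=1),
                F.softmax(old_logits / T, dim=1),
                reduction="batchmean")
        loss = loss + lambda_con * supcon_loss(feats, y)     # keep features discriminative
        return loss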
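
The expansion-and-compression idea can likewise be sketched in two phases: a frozen copy of the previous feature extractor is paired with a freshly trained branch, and the expanded model is then distilled back into a single compact backbone. Every name here (make_encoder, head, student) is a hypothetical placeholder following the general expansion/compression pattern, not the exact method of the thesis.

    # Hypothetical two-phase expansion/compression sketch.
    import copy
    import torch
    import torch.nn.functional as F

    def expand(old_model, make_encoder):
        # Phase 1: freeze the old encoder and add a fresh, trainable branch.
        frozen = copy.deepcopy(old_model.encoder).eval()
        for p in frozen.parameters():
            p.requires_grad_(False)
        return frozen, make_encoder()

    def expanded_features(frozen, new_enc, x):
        # Concatenate stable (old) and plastic (new) features.
        with torch.no_grad():
            f_old = frozen(x)
        return torch.cat([f_old, new_enc(x)], dim=1)

    def compress_step(student, frozen, new_enc, head, x, T=2.0):
        # Phase 2: distil the expanded model into a single compact backbone,
        # so model size stays constant across incremental tasks.
        with torch.no_grad():
            teacher_logits = head(expanded_features(frozen, new_enc, x))
        student_logits = student(x)
        return T * T * F.kl_div(F.log_softmax(student_logits / T, dim=1),
                                F.softmax(teacher_logits / T, dim=1),
                                reduction="batchmean")

In practice the new branch and head would first be trained on the incremental task before compress_step is minimised, so that only the compact student is carried over to the next task.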

Keywords: incremental learning, catastrophic forgetting, knowledge distillation, convolutional neural network, supervised learning

Publications

  • Quentin Ferdinand, Benoît Clément, Quentin Oliveau, Gilles Le Chenadec, and Panagiotis Papadakis (2022). “Attenuating catastrophic forgetting by joint contrastive and incremental learning”. In the 3rd Workshop at the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2022).
