Attenuating Catastrophic Forgetting by Joint Contrastive and Incremental Learning

Quentin Ferdinand, Benoit Clement, Quentin Oliveau, Gilles Le Chenadec, Panagiotis Papadakis

Research output: Contribution to journal › Conference article › peer-review

Abstract


In class-incremental learning, discriminative models are trained to classify images while incrementally adapting to new instances and classes. Training a model to accommodate new classes without full access to previous class data, however, leads to the well-known problem of catastrophic forgetting of the previously learnt classes. To alleviate this problem, we show how to build upon recent progress in contrastive learning. In particular, we develop an incremental learning approach for deep neural networks that operates at both the classification and representation levels, which mitigates forgetting and learns more general features for classification. Experiments performed on several datasets demonstrate the superiority of the proposed method over well-known state-of-the-art methods.
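To make the described setup concrete, below is a minimal PyTorch sketch of a joint objective of this kind: a cross-entropy term at the classification level, a supervised contrastive term at the representation level, and a feature-distillation term against the frozen model from the previous task. The network `Net`, the helpers `sup_con_loss` and `joint_loss`, and the weights `lam_con`/`lam_dist` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def sup_con_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    `features` has shape (batch, dim); samples sharing a label are
    pulled together, all other pairs pushed apart.
    """
    features = F.normalize(features, dim=1)
    sim = features @ features.t() / temperature            # (B, B) similarities
    b = features.size(0)
    not_self = ~torch.eye(b, dtype=torch.bool, device=features.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & not_self
    sim = sim.masked_fill(~not_self, float("-inf"))        # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(1)
    valid = pos_counts > 0                                 # anchors with positives
    loss = -log_prob.masked_fill(~pos_mask, 0.0).sum(1)[valid] / pos_counts[valid]
    return loss.mean()


class Net(nn.Module):
    """Toy encoder + linear head returning (features, logits)."""

    def __init__(self, dim=128, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(),
                                     nn.Linear(3 * 32 * 32, dim), nn.ReLU())
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        feats = self.encoder(x)
        return feats, self.head(feats)


def joint_loss(model, old_model, images, labels, lam_con=1.0, lam_dist=1.0):
    """Cross-entropy (classification level) + supervised contrastive
    (representation level) + feature distillation against the frozen
    previous-task model."""
    feats, logits = model(images)
    loss = F.cross_entropy(logits, labels)
    loss = loss + lam_con * sup_con_loss(feats, labels)
    if old_model is not None:
        with torch.no_grad():
            old_feats, _ = old_model(images)
        loss = loss + lam_dist * F.mse_loss(F.normalize(feats, dim=1),
                                            F.normalize(old_feats, dim=1))
    return loss


# Usage: old_model would be a frozen copy saved at the end of the previous task.
model, old_model = Net(), None
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 3, (8,))
joint_loss(model, old_model, x, y).backward()
```

Penalizing drift of the normalized features relative to a frozen copy of the previous model is one common way to attenuate forgetting at the representation level; the paper's actual loss and training schedule may differ.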
Original language: English
Pages (from-to): 3781-3788
Number of pages: 8
Journal: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops
Publication status: Published - Jun 2022
Externally published: Yes

Keywords

  • Learning systems
  • Training
  • Deep learning
  • Adaptation models
  • Computer vision
  • Conferences
  • Computational modeling
