NAVER LABS Europe seminars are open to the public. This seminar is virtual and requires registration.
Date: 17th June 2021, 10:00 am (GMT +02:00)
Going from task to class-incremental learning
About the speaker: Joost van de Weijer is a senior scientist at the Computer Vision Center in Barcelona and leader of the LAMP team. He received his Ph.D. degree in 2005 from the University of Amsterdam. From 2005 to 2007, he was a Marie Curie Intra-European Fellow in the LEAR Team, INRIA Rhone-Alpes, France. From 2008 to 2012, he was a Ramon y Cajal Fellow at the Universidad Autonoma de Barcelona. His main research is on color imaging, active learning, continual learning and domain adaptation.
Abstract: Incremental learning aims to learn from a temporally correlated stream of data. In class-incremental learning, where the data arrives in tasks (each task containing data from a different set of classes), a learner is required to learn new tasks while preventing the forgetting of previous tasks. At inference time the learner should be able to classify data into all the classes it has previously seen. A naïve approach that adapts to each new task would suffer from catastrophic forgetting and would not be able to classify data into the classes of previous tasks. In this talk, I will first focus on the more difficult case of exemplar-free incremental learning, where the learner is not allowed to store any data from previous tasks. I will show that incremental learning of metric spaces is more robust against forgetting than the commonly used cross-entropy loss. Next, I will discuss the use of feature replay to prevent the forgetting of previously learned classes. Finally, I will comment on the use of compressed exemplars for replay.
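To make the class-incremental setting concrete, the following toy sketch (an illustrative assumption, not the speaker's actual method) uses a nearest-class-mean classifier in a fixed feature space: each task contributes prototypes for its new classes, and at inference time every class seen so far competes. Because old prototypes are never overwritten, this metric-space learner does not forget old classes; the data, class layout, and all function names here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each class is a Gaussian blob in a 2-D feature space.
# Tasks arrive sequentially, each introducing new classes (class-incremental).
def make_task(class_ids, n=50):
    X, y = [], []
    for c in class_ids:
        X.append(rng.normal(loc=c * 3.0, scale=0.5, size=(n, 2)))
        y.append(np.full(n, c))
    return np.vstack(X), np.concatenate(y)

# Nearest-class-mean "model": one prototype (mean embedding) per class.
prototypes = {}

def learn_task(X, y):
    # Add prototypes for the new task's classes; old prototypes stay intact.
    for c in np.unique(y):
        prototypes[c] = X[y == c].mean(axis=0)

def predict(X):
    # Classify over ALL classes seen so far (class-incremental inference).
    ids = np.array(sorted(prototypes))
    P = np.stack([prototypes[c] for c in ids])
    dist = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=-1)
    return ids[dist.argmin(axis=1)]

# Two tasks, two new classes each.
for classes in [(0, 1), (2, 3)]:
    learn_task(*make_task(classes))

# After learning task 2, the first task's classes are still recognized.
X_old, y_old = make_task((0, 1))
acc_old = (predict(X_old) == y_old).mean()
```

A softmax head trained with cross-entropy and naïvely fine-tuned on task 2 would, by contrast, reassign its outputs to the new classes and collapse on the old ones, which is the catastrophic forgetting the talk addresses.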