Continual adaptation of visual representations via domain randomization and meta-learning
News
Project summary
The de-facto standard paradigm in machine learning is to train a model once, on a large set of training samples. Yet, the knowledge encoded in a model often needs to be updated over its lifespan. It is therefore desirable to be able to update the model when new data become available, without forgetting previously learned tasks. This problem is at the heart of continual learning research. In this project, we are interested in the specific setting where samples from new domains become available at different times over the model’s lifespan, and we wish to learn these new domains while retaining good performance on the old ones. We define this problem as continual domain adaptation, and propose different learning scenarios where the model is exposed to samples from different domains at different stages – for example, first learning to classify objects in photos, and later learning to classify objects portrayed as sketches.
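The training and evaluation protocol described above can be sketched as follows. This is a hedged, illustrative sketch, not the paper's actual code: the function and variable names (continual_domain_adaptation, train_step, evaluate) are hypothetical, and the toy "model" below is a running mean used only to make the forgetting effect visible.

```python
def continual_domain_adaptation(model, domains, train_step, evaluate):
    """Train a single model on each domain in turn; after each training
    phase, record performance on every domain seen so far."""
    history = []
    for t, domain in enumerate(domains):
        for batch in domain:                      # one training phase per domain
            train_step(model, batch)
        # measure retention on all domains encountered up to this point
        history.append([evaluate(model, d) for d in domains[: t + 1]])
    return history

# Toy instantiation: the "model" tracks a running mean of its inputs,
# and "accuracy" on a domain is the inverse distance to that domain's mean.
model = {"mean": 0.0, "n": 0}

def train_step(m, batch):
    for x in batch:
        m["n"] += 1
        m["mean"] += (x - m["mean"]) / m["n"]

def evaluate(m, domain):
    target = sum(sum(b) for b in domain) / sum(len(b) for b in domain)
    return 1.0 / (1.0 + abs(m["mean"] - target))

# Two toy "domains" with very different statistics: after the second
# phase, performance on the first domain drops (catastrophic forgetting).
history = continual_domain_adaptation(
    model, [[[0.0, 0.0]], [[10.0, 10.0]]], train_step, evaluate)
```

In this toy run, the model scores perfectly on the first domain right after learning it, and its score on that same domain degrades once it has adapted to the second domain – the phenomenon the project aims to mitigate.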
Focusing on computer vision tasks, we propose different solutions that rely on image transformations. First, we assess the effectiveness of domain randomization for learning models that are more robust against catastrophic forgetting when exposed to domain shifts from one training phase to the next. Next, we propose a more sophisticated solution that relies on meta-learning to train models that are inherently more robust against catastrophic forgetting. Here, image transformations are used to generate meta-domains, on which the proposed meta-learning algorithms operate.
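The idea of generating meta-domains by sampling compositions of image transformations can be sketched as below. This is a minimal illustration under assumptions: the transformation set, the pixel-list image representation, and the names (sample_meta_domain, randomize_batch) are placeholders, not the transformations or code actually used in the paper.

```python
import random

# Placeholder transformation set; an image is a list of RGB pixels,
# each pixel a list of three floats in [0, 1].
TRANSFORMS = {
    "grayscale": lambda img: [[sum(px) / 3.0] * 3 for px in img],
    "invert":    lambda img: [[1.0 - c for c in px] for px in img],
    "identity":  lambda img: img,
}

def sample_meta_domain(k=2, rng=random):
    """A meta-domain is defined by a random composition of k
    transformations; returns the sampled names and the composed map."""
    names = [rng.choice(sorted(TRANSFORMS)) for _ in range(k)]
    def apply(img):
        for name in names:
            img = TRANSFORMS[name](img)
        return img
    return names, apply

def randomize_batch(batch, rng=random):
    """Domain randomization: each image is pushed through an
    independently sampled meta-domain, simulating domain shift
    at training time."""
    return [sample_meta_domain(rng=rng)[1](img) for img in batch]
```

A domain-randomized training loop would call randomize_batch on each batch before the usual update step, while a meta-learning variant would instead sample whole meta-domains to build inner and outer tasks.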
BibTeX
@InProceedings{volpi2021cvpr,
  author    = {Volpi, Riccardo and Larlus, Diane and Rogez, Gregory},
  title     = {Continual Adaptation of Visual Representations via Domain Randomization and Meta-learning},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2021}
}
This work is part of the Lifelong representation learning chair of the MIAI research institute.