
Seminar: Cleanlab 2.0: Making ML work with messy, real-world data

60th Annual Meeting of the Association for Computational Linguistics – ACL 2022
From distillation to hard negative sampling: making sparse neural IR models more effective
Exploring tokens in contextualized late interaction over BERT (ColBERT)
DiPCAN: Distilling Privileged Information for Crowd-Aware Navigation
On the road to online adaptation for semantic image segmentation

On multimodal speech-text pre-trained models
Multimodal pre-training has the potential to be a game changer in spoken language processing. In this blog post, we review three recent papers on the topic published by Meta, Microsoft (and academic partners), and Google.

Deep regression on manifolds: a 3D rotation case study
Theoretical and experimental findings to improve regression applications, illustrated with a 3D rotation case study. Code available.
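The post itself is not reproduced here; purely as a hedged illustration of the kind of ingredient such a rotation-regression study can involve (PyTorch assumed, names and details illustrative rather than taken from the post), the sketch below maps an unconstrained 6D network output onto a valid rotation matrix via Gram-Schmidt orthonormalization, a common continuous representation when regressing 3D rotations.

```python
import torch

def rotation_6d_to_matrix(x6d: torch.Tensor) -> torch.Tensor:
    """Map an unconstrained 6D network output to a rotation matrix in SO(3)
    by Gram-Schmidt orthonormalization of its two 3D halves."""
    a1, a2 = x6d[..., :3], x6d[..., 3:]
    b1 = torch.nn.functional.normalize(a1, dim=-1)
    # Remove the component of a2 along b1, then normalize to get the second row.
    b2 = torch.nn.functional.normalize(a2 - (b1 * a2).sum(-1, keepdim=True) * b1, dim=-1)
    b3 = torch.cross(b1, b2, dim=-1)  # third row completes a right-handed frame
    return torch.stack((b1, b2, b3), dim=-2)  # (..., 3, 3)

# Example: a regression head predicts 6 numbers per sample; this maps them to SO(3).
pred = rotation_6d_to_matrix(torch.randn(4, 6))
assert torch.allclose(pred @ pred.transpose(-1, -2),
                      torch.eye(3).expand(4, 3, 3), atol=1e-5)
```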

PoseBERT
A novel, plug-and-play model for 3D human shape estimation of the body or hands in videos, trained by mimicking the BERT algorithm from the natural language processing community.
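PoseBERT's architecture and training details are not spelled out above; purely as a hedged sketch of the BERT-style masked training idea it mentions (PyTorch assumed, all names and dimensions hypothetical, not the actual PoseBERT code), the snippet below masks selected frames of a per-frame pose sequence and trains a small transformer to reconstruct them from the surrounding frames.

```python
import torch
import torch.nn as nn

class MaskedPoseModel(nn.Module):
    """Hypothetical BERT-style model over a video's per-frame pose vectors:
    masked frames are replaced by a learned token and reconstructed from context."""
    def __init__(self, pose_dim: int = 72, d_model: int = 256, n_layers: int = 4):
        super().__init__()
        self.embed = nn.Linear(pose_dim, d_model)
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, pose_dim)

    def forward(self, poses: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # poses: (batch, frames, pose_dim); mask: (batch, frames) boolean
        x = self.embed(poses)
        x[mask] = self.mask_token            # hide the selected frames
        return self.head(self.encoder(x))    # reconstruct every frame

model = MaskedPoseModel()
poses = torch.randn(2, 16, 72)               # toy pose sequences (axis-angle-like)
mask = torch.zeros(2, 16, dtype=torch.bool)
mask[:, 3::4] = True                          # mask every fourth frame
loss = ((model(poses, mask) - poses)[mask] ** 2).mean()
loss.backward()
```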
GLOBAL AI R&D BELT
ACADEMIA – EU/GOVT – ENTREPRENEURS
Our partnerships range from long-term fundamental research to investment in products and services.