NATURAL LANGUAGE PROCESSING
Language technology to seamlessly communicate in an increasingly connected world: machine translation, natural language generation, natural language understanding, language modelling, document understanding, multilingual NLP, spoken language processing.
Highlights
2023
- We have 3 papers and a demo (DisCO) at ACL 2023
- Online SIGSLT lecture on recent work in speech translation (May 23)
- Co-organising the 2023 ALPS winter school in NLP (January 16-20)
2022
- Interview with Laurent Besacier on the national radio station France Culture: 'What's all the fuss about ChatGPT?' (8-minute replay, in French!)
- EMNLP 2022 – 3 papers (long, short, Findings) & a WMT workshop paper
- NeurIPS 2022 – paper on fine-tuning language models with no catastrophic forgetting
- ACL 2022 – 2 long papers, 1 Findings paper & 3 workshop papers. Silver sponsor.
- Paper at ICLR 2022
- Co-organising the 2022 ALPS winter school in NLP (January)
- Interview with Laurent Besacier on the radio station France Culture about NLP and translation in January (replay in French!)
- Laurent Besacier was Special Area Chair for the Language Diversity theme track at ACL 2022
2021
- 2 papers on NMT and a demo (semantics) at EMNLP, a TTS paper at CoNLL and 3 papers at WMT
- Member of BigScience, a 1-year project on large language models
- 2 papers at ACL 2021
- Oral paper at ICLR 2021 on ‘A distributional approach to controlled text generation’ (on OpenReview)
- Marc Dymetman and Hady Elsahar co-organising the Energy Based Models workshop at ICLR 2021
- Paper at EACL 2021, ‘Self-supervised and controlled multi-document opinion summarization’ (arXiv)
- Hady Elsahar co-organising the 2nd workshop on African NLP at EACL 2021
- ALPS winter school in NLP (January 2021). All keynotes are online!
- We continue teaching at CentraleSupélec.
Overview
Language is the most natural and most dominant mode of communication and, arguably, one of the main visible signals of higher intelligence. At the same time, language is messy, ambiguous, multimodal and ever-changing, so deciphering it requires a good amount of common-sense, contextual and cultural understanding. To fulfil our vision of seamlessly communicating with intelligent devices, existing technology and the methods used to solve natural language processing problems need to be considerably improved.
That is precisely what we do:
- As a European lab of a Korean company, we’re distinctly aware of how real the language barrier can be. We improve the current state of the art in multilingual applications and machine translation, trying to find optimal trade-offs between efficiency and performance.
- In addition, while natural language generation (NLG) models have recently progressed to the point where they can produce highly fluent text, they can be deficient in other important ways (producing toxic or socially biased content, for instance), so we augment them with explicit controls (see the sketch after this list).
- As far as natural language understanding (NLU) is concerned, we address the challenge of capturing meaning beyond memorising surface patterns and co-occurrences. Our work on this topic applies to document understanding, fine-grained information extraction and spoken language understanding.
- Method-wise, we’re particularly interested in how to combine the power and flexibility of deep neural networks with the rich prior knowledge present in decades of linguistic studies and knowledge of the task at hand. We also investigate how models can continuously and adaptively learn in order to incrementally acquire increasingly complex skills and knowledge.
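As a concrete (and deliberately toy) illustration of what "explicit controls" can mean at decoding time, here is a minimal, self-contained sketch: a base language model proposes candidate outputs, and a constraint score reranks them. Everything here is an illustrative assumption, not our actual method: the bigram model, the `BLOCKED` word list standing in for a learned toxicity classifier, and the penalty weight are all made up for the example.

```python
import math
import random

# Toy bigram "language model": P(next | prev). Purely illustrative.
BIGRAMS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"movie": 0.5, "plot": 0.3, "garbage": 0.2},
    "a": {"movie": 0.7, "mess": 0.3},
    "movie": {"was": 1.0},
    "plot": {"was": 1.0},
    "garbage": {"was": 1.0},
    "mess": {"was": 1.0},
    "was": {"great": 0.5, "awful": 0.3, "fine": 0.2},
    "great": {"</s>": 1.0},
    "awful": {"</s>": 1.0},
    "fine": {"</s>": 1.0},
}

# Stand-in for a learned attribute classifier (e.g. a toxicity detector).
BLOCKED = {"garbage", "awful"}

def sample_sequence(rng):
    """Sample one sequence and its log-probability from the base model."""
    tokens, logp, prev = [], 0.0, "<s>"
    while prev != "</s>":
        dist = BIGRAMS[prev]
        words, probs = zip(*dist.items())
        prev = rng.choices(words, weights=probs, k=1)[0]
        logp += math.log(dist[prev])
        if prev != "</s>":
            tokens.append(prev)
    return tokens, logp

def constraint_penalty(tokens, weight=10.0):
    """Explicit control: penalise sequences that contain blocked words."""
    return -weight * sum(tok in BLOCKED for tok in tokens)

def controlled_generate(n_candidates=50, seed=0):
    """Sample candidates from the base LM, rerank by log P(x) + penalty."""
    rng = random.Random(seed)
    candidates = [sample_sequence(rng) for _ in range(n_candidates)]
    return max(candidates, key=lambda c: c[1] + constraint_penalty(c[0]))

if __name__ == "__main__":
    tokens, logp = controlled_generate()
    print(" ".join(tokens), f"(base log-prob {logp:.2f})")
```

Reranking sampled outputs is only the simplest possible control mechanism; the distributional approach from our ICLR 2021 paper above instead constrains the model's output distribution itself, trading off constraint satisfaction against staying close to the original model.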