Energy-Based Models for Controlled Text Generation - Internship - Naver Labs Europe
5 November 2020
Meylan, Grenoble, France
Start date: January 2021
Duration: 5–6 months


Neural language generation models, with the advent of the Transformer architecture [1], have recently progressed to a point where they can produce highly fluent texts. However, they can still be deficient in other important respects: semantic consistency, faithfulness to the facts, and avoidance of toxic or socially biased content.


In order to avoid such problems, we propose to augment these generators with explicit controls that operate at the level of global text sequences, complementing the local training regime of autoregressive models. 


Energy-Based Models (EBMs) [2,3] provide a natural framework for exploiting such controls and have started to be effective in this respect [4]. However, certain challenges remain: (1) how to increase the ability of such models to efficiently generate texts that fully satisfy the desired controls; (2) how to quickly adapt to dynamic control specifications without incurring a costly full re-training of the models.
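To make the framing concrete, here is a minimal toy sketch (not the project's actual method) of how an EBM can impose a global control on a base language model: the energy combines the base model's log-probability with a sequence-level feature, and the resulting distribution is proportional to exp(−E(x)). The enumerable toy distribution, the `feature` predicate, and the strength `LAMBDA` are all hypothetical stand-ins; in practice the base model would be a pre-trained LM such as GPT-2, and sampling would require techniques like MCMC or importance sampling rather than explicit normalization.

```python
import math

# Toy stand-in for a base LM: probabilities over a tiny enumerable support.
base = {
    "the cat sat": 0.4,
    "the dog ran": 0.3,
    "a cat slept": 0.2,
    "a dog barked": 0.1,
}

# Sequence-level control feature: 1.0 if the global constraint is satisfied.
def feature(x):
    return 1.0 if "cat" in x else 0.0

LAMBDA = 3.0  # hypothetical control strength

# Energy: low when the base model likes the sequence AND the control holds.
def energy(x):
    return -math.log(base[x]) - LAMBDA * feature(x)

# EBM distribution p(x) ∝ exp(-E(x)); here the support is small enough
# to normalize explicitly, which is exactly what is intractable at scale.
weights = {x: math.exp(-energy(x)) for x in base}
Z = sum(weights.values())
ebm = {x: w / Z for x, w in weights.items()}
```

With this choice of λ, the EBM concentrates most of its probability mass on sequences satisfying the constraint while still preferring sequences the base model found more likely; the intractability of Z for real-scale models is precisely what motivates the sampling and adaptation questions above.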


We are looking for a motivated intern to help us develop techniques and algorithms addressing these challenges. Experiments will be conducted on selected text generation tasks, in particular in the context of open-text generation over pre-trained models such as GPT-2/3.


The successful candidate should be enrolled in a graduate program, at the Master or (preferably) PhD level. 

The intern will work with Hady Elsahar, Marc Dymetman and Germán Kruszewski.

Publication in major conferences/journals will be strongly encouraged.

Required skills

- Strong programming skills
- Relevant experience with training Deep Learning models for NLP
- Strong mathematical skills
- Ability to communicate research

Optional skills

- Knowledge of MCMC sampling techniques and/or Reinforcement Learning
- Publications at peer-reviewed AI conferences


[1] Vaswani et al. Attention is all you need, NIPS 2017

[2] LeCun et al. A tutorial on energy-based learning, 2006

[3] Bakhtin et al. Energy-based models for text, 2020

[4] Anonymous, A Distributional Approach to Controlled Text Generation, under submission to ICLR-2021

Application instructions

You can apply for this position online. Don't forget to upload your CV and cover letter before you submit. Incomplete applications will not be accepted.
Due to the changing travel restrictions related to COVID-19, it may not be possible to host candidates from certain regions. This will depend on the conditions at the specific starting date of the internship.


NAVER LABS Europe has full-time positions, PhD and PostDoc opportunities throughout the year which are advertised here and on international conference sites that we sponsor such as CVPR, ICCV, ICML, NeurIPS, EMNLP etc.

NAVER LABS Europe is an equal opportunity employer.

NAVER LABS Europe is located in Grenoble in the French Alps. We take a multidisciplinary approach to research, with scientists in machine learning, computer vision, artificial intelligence, natural language processing, ethnography and UX working together to create next-generation ambient intelligence technology and services that deeply understand users and their contexts.
