Zae Myung Kim, Laurent Besacier, Vassilina Nikoulina, Didier Schwab
Findings of the Annual Meeting of the Association for Computational Linguistics (ACL) 2021, virtual event, 1–6 August 2021
Abstract
Recent studies on the analysis of multilingual representations focus on identifying whether there is an emergence of language-independent representations, or whether a multilingual model partitions its weights among different languages. While most such work has been conducted in a “black-box” manner, this paper aims to analyze individual components of a multilingual neural machine translation (NMT) model. In particular, we look at the encoder self-attention and encoder-decoder attention heads (in a many-to-one NMT model) that are more specific to the translation of a certain language pair than others by (1) employing metrics that quantify some aspects of the attention weights, such as “variance” or “confidence”, and (2) systematically ranking the importance of attention heads with respect to translation quality. Experimental results show that, surprisingly, the set of the most important attention heads is very similar across the language pairs, and that nearly one-third of the less important heads can be removed without greatly hurting translation quality.
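The abstract does not spell out how the “confidence” and “variance” metrics are computed. As a rough illustration only, the sketch below (in PyTorch, with assumed definitions: “confidence” as the mean maximum attention weight per query token, and “variance” as the mean per-row variance of the attention distribution) shows how such per-head statistics could be extracted from an attention tensor; the tensor shape and metric definitions are assumptions, not the authors' exact formulation.

```python
import torch

def head_confidence(attn: torch.Tensor) -> torch.Tensor:
    # attn: [batch, heads, query_len, key_len]; each row is a softmax distribution.
    # Assumed definition of "confidence": the maximum weight each query token places
    # on any key, averaged over tokens and sentences -> one score per head.
    return attn.max(dim=-1).values.mean(dim=(0, 2))

def head_variance(attn: torch.Tensor) -> torch.Tensor:
    # Assumed definition of "variance": variance of each query token's attention
    # distribution, averaged over tokens and sentences -> one score per head.
    return attn.var(dim=-1).mean(dim=(0, 2))

# Toy usage: 32 sentences, 8 heads, length-20 query and key sides.
attn = torch.softmax(torch.randn(32, 8, 20, 20), dim=-1)
print(head_confidence(attn))  # tensor of 8 per-head confidence scores
print(head_variance(attn))    # tensor of 8 per-head variance scores
```

Heads whose scores mark them as least important under such metrics would then be candidates for pruning, in the spirit of the roughly one-third of heads the paper reports can be removed.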