Joint Semantic and Distributional Word Representations with Multi-Graph Embeddings - Naver Labs Europe


Word embeddings continue to be of great use to NLP researchers and practitioners due to their training speed and their ease of use and distribution. Prior work has shown that these word representations can be improved with semantic knowledge-bases. In this paper we propose a novel way of combining such knowledge-bases with the lexical information carried by word co-occurrences. Our approach is conceptually simple: it maps all information into a multigraph and adapts existing node embedding techniques to compute word representations. Our experiments show improved results over vanilla word embeddings and over retrofitting techniques using the same information, on a variety of word-similarity data-sets.
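The core idea of combining both information sources in a multigraph can be illustrated with a small sketch. This is not the authors' implementation; it assumes toy co-occurrence and synonym edge lists (hypothetical data) and shows one common way node embedding methods consume such a graph: relation-labelled edges plus uniform random walks, which could then be fed to a skip-gram model.

```python
import random
from collections import defaultdict

# Hypothetical toy inputs: lexical co-occurrence pairs and
# semantic (knowledge-base) synonym pairs.
cooc_edges = [("cat", "dog"), ("dog", "bark"), ("cat", "purr")]
syn_edges = [("cat", "feline"), ("dog", "canine")]

# Multigraph as adjacency lists: node -> list of (neighbor, relation).
# Parallel edges with different relation labels are allowed.
graph = defaultdict(list)
for u, v in cooc_edges:
    graph[u].append((v, "cooccurrence"))
    graph[v].append((u, "cooccurrence"))
for u, v in syn_edges:
    graph[u].append((v, "synonym"))
    graph[v].append((u, "synonym"))

def random_walk(graph, start, length, rng):
    """Uniform random walk over the multigraph, ignoring edge labels."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = graph[walk[-1]]
        if not neighbors:
            break
        nxt, _relation = rng.choice(neighbors)
        walk.append(nxt)
    return walk

rng = random.Random(0)
# A small corpus of walks; a skip-gram model trained on these walks
# would yield node (word) embeddings mixing both edge types.
walks = [random_walk(graph, node, 5, rng) for node in list(graph) for _ in range(2)]
```

A relation-aware variant could bias the walk toward one edge type (e.g. sample "synonym" edges more often) to weight semantic against distributional evidence.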