NAVER LABS Europe seminars are open to the public. This seminar is virtual and requires registration.
Date: 14th March 2023, 10:00 am (CET)
Generative IR and diffusion for recommendation
Abstract: Generative Information Retrieval (a.k.a. Generative Neural Search, or ChatGPT with attribution and without hallucination) has experienced substantial growth across multiple research communities and has been highly visible in the popular press. Theoretical work, empirical studies, and actual user-facing products have been released that either retrieve documents via generation (Generative Document Retrieval) or directly generate answers to an input request (Grounded Answer Generation).
A subfield of Generative IR, Generative Recommendation, is still in its infancy. We propose RecFusion, which uses diffusion models to generate recommendations. We benchmark classical diffusion formulations (normal distributions for the forward and backward diffusion processes, U-Nets and the ELBO) against formulations fitted to the RecSys setting: 1D diffusion (user by user), binomial diffusion, and a multinomial loss (as in MultVAE). We also experiment with diffusion guidance to condition the generation of recommendation strips on movie genre (a.k.a. controllable recommendation).
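To give a rough sense of what binomial diffusion over user interactions looks like, here is a minimal, hypothetical sketch of the forward (corruption) process applied user by user to a binary interaction vector. This is not the speaker's implementation; the noise schedule, item count, and function name are illustrative assumptions.

```python
# Illustrative sketch only: forward pass of a binomial (Bernoulli) diffusion
# process on one user's binary item-interaction vector, in the spirit of the
# 1D, user-by-user formulation described in the abstract.
import torch

def binomial_forward(x0: torch.Tensor, betas: torch.Tensor) -> list[torch.Tensor]:
    """Gradually corrupt a binary interaction vector x0 (shape: [n_items]).

    At each step t, q(x_t | x_{t-1}) = Bernoulli(x_{t-1} * (1 - beta_t) + beta_t / 2),
    so interactions drift toward fair coin flips as t grows.
    """
    xs = [x0.float()]
    x = x0.float()
    for beta in betas:
        probs = x * (1.0 - beta) + beta / 2.0  # mix current state with uniform noise
        x = torch.bernoulli(probs)             # sample the next, noisier state
        xs.append(x)
    return xs

# Toy example: 10 items, 5 diffusion steps with a linear noise schedule (made up).
x0 = torch.tensor([1, 0, 0, 1, 1, 0, 0, 0, 1, 0])
betas = torch.linspace(0.1, 0.5, steps=5)
noisy_states = binomial_forward(x0, betas)
```

The learned reverse process would then have to recover the user's true interaction profile from this near-random end state, which is what a recommendation model trained this way is asked to do.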
About the speaker: Gabriel Bénédict: As a PhD student at the University of Amsterdam and RTL, I do a mix of theoretical and applied AI research. The main themes are metrics-as-losses for neural networks, normative diversity metrics for news recommendation, intent-satisfaction modelling, and video-to-music AI (ProsAIc).