NAVER LABS Europe seminars are open to the public. This seminar is virtual and requires registration.
Date: 6th May 2021, 4:00 pm (GMT+01:00)
Advances in TF-Ranking: learning-to-rank in TensorFlow
Speaker: Michael Bendersky is a Senior Staff Software Engineer at Google Research, where he currently manages a research group focused on applying machine learning to content search and discovery. Michael holds a Ph.D. from the University of Massachusetts Amherst, and a B.Sc. and M.Sc. from the Technion, Israel Institute of Technology. He has co-authored over 40 conference and journal papers and has served on program and organizing committees for multiple academic conferences, including SIGIR, CIKM, WSDM, WWW, KDD and ICTIR. He is a co-author of the book “Information Retrieval with Verbose Queries” in the “Foundations and Trends in Information Retrieval” series, and co-organized the SIGIR 2015 tutorial on this topic. Michael also co-organized popular tutorials on neural learning-to-rank at SIGIR 2019 and ICTIR 2019.
Abstract: In this talk, I will introduce TF-Ranking, a popular open-source library for building learning-to-rank (LTR) models in TensorFlow that our team has developed. I will first provide an overview of the standard pointwise, pairwise and listwise approaches to LTR, and how these approaches are implemented in TF-Ranking. I will also demonstrate the state-of-the-art performance of TF-Ranking on a variety of both public and large-scale proprietary datasets. Finally, I will focus on two recently published advances in neural ranking that are (or will be) available in TF-Ranking: (a) neural GAMs for building compact and interpretable LTR models, and (b) differentiable diversification-aware losses for optimizing diversity metrics like alpha-NDCG.
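To make the distinction between the three LTR families mentioned in the abstract concrete, here is a minimal NumPy sketch of a representative loss from each family. This is purely illustrative and is not TF-Ranking's actual implementation; the function names and the specific loss variants (sigmoid cross-entropy, pairwise logistic, ListNet-style softmax) are chosen for clarity.

```python
import numpy as np

def pointwise_loss(scores, labels):
    # Pointwise: score each item independently; here, sigmoid
    # cross-entropy against a binary relevance label per item.
    p = 1.0 / (1.0 + np.exp(-scores))
    return float(np.mean(-(labels * np.log(p) + (1 - labels) * np.log(1 - p))))

def pairwise_loss(scores, labels):
    # Pairwise: penalize every pair where a more relevant item
    # is not scored above a less relevant one (logistic pair loss).
    losses = []
    for i in range(len(scores)):
        for j in range(len(scores)):
            if labels[i] > labels[j]:
                losses.append(np.log1p(np.exp(-(scores[i] - scores[j]))))
    return float(np.mean(losses)) if losses else 0.0

def listwise_softmax_loss(scores, labels):
    # Listwise: treat the whole ranked list at once; softmax
    # cross-entropy between scores and normalized relevance labels.
    exp_s = np.exp(scores - np.max(scores))
    log_softmax = np.log(exp_s / exp_s.sum())
    return float(-(labels * log_softmax).sum() / labels.sum())

# Example: one query with three candidate documents.
scores = np.array([3.0, 1.0, 0.0])   # model scores
labels = np.array([1.0, 0.0, 0.0])   # graded relevance
print(pointwise_loss(scores, labels),
      pairwise_loss(scores, labels),
      listwise_softmax_loss(scores, labels))
```

All three losses shrink as the scores order the list correctly; they differ in what they couple together (single items, item pairs, or the full list), which is exactly the design axis the talk's overview covers.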