This paper introduces XProvence, a multilingual zero-cost context pruning model for Retrieval-Augmented Generation (RAG), trained on 16 languages and supporting 100+ languages through effective cross-lingual transfer. Motivated by the growing use of RAG systems across diverse languages, we explore several strategies to generalize the Provence framework beyond English; Provence was the first to integrate efficient zero-cost context pruning directly into the re-ranking model. Across four multilingual Question Answering benchmarks, we show that XProvence prunes RAG contexts with minimal-to-no performance degradation and outperforms strong baselines.

