A Systematic Review of the State of RAG, Are Multimodal Embeddings Truly Beneficial for Recommendation? and More!
Vol. 117 for Aug 11 - Aug 17, 2025
Stay Ahead of the Curve with the Latest Advancements and Discoveries in Information Retrieval.
This week’s newsletter highlights the following research:
Learning When and How to Parallelize Information Retrieval in LLM Reasoning, from NVIDIA
A Systematic Study of Whether Multimodal Embeddings Are Truly Beneficial for Recommendation, from Ye et al.
A Unified Training and Inference Paradigm for Efficient Recommendation Systems, from Meta
Improving Dense Retrieval Consistency Across Semantically Equivalent Queries, from Amazon
A Data-Driven Approach to Cold-Warm Transition Points in Recommender Systems, from AIRI
A Quantile-Based Approach for Optimizing Top-K Ranking Metrics in Recommender Systems, from Yang et al.
Eliminating Pre-built Graphs through Adaptive Logic-Guided Retrieval for RAG, from PolyU
Profile-Aware LLM-as-a-Judge for Podcast Recommendations, from Spotify
Efficient Transformer Architecture for Scalable Generative Recommendation Models, from Ye et al.
A Systematic Literature Review of Retrieval-Augmented Generation, from The Queen's University of Belfast