Semantic Retrieval at Walmart, Text Embeddings in the Era of Large Language Models, and More!
Vol. 82 for Dec 09 – Dec 15, 2024
Stay Ahead of the Curve with the Latest Advancements and Discoveries in Information Retrieval.
This week’s newsletter highlights the following research:
An Efficient Two-Tower BERT Architecture for Large-Scale Product Search, from Walmart
A Unified Framework for Text Embeddings in the Age of Large Language Models, from Nie et al.
Arctic-Embed 2.0: Breaking the English-Multilingual Performance Trade-off in Text Embeddings, from Snowflake
JINA-CLIP-V2: A Three-Stage Framework for Cross-Modal and Text-Only Retrieval, from Jina AI
A Comprehensive Framework for Evaluating Document Parsing Systems, from Shanghai AI Laboratory
Amortized Inference for User History Modeling in Recommendation Systems, from LinkedIn
Leveraging Natural Language Preferences for Personalized Sequential Recommendations, from Meta
Exploring the Scaling Behavior of Transformer-based Sequential Recommendation Models, from Mercado Libre
Hierarchical Embedding Alignment Loss for Enhanced Retrieval and Representation Learning in RAG Systems, from Bhattarai et al.