
How to implement Generative Retrieval

GenAI meets recommender systems

Improving RecSys with GenAI

We're excited about the potential of Large Language Models (LLMs) in recommender systems, given their high accuracy in multiple domains. Building on this potential, we'll explore how to harness LLMs for recommendation tasks.

The Challenge of Using LLMs in RecSys

One key challenge is the item vocabulary: with billions of recommendable items, treating each item as its own token makes it hard to apply LLMs directly. If only we could break items into fewer tokens, the way LLMs break a long word like "happiness" into subword pieces. Once we have a vocabulary of meaningful tokens that reliably describes interaction probabilities, we can leverage the LLM machinery for prediction.
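For intuition, here is a back-of-the-envelope sketch of why composing item IDs from a small shared token vocabulary tames the problem. The catalog size and codebook shape below are hypothetical illustrations, not numbers from the paper.

```python
# Back-of-the-envelope comparison (all numbers hypothetical):
# a flat item vocabulary needs one token per item, while a composed
# "semantic ID" covers the same catalog with a tiny shared vocabulary.

catalog_size = 2_000_000_000            # e.g. billions of recommendable items
flat_vocab = catalog_size               # one token/embedding row per item: intractable for an LLM

levels, codes_per_level = 4, 256        # hypothetical semantic-ID setup
semantic_vocab = levels * codes_per_level      # tokens the LLM actually learns: 1,024
addressable_items = codes_per_level ** levels  # IDs expressible with a 4-token code: ~4.3B

print(f"flat vocab: {flat_vocab:,} tokens")
print(f"semantic vocab: {semantic_vocab:,} tokens, addressing {addressable_items:,} item IDs")
```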

Proposed Solution: Generative Retrieval

The paper Recommender Systems with Generative Retrieval generates "semantic" embeddings using an RQ-VAE (a residual-quantized variant of the vector-quantized variational autoencoder), enabling LLMs to learn meaningful item representations. Because these embeddings place similar items close together, quantizing them yields semantic IDs that capture nuanced relationships between items.
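As a rough illustration of the idea, the sketch below assigns a semantic ID to an item embedding via residual quantization against a stack of codebooks. The codebooks here are random placeholders; in the paper they are learned jointly inside the RQ-VAE encoder/decoder.

```python
# Minimal residual-quantization sketch for semantic IDs (illustrative only;
# real RQ-VAE codebooks are trained, not random).
import numpy as np

rng = np.random.default_rng(0)
embed_dim, levels, codes_per_level = 32, 4, 256

# One codebook per quantization level; in an RQ-VAE these are learned parameters.
codebooks = rng.normal(size=(levels, codes_per_level, embed_dim))

def semantic_id(item_embedding: np.ndarray) -> tuple[int, ...]:
    """Quantize level by level: each level encodes the residual left over
    by the previous one. The tuple of code indices is the item's semantic ID."""
    residual = item_embedding
    codes = []
    for level in range(levels):
        # pick the nearest code in this level's codebook
        dists = np.linalg.norm(codebooks[level] - residual, axis=1)
        idx = int(np.argmin(dists))
        codes.append(idx)
        residual = residual - codebooks[level][idx]
    return tuple(codes)

item_embedding = rng.normal(size=embed_dim)   # e.g. a content embedding of one item
print(semantic_id(item_embedding))            # e.g. (17, 203, 5, 84)
```

Items with similar embeddings tend to share leading codes, which is what lets an LLM decode recommendations token by token over this small vocabulary.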

Showing You How to Implement It

To make this approach more accessible, Samson Komo has prepared for you:

A video tutorial (above) walking through the paper, the code, and the Colab notebook

GitHub repo: https://github.com/komosam/Generative-Retrieval

Street Cred of the Approach

Generative retrieval has already shown impressive results, accounting for over a 40% share of retrieval in some state-of-the-art video and ad recommender systems. By implementing this method, you can unlock more accurate and diverse recommendations.

Conclusion

By implementing generative retrieval, you can tap into the power of LLMs for recommendation tasks. Explore our resources to get started and discover how this approach can enhance your recommender systems.

Disclaimer: These are the personal opinions of the author(s). Any assumptions or opinions stated here are theirs and not representative of their current or any prior employer(s). Apart from publicly available information, nothing here is claimed to refer to any company, including ones the author(s) may have worked at or been associated with.
