
Selecting the Best Embedding for RAG in Generative AI #EmbeddingSelection

Choosing the Right Embedding for RAG in Generative AI | by Shivika K Bisen | Bright ML | Jul, 2024

The article discusses the role of embeddings in Retrieval Augmented Generation (RAG) and compares the main types with worked examples. Static embeddings assign each word a fixed vector regardless of context, whereas contextual embeddings produce different vectors for the same word depending on its surroundings. Examples show how each kind of embedding handles queries against a corpus of logs when retrieving relevant information, along with the limitations of each approach. The comparison covers BERT, RoBERTa, SBERT, and MPNet embeddings, highlighting their bidirectional and focused-context capabilities, and then turns to generative embeddings such as GPT, which draw on broader context when producing responses, together with the limitations of generative models. The article concludes with a summary comparing the embeddings on MTEB retrieval score and latency, and provides resources for further reading.
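As a rough illustration of the retrieval step described above, the following is a minimal sketch of ranking log entries against a query with SBERT-style sentence embeddings. It assumes the sentence-transformers package and the all-mpnet-base-v2 checkpoint; the query and log strings are hypothetical and only for demonstration, not taken from the article.

```python
# Minimal sketch: retrieve the most relevant log entry for a query using
# contextual sentence embeddings (assumes the sentence-transformers package
# and the all-mpnet-base-v2 model; query and logs are illustrative).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-mpnet-base-v2")

logs = [
    "Payment failed due to expired credit card",
    "User reset their password successfully",
    "Checkout page timed out under heavy load",
]
query = "Why did my card payment not go through?"

# Encode the log corpus and the query into dense vectors.
log_embeddings = model.encode(logs, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank logs by cosine similarity and keep the best match as RAG context.
scores = util.cos_sim(query_embedding, log_embeddings)[0]
best = scores.argmax().item()
print(f"Retrieved context: {logs[best]} (score={scores[best].item():.3f})")
```

Cosine similarity over dense sentence vectors is the standard ranking signal used in MTEB-style retrieval evaluation; a static-embedding baseline would instead average fixed per-word vectors, which is where the contextual models discussed in the article tend to pull ahead.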


Source link: https://medium.com/bright-ml/choosing-the-right-embedding-for-rag-in-generative-ai-applications-8cf5b36472e1?source=rss——artificial_intelligence-5

