
Customize LLMs quickly with RAG and your own data. #AIprogramming

Add RAG to any LLM with your Custom Data within 10 minutes | by Mohit Dulani | Jun, 2024

In this article, the author shares a template for integrating RAG into an LLM to work within a small context window. The process involves importing the required libraries, obtaining the LLM (via Groq's free API and the mixtral-8x7b model), setting up a vector store for custom datasets (using FAISS, an open-source vector database), and learning how to supply context properly. The author uses the `unstructured` Python library to ingest data, which can also be loaded from a plain .txt file. A function is then created that returns an output from the LLM, taking into account the context retrieved from the vector store.
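The retrieve-then-generate flow described above can be sketched roughly as follows. This is a minimal, self-contained stand-in, not the author's code: `embed` is a toy bag-of-words embedder and `TinyVectorStore` mimics only the nearest-neighbour search that FAISS provides, while `answer` builds the prompt that would be sent to the Groq-hosted mixtral-8x7b model.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: lower-cased word counts (bag of words).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class TinyVectorStore:
    # Stand-in for FAISS: stores (embedding, text) pairs and
    # returns the k most similar texts for a query.
    def __init__(self):
        self.docs = []

    def add(self, text: str):
        self.docs.append((embed(text), text))

    def search(self, query: str, k: int = 2):
        q = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]

def answer(query: str, store: TinyVectorStore) -> str:
    # Retrieve context, then build the prompt the LLM would receive.
    context = "\n".join(store.search(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

store = TinyVectorStore()
store.add("FAISS is an open source library for vector similarity search.")
store.add("Groq offers a free API for hosted LLMs such as mixtral-8x7b.")
prompt = answer("What is FAISS used for?", store)
```

In the real pipeline the returned prompt string goes to the Groq API; here it is simply returned so the retrieval step can be inspected.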

At query time, a prompt is passed to the retriever, which fetches the most relevant chunks from the vector store. The context size is then calculated from the retrieved chunks plus the query itself. The author explains the context-window limits of LLM models and suggests using a model with at least a 7k-token context window for better results.
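The context-budget calculation can be sketched like this: keep adding retrieved chunks to the prompt until the approximate token count of chunks plus query would exceed the model's window. The 4-characters-per-token ratio and the `reserve` parameter are rough assumptions for illustration, not the model's real tokenizer or the article's exact numbers.

```python
def approx_tokens(text: str) -> int:
    # Crude estimate: roughly 4 characters per token.
    return max(1, len(text) // 4)

def fit_context(query: str, chunks: list[str], window: int = 7000,
                reserve: int = 512) -> list[str]:
    # `reserve` leaves room for the model's answer and prompt scaffolding.
    budget = window - reserve - approx_tokens(query)
    kept, used = [], 0
    for chunk in chunks:            # chunks arrive ranked by relevance
        t = approx_tokens(chunk)
        if used + t > budget:
            break                   # next chunk would overflow the window
        kept.append(chunk)
        used += t
    return kept

chunks = ["a" * 8000, "b" * 8000, "c" * 8000, "d" * 8000]  # ~2000 tokens each
kept = fit_context("How does RAG work?", chunks)
print(len(kept))  # → 3: only the chunks that fit in the 7k window survive
```

Truncating by relevance rank means the least useful context is dropped first, which is why the retriever's ordering matters as much as the budget itself.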

The author encourages readers to visit the GitHub link for updated code and to report any issues, and invites feedback and contributions to the project. Overall, the article provides a detailed guide to integrating RAG into an LLM with a custom dataset, along with insights into optimizing the context window for better performance.

Source link: https://medium.com/@mohitdulani/add-rag-to-any-llm-with-your-custom-data-within-10-minutes-ce4772fc8642?source=rss——ai-5

