Build Lobe-chat and Connect Ollama Models: Free Software #AItech

CA Amit Singh


Lobe-chat: an open-source, modern-design LLM/AI chat framework. It supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Bedrock / Azure / Mistral / Perplexity), multi-modal features (vision/TTS), and a plugin system, with one-click free deployment of your private ChatGPT-style chat application.

Ollama: a large language model runner.
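As a concrete example of what Ollama does (assuming Ollama is already installed; the `llama3` model name is used purely for illustration), pulling and running a model locally looks like this:

```shell
# Download a model into the local Ollama library (model name is illustrative).
ollama pull llama3

# Start an interactive chat session with the model.
ollama run llama3

# List the models available locally; these are the models Lobe-chat will see.
ollama list
```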

How to build Lobe-chat from source

*Please note that you need to have Node.js installed on your system before proceeding further.*

Step 01: Clone the GitHub repository with the command below

git clone
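The repository URL is missing from the command above; assuming the official LobeHub repository on GitHub is intended, the full command would be:

```shell
# Clone the Lobe-chat source code (repository URL assumed to be the official LobeHub one).
git clone https://github.com/lobehub/lobe-chat.git
```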

Step 02: Enter the lobe-chat directory

cd lobe-chat

Step 03: Install pnpm with the command below

curl -fsSL | sh -
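The script URL is missing from the command above as well; assuming pnpm's standard install script is intended, the full command would be:

```shell
# Download and run pnpm's install script (URL assumed from pnpm's installation docs).
curl -fsSL https://get.pnpm.io/install.sh | sh -
```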

Step 04: Install lobe-chat's dependencies with the command below

pnpm install

Step 05: Run lobe-chat with the command below; once it is up, you can view lobe-chat at http://localhost:3010

pnpm dev

Step 06: Click on Settings

Step 07: Click on App Settings

Step 08: Click on Language Model

Step 09: Click on the already-added Ollama models and then click Get List

Step 10: Click Check to test connectivity

Step 11: Add a new model from the updated list

Step 12: Click to select the model for conversation

Step 13: Write your question

Step 14: Get your answer
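If the connectivity check in Step 10 fails, you can verify that the Ollama server itself is reachable. Ollama listens on port 11434 by default, and its `/api/tags` endpoint returns the locally available models as JSON:

```shell
# Query the local Ollama server for its model list (default port 11434).
curl http://localhost:11434/api/tags
```

If this returns a JSON list of models, the problem is in the Lobe-chat configuration rather than in Ollama itself.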

Here is a YouTube video for visual reference (it covers only adding Ollama models in the Lobe-chat GUI; building Lobe-chat is not included).


