Gemini chat: Basic LLM settings explained by Maciej Strzelczyk. #Gemini

Gemini has entered the chat: Understanding basic LLM settings | by Maciej Strzelczyk | Google Cloud - Community | Jul, 2024

The article walks through configuring a GenerativeModel in Google Cloud's Vertex AI for a Discord bot. It explains the parameters available for controlling the model's behavior, such as model_name, generation_config, safety_settings, tools, tool_config, and system_instruction. Within generation_config, it covers settings like temperature, top_p, top_k, candidate_count, max_output_tokens, stop_sequences, presence_penalty, frequency_penalty, and response_mime_type. It also touches on safety features and tool configuration, and highlights the importance of plain-language system instructions for guiding the model's responses, with examples of how to use them effectively. It concludes by noting that not all parameters are available for every model and directs readers to the official documentation for details.
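The parameters listed above can be sketched as a plain generation_config. This is a minimal illustration assuming the Vertex AI Python SDK (`google-cloud-aiplatform`); the numeric values are illustrative placeholders, not recommendations from the article.

```python
# Illustrative generation settings mirroring the fields the article lists.
# Values here are placeholders, not tuned recommendations.
generation_config = {
    "temperature": 0.7,         # randomness: 0 = near-deterministic, higher = more varied
    "top_p": 0.95,              # nucleus sampling: sample from tokens covering this probability mass
    "top_k": 40,                # restrict sampling to the k most likely tokens
    "candidate_count": 1,       # number of response candidates to return
    "max_output_tokens": 1024,  # hard cap on response length
    "stop_sequences": ["\n\n"], # strings that end generation early
    "presence_penalty": 0.0,    # penalize tokens that have already appeared at all
    "frequency_penalty": 0.0,   # penalize tokens in proportion to how often they appear
    "response_mime_type": "text/plain",  # e.g. "application/json" for structured output
}

# With the SDK installed and credentials configured, a dict like this can be
# passed to the model constructor (model name and instruction are examples):
#
#   from vertexai.generative_models import GenerativeModel
#   model = GenerativeModel(
#       "gemini-1.5-flash",
#       generation_config=generation_config,
#       system_instruction="You are a helpful Discord bot.",
#   )
#   response = model.generate_content("Hello!")
```

The SDK call itself is left commented out since it requires Google Cloud credentials; the dict shows the shape of the settings the article explains.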

Source link: https://medium.com/google-cloud/gemini-has-entered-the-chat-understanding-basic-llm-settings-ae97b5f24cbb?source=rss——llm-5
