The article discusses configuring a GenerativeModel in Google Cloud’s Vertex AI Studio for a Discord bot. It explains the constructor parameters available for controlling the model’s behavior: model_name, generation_config, safety_settings, tools, tool_config, and system_instruction. It then delves into the individual generation settings — temperature, top_p, top_k, candidate_count, max_output_tokens, stop_sequences, presence_penalty, frequency_penalty, and response_mime_type — and touches on safety filters and tool configuration. The article also highlights the importance of plain-language system instructions for guiding the model’s responses, with examples of how to write them effectively. It concludes by noting that not every parameter is available for every model and directs readers to the official documentation for details.
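The settings listed above can be sketched as a configuration. The field names below follow the Vertex AI GenerationConfig parameters named in the article, but the specific values are illustrative assumptions, and the dict stands in for the SDK object so the sketch runs without the `vertexai` package installed.

```python
# Sketch of the generation settings discussed in the article, as a plain dict.
# With the real SDK, these fields map onto
# vertexai.generative_models.GenerationConfig; the values here are examples.
generation_config = {
    "temperature": 0.7,         # randomness: 0 = near-deterministic, higher = more varied
    "top_p": 0.95,              # nucleus sampling: keep tokens covering 95% of probability mass
    "top_k": 40,                # sample only from the 40 most likely tokens
    "candidate_count": 1,       # number of response candidates per request
    "max_output_tokens": 1024,  # hard cap on response length
    "stop_sequences": ["User:"],        # generation halts if this string appears (example value)
    "presence_penalty": 0.0,    # penalize tokens that have appeared at all
    "frequency_penalty": 0.0,   # penalize tokens by how often they have appeared
    "response_mime_type": "text/plain",  # or "application/json" for structured output
}

def validate_config(cfg: dict) -> dict:
    """Basic sanity checks on the sampling parameters."""
    assert 0.0 <= cfg["temperature"] <= 2.0, "temperature out of range"
    assert 0.0 < cfg["top_p"] <= 1.0, "top_p must be in (0, 1]"
    assert cfg["top_k"] >= 1, "top_k must be at least 1"
    assert cfg["max_output_tokens"] >= 1, "need room for at least one token"
    return cfg

config = validate_config(generation_config)
```

With the SDK installed, such a config would be passed alongside a model name and a system_instruction string when constructing the model, as the article describes.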
Source link: https://medium.com/google-cloud/gemini-has-entered-the-chat-understanding-basic-llm-settings-ae97b5f24cbb?source=rss——llm-5
Gemini chat: Basic LLM settings explained by Maciej Strzelczyk. #Gemini
![Gemini has entered the chat: Understanding basic LLM settings | by Maciej Strzelczyk | Google Cloud - Community | Jul, 2024](https://i0.wp.com/webappia.com/wp-content/uploads/2024/07/1aVfQzVT6mM7q8XU-SqP7AQ.png?fit=758%2C226&quality=80&ssl=1)