Gemini chat: Basic LLM settings explained by Maciej Strzelczyk. #Gemini

The article walks through configuring a GenerativeModel in Google Cloud's Vertex AI Studio for a Discord bot. It covers the constructor parameters that control the model's behavior: model_name, generation_config, safety_settings, tools, tool_config, and system_instruction. Within generation_config, it explains settings such as temperature, top_p, top_k, candidate_count, max_output_tokens, stop_sequences, presence_penalty, frequency_penalty, and response_mime_type. It also touches on safety filters and tool configuration, and highlights how plain-language system instructions can guide the model's responses, with examples of using them effectively. It concludes by noting that not every parameter is available for every model and points readers to the official documentation for details.
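To make the parameter list concrete, here is a minimal sketch of what such a generation config might look like. The values and the commented-out SDK wiring are illustrative assumptions, not taken from the article; parameter availability varies by model, so the official Vertex AI documentation remains the authority.

```python
# Sketch of a generation_config for a Vertex AI GenerativeModel,
# based on the parameters the article lists. Values are examples only.
generation_config = {
    "temperature": 0.7,        # randomness: 0 = near-deterministic, higher = more varied
    "top_p": 0.95,             # nucleus sampling: cumulative-probability cutoff
    "top_k": 40,               # sample only from the k most likely tokens
    "candidate_count": 1,      # number of responses generated per request
    "max_output_tokens": 1024, # hard cap on response length
    "stop_sequences": ["\n\nUser:"],  # strings that end generation early
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "frequency_penalty": 0.0,  # penalize tokens by how often they appeared
    "response_mime_type": "text/plain",  # e.g. "application/json" for JSON output
}

# Hypothetical wiring with the vertexai SDK (not executed here;
# requires google-cloud-aiplatform and vertexai.init() with a project):
# from vertexai.generative_models import GenerativeModel
# model = GenerativeModel(
#     model_name="gemini-1.5-pro",          # assumed model name
#     generation_config=generation_config,
#     system_instruction="You are a helpful Discord bot.",
# )
```

In practice the dict keys map one-to-one onto the fields of the SDK's GenerationConfig, so the same settings can be passed either as a plain dict or as a typed config object.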

Source link: https://medium.com/google-cloud/gemini-has-entered-the-chat-understanding-basic-llm-settings-ae97b5f24cbb?source=rss——llm-5
