Gemini.Chat (GeminiEx v0.8.2)
Formalized chat session management with immutable history updates.
This module provides a robust, immutable approach to managing multi-turn conversations with the Gemini API, including proper handling of tool-calling turns with function calls and responses.
Usage
# Create a new chat session
chat = Gemini.Chat.new(model: "gemini-flash-lite-latest", temperature: 0.7)
# Add turns to the conversation
chat = chat
|> Gemini.Chat.add_turn("user", "What's the weather like?")
|> Gemini.Chat.add_turn("model", [%Altar.ADM.FunctionCall{...}])
|> Gemini.Chat.add_turn("user", [%Altar.ADM.ToolResult{...}])
|> Gemini.Chat.add_turn("model", "Based on the weather data...")
# Generate content with the chat history
{:ok, response} = Gemini.generate_content(chat.history, chat.opts)
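These calls compose into a complete request/response loop. A minimal sketch using only the functions documented on this page (the prompts are illustrative):
# Start a session and record the first user turn
chat = Gemini.Chat.new(model: "gemini-2.5-pro")
chat = Gemini.Chat.add_turn(chat, "user", "Summarize Elixir in one sentence.")
# Send the accumulated history to the API
{:ok, response} = Gemini.generate_content(chat.history, chat.opts)
# Record the model's reply (and any thought signatures) before the next user turn
chat = Gemini.Chat.add_model_response(chat, response)
chat = Gemini.Chat.add_turn(chat, "user", "Now in one word.")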
Summary
Functions
add_model_response/2 - Add a model response to the chat history, extracting any thought signatures.
add_turn/3 - Add a turn to the chat history.
new/1 - Create a new chat session with optional configuration.
Types
@type t() :: %Gemini.Chat{
  history: [Gemini.Types.Content.t()],
  last_signatures: [String.t()],
  opts: keyword()
}
A chat session containing conversation history and configuration options.
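The fields can be read directly or via pattern matching; a small illustrative sketch:
# Destructure a chat session (fields as declared in t/0 above)
%Gemini.Chat{history: history, last_signatures: signatures, opts: opts} = chat
length(history)            # Content structs recorded so far
signatures                 # thought signatures captured from the last model response
Keyword.get(opts, :model)  # options passed to Gemini.Chat.new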
Functions
@spec add_model_response(t(), Gemini.Types.Response.GenerateContentResponse.t()) :: t()
Add a model response to the chat history, extracting any thought signatures.
This function automatically extracts thought signatures from the response and stores them for echoing in the next user message.
Parameters
chat: Current chat session
response: GenerateContentResponse from the API
Returns
Updated chat with the model's response added and signatures stored.
Examples
chat = Gemini.Chat.new(model: "gemini-3-pro-preview")
{:ok, response} = Gemini.generate("Hello", model: "gemini-3-pro-preview")
chat = Gemini.Chat.add_model_response(chat, response)
# chat.last_signatures contains any signatures from the response
@spec add_turn(
  t(),
  String.t(),
  String.t() | [map()] | [Altar.ADM.FunctionCall.t()] | [Altar.ADM.ToolResult.t()]
) :: t()
Add a turn to the chat history.
This function handles different types of content based on the role and message type:
- User text messages: add_turn(chat, "user", "Hello")
- Model text responses: add_turn(chat, "model", "Hi there!")
- Model function calls: add_turn(chat, "model", [%FunctionCall{...}])
- User function responses: add_turn(chat, "user", [%ToolResult{...}])
For user messages, if there are stored thought signatures from the previous model response, they will be automatically attached to the user's message part.
Returns a new chat struct with the updated history, preserving immutability.
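A small sketch of both behaviors (the prompt text is illustrative):
next = Gemini.Chat.add_turn(chat, "user", "Please continue.")
# next.history has one more entry than chat.history; chat itself is unchanged
# If chat.last_signatures held signatures from the previous model response,
# they are attached automatically to this user message part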
Create a new chat session with optional configuration.
Options
All standard Gemini API options are supported:
- :model - Model name (defaults to configured default)
- :temperature - Generation temperature (0.0-1.0)
- :max_output_tokens - Maximum tokens to generate
- :generation_config - Full GenerationConfig struct
- :safety_settings - List of SafetySetting structs
- :system_instruction - System instruction content
- And more...
Examples
chat = Gemini.Chat.new()
chat = Gemini.Chat.new(model: "gemini-2.5-pro", temperature: 0.3)