LangChain.ChatModels.ChatGoogleAI (LangChain v0.3.3)

Parses and validates inputs for making a request to the Google AI Chat API.

Converts the API response into more specialized LangChain data structures.

NOTE: The GoogleAI service is unique in how it reports TokenUsage information. So far, it's the only API that returns TokenUsage with each streamed delta, incrementing the generated token count as tokens arrive; other services return the total TokenUsage data once at the end. This chat model fires the callback each time usage data is received.
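
Because usage arrives per delta, a token-usage callback fires repeatedly during a streamed response. A minimal sketch of registering such a handler follows; the handler key and arity are assumptions based on LangChain's chain callback conventions and may differ by version:

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatGoogleAI

handler = %{
  # Fired each time the API reports usage data; for Google AI this
  # happens on every streamed delta rather than once at the end.
  on_llm_token_usage: fn _chain, usage ->
    IO.inspect(usage, label: "token usage")
  end
}

model = ChatGoogleAI.new!(%{model: "gemini-2.0-flash", stream: true})

chain =
  %{llm: model}
  |> LLMChain.new!()
  |> LLMChain.add_callback(handler)
```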

Google Search Integration

Starting with Gemini 2.0, this module supports Google Search as a native tool, allowing the model to automatically search the web for recent information to ground its responses and improve factuality. Check out the Google AI Documentation for more information.

Example Usage:

alias LangChain.ChatModels.ChatGoogleAI
alias LangChain.Chains.LLMChain
alias LangChain.Message
alias LangChain.NativeTool

model = ChatGoogleAI.new!(%{temperature: 0, stream: false, model: "gemini-2.0-flash"})

{:ok, updated_chain} =
   %{llm: model, verbose: false, stream: false}
   |> LLMChain.new!()
   |> LLMChain.add_message(
     Message.new_user!("What is the current Google stock price?")
   )
   |> LLMChain.add_tools(NativeTool.new!(%{name: "google_search", configuration: %{}}))
   |> LLMChain.run()

The above call will return the current Google stock price.

When google_search is used, the model will also return grounding information in the metadata attribute of the assistant message.
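
For example, after the run above, that grounding data can be inspected on the chain's last message. This is a sketch; the exact shape of the grounding metadata map is model- and version-dependent:

```elixir
# The assistant's reply is the chain's last message.
message = updated_chain.last_message

# Grounding information returned by google_search lands in the
# message's metadata map (shape may vary by model and version).
IO.inspect(message.metadata, label: "grounding metadata")
```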

Summary

Functions

call(google_ai, prompt, tools \\ [])
Calls the Google AI API, passing the configured ChatGoogleAI struct plus either a single message or a list of messages to act as the prompt.

get_message_contents(message)
Return the content parts for the message.

new(attrs)
Setup a ChatGoogleAI client configuration.

new!(attrs)
Setup a ChatGoogleAI client configuration and return it, or raise an error if invalid.

restore_from_map(data)
Restores the model from the config.

serialize_config(model)
Generate a config map that can later restore the model's configuration.

Types

@type t() :: %LangChain.ChatModels.ChatGoogleAI{
  api_key: term(),
  api_version: term(),
  callbacks: term(),
  endpoint: term(),
  json_response: term(),
  json_schema: term(),
  model: term(),
  receive_timeout: term(),
  safety_settings: term(),
  stream: term(),
  temperature: term(),
  top_k: term(),
  top_p: term()
}

Functions

call(google_ai, prompt, tools \\ [])

Calls the Google AI API, passing the configured ChatGoogleAI struct plus either a single message or a list of messages to act as the prompt.

Optionally pass in a list of tools available to the LLM for requesting execution in response.

Optionally pass in a callback function that can be executed as data is received from the API.

NOTE: This function can be used directly, but the primary interface should be through LangChain.Chains.LLMChain. The ChatGoogleAI module is more focused on translating the LangChain data structures to and from the Google AI API.

Another benefit of using LangChain.Chains.LLMChain is that it combines the storage of messages, adding tools, adding custom context that should be passed to tools, and automatically applying LangChain.MessageDelta structs as they are received, then converting those to the full LangChain.Message once fully complete.
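
As a sketch, a direct call that bypasses LLMChain might look like the following; the exact return shape is an assumption and varies with the streaming setting:

```elixir
alias LangChain.ChatModels.ChatGoogleAI
alias LangChain.Message

model = ChatGoogleAI.new!(%{model: "gemini-2.0-flash", stream: false})

# Pass a list of messages as the prompt; tools default to [].
{:ok, response} =
  ChatGoogleAI.call(model, [Message.new_user!("Say hello in Elixir.")], [])
```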

complete_final_delta(data)

do_process_response(model, response, message_type \\ Message)

for_api(google_ai, messages, functions)

get_message_contents(message)

@spec get_message_contents(LangChain.MessageDelta.t() | LangChain.Message.t()) :: [
  %{required(String.t()) => any()}
]

Return the content parts for the message.

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Setup a ChatGoogleAI client configuration.

@spec new!(attrs :: map()) :: t() | no_return()

Setup a ChatGoogleAI client configuration and return it or raise an error if invalid.
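
For instance, new/1 returns a tagged tuple while new!/1 returns the struct directly and raises on invalid attributes (the attribute values below are illustrative):

```elixir
alias LangChain.ChatModels.ChatGoogleAI

# new/1 returns {:ok, struct} or {:error, changeset}.
{:ok, model} = ChatGoogleAI.new(%{model: "gemini-2.0-flash", temperature: 0})

# new!/1 returns the struct, raising if validation fails.
model = ChatGoogleAI.new!(%{model: "gemini-2.0-flash"})
```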

Restores the model from the config.

@spec serialize_config(t()) :: %{required(String.t()) => any()}

Generate a config map that can later restore the model's configuration.
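
These two functions are designed to round-trip. A sketch, assuming restore_from_map/1 is the restoring counterpart named above:

```elixir
alias LangChain.ChatModels.ChatGoogleAI

model = ChatGoogleAI.new!(%{model: "gemini-2.0-flash", temperature: 0})

# Serialize to a plain map, e.g. for persisting configuration.
config = ChatGoogleAI.serialize_config(model)

# Later, rebuild an equivalent model struct from the saved map.
{:ok, restored} = ChatGoogleAI.restore_from_map(config)
```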