LangChain.ChatModels.ChatGoogleAI (LangChain v0.3.0-rc.0)
Parses and validates inputs for making a request to the Google AI Chat API. Converts the response into more specialized LangChain data structures.
Summary
Functions
Calls the Google AI API, passing the ChatGoogleAI struct with configuration, plus either a simple message or a list of messages to act as the prompt.
Return the content parts for the message.
Set up a ChatGoogleAI client configuration.
Set up a ChatGoogleAI client configuration and return it, or raise an error if invalid.
Restores the model from the config.
Generate a config map that can later be used to restore the model's configuration.
Types
Functions
Calls the Google AI API, passing the ChatGoogleAI struct with configuration, plus either a simple message or a list of messages to act as the prompt.
Optionally pass in a list of tools available to the LLM, which it may request to execute in its response.
Optionally pass in a callback function that is executed as data is received from the API.
NOTE: This function can be used directly, but the primary interface should be through LangChain.Chains.LLMChain. The ChatGoogleAI module is focused on translating the LangChain data structures to and from the Google AI API.
Another benefit of using LangChain.Chains.LLMChain is that it combines the storage of messages, adding tools, adding custom context that should be passed to tools, and automatically applying LangChain.MessageDelta structs as they are received, then converting those to the full LangChain.Message once complete.
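A minimal sketch of the recommended flow through LangChain.Chains.LLMChain, assuming a GOOGLE_API_KEY environment variable and the illustrative model name "gemini-1.5-flash"; the exact return shape of LLMChain.run/1 can vary between LangChain versions:

alias LangChain.ChatModels.ChatGoogleAI
alias LangChain.Chains.LLMChain
alias LangChain.Message

# Configure the chat model (model name and key handling are illustrative).
model =
  ChatGoogleAI.new!(%{
    model: "gemini-1.5-flash",
    api_key: System.fetch_env!("GOOGLE_API_KEY")
  })

# Build a chain, add a user message, and run it against the Google AI API.
{:ok, updated_chain} =
  %{llm: model}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("Write a haiku about Elixir."))
  |> LLMChain.run()

# The assistant's reply is available on the updated chain.
updated_chain.last_message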
@spec get_message_contents(LangChain.MessageDelta.t() | LangChain.Message.t()) :: [ %{required(String.t()) => any()} ]
Return the content parts for the message.
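For illustration, a sketch that extracts the content parts from a completed assistant message; the exact shape of the returned part maps is an assumption here:

alias LangChain.ChatModels.ChatGoogleAI
alias LangChain.Message

message = Message.new_assistant!(%{content: "Hello from Gemini!"})

# Returns a list of content-part maps with string keys,
# e.g. [%{"text" => "Hello from Gemini!"}] (shape shown is illustrative).
parts = ChatGoogleAI.get_message_contents(message)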
@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
Set up a ChatGoogleAI client configuration.
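A short sketch of validating settings with new/1; the :model and :temperature values are illustrative:

alias LangChain.ChatModels.ChatGoogleAI

# Returns {:ok, %ChatGoogleAI{}} when the attributes validate,
# or {:error, %Ecto.Changeset{}} when they do not.
{:ok, model} = ChatGoogleAI.new(%{model: "gemini-1.5-flash", temperature: 0.0})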
Set up a ChatGoogleAI client configuration and return it, or raise an error if invalid.
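The bang variant is convenient in pipelines because it returns the struct directly and raises on invalid attributes (a sketch with an illustrative model name):

alias LangChain.ChatModels.ChatGoogleAI

# Raises an error if the attributes are invalid; otherwise returns the struct.
model = ChatGoogleAI.new!(%{model: "gemini-1.5-flash"})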
Restores the model from the config.
Generate a config map that can later be used to restore the model's configuration.
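A sketch of a serialize-and-restore round trip, useful for persisting a model's configuration; the assumptions that restore_from_map/1 returns an {:ok, struct} tuple and that the map is plain data are illustrative:

alias LangChain.ChatModels.ChatGoogleAI

model = ChatGoogleAI.new!(%{model: "gemini-1.5-flash", temperature: 0.0})

# Produce a plain map describing the model, suitable for storing (e.g. as JSON).
config = ChatGoogleAI.serialize_config(model)

# Later, rebuild an equivalent model struct from the stored map.
{:ok, restored} = ChatGoogleAI.restore_from_map(config)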