LangChain.ChatModels.ChatVertexAI (LangChain v0.4.0)
Parses and validates inputs for making a request to the Google Vertex AI chat API.
Converts the response into more specialized LangChain data structures.
Example Usage:
alias LangChain.Chains.LLMChain
alias LangChain.Message
alias LangChain.Message.ContentPart
alias LangChain.ChatModels.ChatVertexAI
config = %{
  model: "gemini-2.0-flash",
  # Vertex requires a gcloud auth token. See:
  # https://cloud.google.com/vertex-ai/generative-ai/docs/start/quickstarts/quickstart-multimodal#rest
  api_key: ...,
  temperature: 1.0,
  top_p: 0.8,
  receive_timeout: ...
}

model = ChatVertexAI.new!(config)

%{llm: model, verbose: false, stream: false}
|> LLMChain.new!()
|> LLMChain.add_message(
  Message.new_user!([
    ContentPart.new!(%{type: :text, content: "Analyse the provided file and share a summary"}),
    ContentPart.new!(%{
      type: :file_url,
      content: ...,
      options: [media: ...]
    })
  ])
)
|> LLMChain.run()
The above call returns a summary of the provided media content.
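As a sketch of consuming that result, bind the pipeline above to result. It assumes LLMChain.run/1 returns {:ok, updated_chain} on success and a three-element {:error, chain, error} tuple on failure, as elsewhere in LangChain's chain docs; confirm the shapes against your version.

case result do
  {:ok, updated_chain} ->
    # The reply is the chain's final assistant message. Its content may be
    # a string or a list of ContentPart structs depending on the model.
    updated_chain.last_message.content

  {:error, _chain, %LangChain.LangChainError{} = error} ->
    error.message
end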
Summary
Functions
Calls the Google Vertex AI API, passing the ChatVertexAI struct with configuration plus either a simple message or a list of messages to act as the prompt.
Return the content parts for the message.
Set up a ChatVertexAI client configuration.
Set up a ChatVertexAI client configuration and return it or raise an error if invalid.
Restores the model from the config.
Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is considered fundamental to the request rather than a service issue, and the request should not be retried or fall back to another service.
Generate a config map that can later restore the model's configuration.
Types
t()
Functions
Calls the Google Vertex AI API, passing the ChatVertexAI struct with configuration plus either a simple message or a list of messages to act as the prompt.
Optionally pass in a list of tools available to the LLM for requesting execution in response.
NOTE: This function can be used directly, but the primary interface
should be through LangChain.Chains.LLMChain. The ChatVertexAI module is
more focused on translating the LangChain data structures to and from the
Vertex AI API.
Another benefit of using LangChain.Chains.LLMChain is that it combines the
storage of messages, adding tools, adding custom context that should be passed
to tools, and automatically applying LangChain.MessageDelta structs as they
are received, then converting those to the full LangChain.Message once
fully complete.
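When calling it directly, a minimal sketch looks like the following. The {:ok, response} success shape follows the common LangChain chat-model pattern and may differ slightly across versions, and token here is an illustrative binding holding a gcloud auth token.

{:ok, model} =
  ChatVertexAI.new(%{model: "gemini-2.0-flash", api_key: token})

# Pass the configured model, a list of messages, and any tools (none here).
{:ok, response} = ChatVertexAI.call(model, [Message.new_user!("Say hello")], [])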
@spec get_message_contents(LangChain.MessageDelta.t() | LangChain.Message.t()) :: [%{required(String.t()) => any()}]
Return the content parts for the message.
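For example, for a simple user message the parts come back as string-keyed maps. The exact key names mirror the Vertex AI wire format and the "text" key shown here is an assumption.

message = Message.new_user!("Hello!")
ChatVertexAI.get_message_contents(message)
#=> [%{"text" => "Hello!"}]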
@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
Set up a ChatVertexAI client configuration.
Set up a ChatVertexAI client configuration and return it or raise an error if invalid.
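A short sketch contrasting the two constructors:

# new/1 returns {:ok, model} or {:error, changeset} when attributes are invalid.
{:ok, model} = ChatVertexAI.new(%{model: "gemini-2.0-flash"})

# new!/1 returns the model directly and raises on invalid attributes.
model = ChatVertexAI.new!(%{model: "gemini-2.0-flash"})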
Restores the model from the config.
@spec retry_on_fallback?(LangChain.LangChainError.t()) :: boolean()
Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is considered fundamental to the request rather than a service issue, and the request should not be retried or fall back to another service.
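This check backs the chain-level fallback mechanism. As a sketch, LLMChain.run/2 accepts a with_fallbacks: option listing alternate models to try when a retryable error occurs; primary_model and fallback_model are illustrative bindings.

{:ok, updated_chain} =
  %{llm: primary_model}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("Hello"))
  |> LLMChain.run(with_fallbacks: [fallback_model])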
Generate a config map that can later restore the model's configuration.
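Paired with restore_from_map/1, this supports a persist-and-restore round trip. A sketch, assuming the serialized map uses string keys so it can be stored as JSON; the exact return shape of restore_from_map/1 may vary by version.

config = ChatVertexAI.serialize_config(model)
# ... persist `config`, e.g. as JSON, then later ...
{:ok, restored_model} = ChatVertexAI.restore_from_map(config)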