LangChain.ChatModels.ChatAnthropic (LangChain v0.3.0-rc.0)

Module for interacting with Anthropic models.

Parses and validates inputs for making requests to Anthropic's messages API.

Converts responses into more specialized LangChain data structures.

Callbacks

See the set of available callbacks: LangChain.ChatModels.LLMCallbacks

Rate Limit API Response Headers

Anthropic returns rate limit information in the response headers. Those can be accessed using an LLM callback like this:

handlers = %{
  on_llm_ratelimit_info: fn _model, headers ->
    IO.inspect(headers)
  end
}

{:ok, chat} = ChatAnthropic.new(%{callbacks: [handlers]})

When an API response is received, something similar to the following will be output to the console.

%{
  "anthropic-ratelimit-requests-limit" => ["50"],
  "anthropic-ratelimit-requests-remaining" => ["49"],
  "anthropic-ratelimit-requests-reset" => ["2024-06-08T04:28:30Z"],
  "anthropic-ratelimit-tokens-limit" => ["50000"],
  "anthropic-ratelimit-tokens-remaining" => ["50000"],
  "anthropic-ratelimit-tokens-reset" => ["2024-06-08T04:28:30Z"],
  "request-id" => ["req_1234"]
}

Summary

Functions

Calls the Anthropic API passing the ChatAnthropic struct with configuration, plus either a simple message or the list of messages to act as the prompt.

Convert a LangChain structure to the expected map of data for the Anthropic API.

Return the params formatted for an API request.

Setup a ChatAnthropic client configuration.

Setup a ChatAnthropic client configuration and return it or raise an error if invalid.

After all the messages have been converted using for_api/1, this combines multiple sequential tool response messages. The Anthropic API is very strict about messages alternating between the user and assistant roles.

Restores the model from the config.

Generate a config map that can later restore the model's configuration.

Types

@type t() :: %LangChain.ChatModels.ChatAnthropic{
  api_key: term(),
  api_version: term(),
  callbacks: term(),
  endpoint: term(),
  max_tokens: term(),
  model: term(),
  receive_timeout: term(),
  stream: term(),
  temperature: term(),
  top_k: term(),
  top_p: term()
}

Functions

call(anthropic, prompt, functions \\ [])

Calls the Anthropic API passing the ChatAnthropic struct with configuration, plus either a simple message or the list of messages to act as the prompt.

Optionally pass in a callback function that can be executed as data is received from the API.

NOTE: This function can be used directly, but the primary interface should be through LangChain.Chains.LLMChain. The ChatAnthropic module is more focused on translating the LangChain data structures to and from the Anthropic API.

Another benefit of using LangChain.Chains.LLMChain is that it combines the storage of messages, adding functions, adding custom context that should be passed to functions, and automatically applying LangChain.MessageDelta structs as they are received, then converting those to the full LangChain.Message once fully complete.
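As a minimal sketch of a direct call (assuming the langchain package is installed, an ANTHROPIC_API_KEY is configured, and the model name shown is available to your account):

```elixir
alias LangChain.ChatModels.ChatAnthropic
alias LangChain.Message

# Build a client; the model name is illustrative.
{:ok, chat} = ChatAnthropic.new(%{model: "claude-3-haiku-20240307"})

# Pass a list of messages to act as the prompt.
{:ok, response} =
  ChatAnthropic.call(chat, [Message.new_user!("Say hello in one word.")])
```

In most applications this would instead be driven through LangChain.Chains.LLMChain, as noted above.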

@spec for_api(
  LangChain.Message.t()
  | LangChain.Message.ContentPart.t()
  | LangChain.Function.t()
) ::
  %{required(String.t()) => any()} | no_return()

Convert a LangChain structure to the expected map of data for the Anthropic API.
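For illustration, converting a single user message might look like the following (the exact shape of the returned map is an assumption and may vary by library version):

```elixir
alias LangChain.ChatModels.ChatAnthropic
alias LangChain.Message

# Convert one LangChain message into the Anthropic API map format.
data = ChatAnthropic.for_api(Message.new_user!("Hello"))

# Per the @spec, `data` is a map with string keys,
# e.g. something resembling %{"role" => "user", "content" => "Hello"}.
```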

for_api(anthropic, messages, tools)

@spec for_api(t(), message :: [map()], LangChain.ChatModels.ChatModel.tools()) :: %{
  required(atom()) => any()
}

Return the params formatted for an API request.

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Setup a ChatAnthropic client configuration.

@spec new!(attrs :: map()) :: t() | no_return()

Setup a ChatAnthropic client configuration and return it or raise an error if invalid.
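A brief sketch of the two constructors (the attribute values shown are illustrative; any field in the t() type above can be supplied):

```elixir
alias LangChain.ChatModels.ChatAnthropic

# new/1 returns {:ok, struct} or {:error, changeset} on invalid attrs.
{:ok, chat} = ChatAnthropic.new(%{temperature: 0, stream: false})

# new!/1 returns the struct directly, raising if the attrs are invalid.
chat = ChatAnthropic.new!(%{temperature: 0})
```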

post_process_and_combine_messages(messages)

After all the messages have been converted using for_api/1, this combines multiple sequential tool response messages. The Anthropic API is very strict about messages alternating between the user and assistant roles.
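A hedged sketch of the effect (the map shapes below are simplified illustrations, not the exact Anthropic wire format):

```elixir
alias LangChain.ChatModels.ChatAnthropic

# Two sequential user-role tool-result messages, as produced by for_api/1
# when an assistant message requested two tools.
messages = [
  %{"role" => "assistant", "content" => [%{"type" => "tool_use", "id" => "a"}]},
  %{"role" => "user", "content" => [%{"type" => "tool_result", "tool_use_id" => "a"}]},
  %{"role" => "user", "content" => [%{"type" => "tool_result", "tool_use_id" => "b"}]}
]

# The sequential user messages are merged into one so that the
# roles strictly alternate, as the Anthropic API requires.
combined = ChatAnthropic.post_process_and_combine_messages(messages)
```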

Restores the model from the config.

@spec serialize_config(t()) :: %{required(String.t()) => any()}

Generate a config map that can later restore the model's configuration.
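A round-trip sketch. Note that the restore function's name is not shown above; restore_from_map/1 is an assumption based on the pattern used by other LangChain chat models:

```elixir
alias LangChain.ChatModels.ChatAnthropic

chat = ChatAnthropic.new!(%{temperature: 0})

# Per the @spec, produces a plain map with string keys suitable for storage.
config = ChatAnthropic.serialize_config(chat)

# Restoring the model from the config (function name assumed, see above).
{:ok, restored} = ChatAnthropic.restore_from_map(config)
```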