LlmCore.LLM.Anthropic (llm_core v0.3.0)


Anthropic Claude API provider implementing LlmCore.LLM.Provider.

Summary

Functions

available?()
Checks if the Anthropic API key is configured.

capabilities()
Returns Anthropic's capability map including streaming, structured output, tool use, and supported models.

provider_type()
Returns :api — Anthropic is a cloud API provider.

send(prompt, opts \\ [])
Sends a prompt to the Anthropic Messages API and returns the response.

stream(prompt, opts \\ [])
Streams a response from the Anthropic Messages API using Server-Sent Events.

Functions

available?()

@spec available?() :: boolean()

Checks if the Anthropic API key is configured.
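A minimal usage sketch: guard a request on availability before sending. How the key is configured (application env vs. an environment variable) is not specified on this page, so only the boolean check is shown.

```elixir
# Only attempt a request when the provider reports a configured key.
if LlmCore.LLM.Anthropic.available?() do
  LlmCore.LLM.Anthropic.send("Hello, Claude!")
else
  {:error, :missing_api_key}
end
```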

capabilities()

@spec capabilities() :: LlmCore.LLM.Provider.capabilities()

Returns Anthropic's capability map including streaming, structured output, tool use, and supported models.
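A sketch of inspecting the capability map before selecting this provider. The typespec only promises a LlmCore.LLM.Provider.capabilities() map; the specific keys used below (:streaming, :models) are assumptions for illustration.

```elixir
# Check advertised capabilities before routing a request to this provider.
caps = LlmCore.LLM.Anthropic.capabilities()

# :streaming and :models are assumed key names, not documented here.
if caps[:streaming] do
  IO.puts("streaming supported; models: #{inspect(caps[:models])}")
end
```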

provider_type()

@spec provider_type() :: :api

Returns :api — Anthropic is a cloud API provider.

send(prompt, opts \\ [])

@spec send(
  LlmCore.LLM.Provider.prompt(),
  keyword()
) :: {:ok, LlmCore.LLM.Response.t()} | {:error, LlmCore.LLM.Error.t()}

Sends a prompt to the Anthropic Messages API and returns the response.

When opts[:tools] contains a list of LlmToolkit.Tool structs, tool definitions are encoded into the request body. If the model responds with stop_reason: "tool_use", the returned Response.tool_calls will contain decoded LlmToolkit.Tool.Call structs.
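The tool-use round trip described above can be sketched as follows. The LlmToolkit.Tool fields shown (:name, :description, :parameters) and the string prompt are illustrative assumptions; only the opts[:tools] key, Response.tool_calls, and the {:ok, _} / {:error, _} return shapes come from this page.

```elixir
# Assumed Tool struct shape, for illustration only.
tool = %LlmToolkit.Tool{
  name: "get_weather",
  description: "Returns current weather for a city",
  parameters: %{"city" => %{"type" => "string"}}
}

case LlmCore.LLM.Anthropic.send("What's the weather in Paris?", tools: [tool]) do
  # Model stopped with "tool_use": tool_calls holds decoded Tool.Call structs.
  {:ok, %LlmCore.LLM.Response{tool_calls: [_ | _] = calls}} ->
    IO.inspect(calls, label: "tool calls to execute")

  # Ordinary text completion, no tool invocation requested.
  {:ok, response} ->
    IO.inspect(response, label: "response")

  {:error, %LlmCore.LLM.Error{} = error} ->
    IO.inspect(error, label: "error")
end
```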

stream(prompt, opts \\ [])

@spec stream(
  LlmCore.LLM.Provider.prompt(),
  keyword()
) :: {:ok, Enumerable.t()} | {:error, LlmCore.LLM.Error.t()}

Streams a response from the Anthropic Messages API using Server-Sent Events.
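A sketch of consuming the returned stream lazily. The spec only promises an Enumerable.t(); writing each element with IO.write/1 assumes the elements are text chunks, which this page does not state.

```elixir
# Lazily print chunks as the SSE-backed stream yields them.
with {:ok, stream} <- LlmCore.LLM.Anthropic.stream("Tell me a story") do
  stream
  |> Stream.each(&IO.write/1)
  |> Stream.run()
end
```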