LLMAgent.Providers.Anthropic (llm_agent v0.2.0)
Anthropic provider implementation for LLMAgent.
This module implements the LLM provider interface for Anthropic, handling API calls, response parsing, and error handling specific to Anthropic's API.
Summary
Functions
Makes an Anthropic chat completion API call.
Generates embeddings for the provided text using a compatible model.
Functions
Makes an Anthropic chat completion API call.
Parameters
params - A map containing the request parameters
Returns
{:ok, response} - On success, returns the parsed response
{:error, reason} - On failure, returns the error reason
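A call to this function might look like the sketch below. Note that the function name `completion/1` and the parameter keys (`:model`, `:messages`, `:max_tokens`) are assumptions modeled on Anthropic's API shape; the extracted summary above does not show the exact signature, so check the module source for the real names.

```elixir
# Hypothetical usage sketch; the function name and map keys are assumptions.
params = %{
  model: "claude-3-haiku-20240307",
  max_tokens: 1024,
  messages: [
    %{role: "user", content: "Summarize this paragraph in one sentence."}
  ]
}

case LLMAgent.Providers.Anthropic.completion(params) do
  {:ok, response} ->
    # On success, the parsed response is returned.
    IO.inspect(response)

  {:error, reason} ->
    # On failure, the error reason is returned.
    IO.inspect(reason, label: "Anthropic call failed")
end
```

Matching on the `{:ok, _}` / `{:error, _}` tuples, as shown, is the idiomatic Elixir way to consume this return contract.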
Generates embeddings for the provided text using a compatible model.
Note: Anthropic doesn't provide a dedicated embedding API, so this implementation uses a third-party compatible service or delegates to OpenAI's embedding API.
Parameters
params - A map with parameters for the request:
input - The text to generate embeddings for
provider - The embedding provider to use (default: :openai)
Returns
{:ok, embeddings} on success, {:error, reason} on failure.
Examples
iex> params = %{
...> input: "Hello, world!",
...> provider: :openai
...> }
iex> {:ok, embeddings} = LLMAgent.Providers.Anthropic.embedding(params)
iex> is_list(embeddings)
true
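Since Anthropic has no native embedding endpoint, the delegation described above could be sketched as follows. This is an illustrative sketch only: the `LLMAgent.Providers.OpenAI.embedding/1` delegate and the error tuple shape are assumptions, not confirmed from this page.

```elixir
defmodule MyApp.EmbeddingRouter do
  # Hypothetical sketch of the delegation pattern: route embedding
  # requests to a provider that supports them, defaulting to :openai.

  def embedding(%{provider: :openai} = params) do
    # Delegate to the OpenAI provider's embedding implementation
    # (assumed name; Anthropic itself has no embedding API).
    LLMAgent.Providers.OpenAI.embedding(Map.delete(params, :provider))
  end

  def embedding(%{provider: other}) do
    # Any other provider is unsupported in this sketch.
    {:error, {:unsupported_embedding_provider, other}}
  end

  def embedding(params) do
    # No :provider key given: fall back to the documented default.
    embedding(Map.put(params, :provider, :openai))
  end
end
```

Pattern-matching on the `:provider` key in the function heads keeps the default-and-fallback logic declarative, which is the usual Elixir idiom for this kind of dispatch.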