Aludel.LLM (aludel v0.1.18)


LLM client abstraction for multi-provider support.

Supports OpenAI, Anthropic, Google Gemini, and Ollama providers via direct HTTP calls. Provides a unified interface for generating text completions and tracking usage metrics.

Returns structured responses containing:

  • Generated output text
  • Token usage (input/output)
  • Latency measurements
  • Cost estimation

Summary

Functions

call(provider, prompt, opts \\ [])

Calls an LLM provider with a prompt and optional documents.

Types

document_input()

@type document_input() :: %{data: binary(), content_type: String.t()}
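For illustration, a document_input() map pairs raw binary data with its MIME type. The bytes below are the PNG file signature, used only as sample data:

```elixir
# A minimal document_input() map: raw bytes plus a content type.
# The binary here is the 8-byte PNG file signature (sample data only).
png_bytes = <<0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A>>

doc = %{data: png_bytes, content_type: "image/png"}
```

In practice the data would typically come from File.read!/1 or an upload.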

error_reason()

@type error_reason() ::
  :missing_api_key
  | {:auth_error, String.t()}
  | {:rate_limit, non_neg_integer() | nil}
  | {:invalid_request, String.t()}
  | {:api_error, non_neg_integer(), String.t()}
  | {:network_error, term()}
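As a sketch of consuming this type, a caller can pattern-match each error_reason() variant to build a log message. The describe_error/1 helper below is hypothetical, not part of this module:

```elixir
defmodule ErrorDescriber do
  # Hypothetical helper: renders each error_reason() variant as a string.
  def describe_error(:missing_api_key), do: "API key is not configured"
  def describe_error({:auth_error, msg}), do: "authentication failed: " <> msg
  def describe_error({:rate_limit, nil}), do: "rate limited (no retry-after given)"
  def describe_error({:rate_limit, secs}), do: "rate limited, retry in #{secs}s"
  def describe_error({:invalid_request, msg}), do: "invalid request: " <> msg
  def describe_error({:api_error, status, body}), do: "API error #{status}: #{body}"
  def describe_error({:network_error, reason}), do: "network error: #{inspect(reason)}"
end
```

Exhaustive clauses like these double as a compile-time checklist: adding a new error variant to the union makes the missed clause easy to spot.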

llm_result()

@type llm_result() :: %{
  output: String.t(),
  input_tokens: non_neg_integer(),
  output_tokens: non_neg_integer(),
  latency_ms: non_neg_integer(),
  cost_usd: float()
}
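Because every successful call returns this same map shape, usage can be aggregated across many calls. The reducer below is an illustrative sketch, not part of the module:

```elixir
defmodule UsageTotals do
  # Sums token counts and cost across a list of llm_result() maps.
  def sum(results) do
    Enum.reduce(
      results,
      %{input_tokens: 0, output_tokens: 0, cost_usd: 0.0},
      fn r, acc ->
        %{
          input_tokens: acc.input_tokens + r.input_tokens,
          output_tokens: acc.output_tokens + r.output_tokens,
          cost_usd: acc.cost_usd + r.cost_usd
        }
      end
    )
  end
end
```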

Functions

call(provider, prompt, opts \\ [])

@spec call(Aludel.Providers.Provider.t(), String.t(), keyword()) ::
  {:ok, llm_result()} | {:error, error_reason()}

Calls an LLM provider with a prompt and optional documents.

Parameters

  • provider: Provider configuration struct
  • prompt: Text prompt to send to the LLM
  • opts: Additional options
    • :documents - List of document maps with :data and :content_type keys (see document_input/0)

Returns

  • {:ok, result} with output, tokens, latency, and cost
  • {:error, reason} if the call fails

Examples

iex> provider = %Provider{provider: :openai,
...>   model: "gpt-4o"}
iex> {:ok, result} = LLM.call(provider, "Hello world")
iex> is_binary(result.output)
true

iex> provider = %Provider{provider: :openai,
...>   model: "gpt-4o"}
iex> doc = %{data: <<...>>, content_type: "image/png"}
iex> {:ok, result} = LLM.call(provider,
...>   "Describe image", documents: [doc])
iex> is_binary(result.output)
true
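A caller that wants to retry on {:rate_limit, retry_after} errors might wrap the call as in the sketch below. The call is injected as a zero-arity function so the pattern stands alone; the module and function names are hypothetical:

```elixir
defmodule RetryCall do
  # Retries fun.() when it returns {:error, {:rate_limit, _}}, up to
  # max_attempts total attempts. Sleeps for the provider-supplied
  # retry_after seconds when present, else 1 second.
  def with_retry(fun, max_attempts \\ 3)

  def with_retry(fun, 1), do: fun.()

  def with_retry(fun, max_attempts) do
    case fun.() do
      {:error, {:rate_limit, retry_after}} ->
        Process.sleep((retry_after || 1) * 1000)
        with_retry(fun, max_attempts - 1)

      other ->
        other
    end
  end
end
```

Usage would be, for example, `RetryCall.with_retry(fn -> LLM.call(provider, "Hello world") end)`.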