# `Aludel.Interfaces.LLM.Adapters.Http.Default`
[🔗](https://github.com/ccarvalho-eng/aludel/blob/main/lib/aludel/interfaces/llm/adapters/http/default.ex#L1)

Default LLM HTTP client built on ReqLLM.

Makes LLM API calls through the ReqLLM library and normalizes
responses into a generic map so that ReqLLM-specific types do not
leak to callers.

## Telemetry

Emits the following telemetry events:

* `[:aludel, :llm, :http, :start]` - When an HTTP request begins
  - Measurements: `%{system_time: integer()}`
  - Metadata: `%{model_spec: String.t()}`

* `[:aludel, :llm, :http, :stop]` - When an HTTP request completes
  - Measurements: `%{duration: integer(), input_tokens: integer(),
    output_tokens: integer()}`
  - Metadata: `%{model_spec: String.t()}`

* `[:aludel, :llm, :http, :exception]` - When an HTTP request fails
  - Measurements: `%{duration: integer()}`
  - Metadata: `%{model_spec: String.t(), error: term()}`
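The events above can be observed with a standard `:telemetry` handler. Below is a minimal sketch using `:telemetry.attach_many/4`; the handler module and handler id are illustrative, not part of Aludel, and the `:telemetry` hex package is assumed to be available.

```elixir
defmodule MyApp.LLMTelemetry do
  require Logger

  # Attach one handler to all three Aludel LLM HTTP events.
  def attach do
    :telemetry.attach_many(
      "my-app-llm-http-logger",
      [
        [:aludel, :llm, :http, :start],
        [:aludel, :llm, :http, :stop],
        [:aludel, :llm, :http, :exception]
      ],
      &__MODULE__.handle_event/4,
      nil
    )
  end

  # Log duration and token usage when a request completes.
  def handle_event([:aludel, :llm, :http, :stop], measurements, metadata, _config) do
    ms = System.convert_time_unit(measurements.duration, :native, :millisecond)

    Logger.info(
      "LLM call to #{metadata.model_spec} took #{ms}ms " <>
        "(#{measurements.input_tokens} in / #{measurements.output_tokens} out tokens)"
    )
  end

  # Ignore :start and :exception here; a real handler might record metrics.
  def handle_event(_event, _measurements, _metadata, _config), do: :ok
end
```

Attaching once (e.g., from your application's `start/2`) is enough; `:telemetry` dispatches every matching event to the handler.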

# `request`

Performs an LLM-specific HTTP call via ReqLLM.

## Parameters
  - `model_spec`: Provider and model string (e.g., `"openai:gpt-4o"`)
  - `messages`: Text prompt or a list of messages
  - `opts`: LLM options (`api_key`, `temperature`, `max_tokens`, `documents`)

## Returns
  - `{:ok, %{content: String.t(), input_tokens: integer(),
    output_tokens: integer()}}`
  - `{:error, reason}`
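A hypothetical call might look like the sketch below. The module alias, option names, and return shape follow this page; the environment variable name and prompt are illustrative, and a valid API key is assumed.

```elixir
alias Aludel.Interfaces.LLM.Adapters.Http.Default

case Default.request("openai:gpt-4o", "Summarize Elixir in one sentence.",
       api_key: System.fetch_env!("OPENAI_API_KEY"),
       temperature: 0.2,
       max_tokens: 128
     ) do
  # Success: normalized map with content and token counts
  {:ok, %{content: content, input_tokens: in_tok, output_tokens: out_tok}} ->
    IO.puts("#{content} (#{in_tok} in / #{out_tok} out)")

  # Failure: reason is an opaque term from the underlying call
  {:error, reason} ->
    IO.inspect(reason, label: "LLM request failed")
end
```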

---

*Consult [api-reference.md](api-reference.md) for complete listing*
