Aludel.Interfaces.LLM.Adapters.Http.Default (aludel v0.2.0)

Default LLM HTTP client using ReqLLM.

Uses the ReqLLM library for making LLM API calls, normalizing responses to a generic format that hides ReqLLM-specific types.

Telemetry

Emits the following telemetry events:

  • [:aludel, :llm, :http, :start] - When an HTTP request begins

    • Measurements: %{system_time: integer()}
    • Metadata: %{model_spec: String.t()}
  • [:aludel, :llm, :http, :stop] - When an HTTP request completes

    • Measurements: %{duration: integer(), input_tokens: integer(), output_tokens: integer()}
    • Metadata: %{model_spec: String.t()}
  • [:aludel, :llm, :http, :exception] - When an HTTP request fails

    • Measurements: %{duration: integer()}
    • Metadata: %{model_spec: String.t(), error: term()}
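These events can be consumed with a standard `:telemetry` handler. A minimal sketch (the handler id, module name, and logging choices are illustrative, not part of the adapter; it assumes `measurements.duration` is in native time units, as is conventional for `:telemetry` stop events):

```elixir
defmodule MyApp.LLMTelemetry do
  @moduledoc "Example handler for the adapter's telemetry events (illustrative)."
  require Logger

  def attach do
    :telemetry.attach_many(
      "my-app-llm-http",
      [
        [:aludel, :llm, :http, :start],
        [:aludel, :llm, :http, :stop],
        [:aludel, :llm, :http, :exception]
      ],
      &__MODULE__.handle_event/4,
      nil
    )
  end

  def handle_event([:aludel, :llm, :http, :stop], measurements, metadata, _config) do
    # Assumes duration is in native time units; convert before logging.
    ms = System.convert_time_unit(measurements.duration, :native, :millisecond)

    Logger.info(
      "LLM call to #{metadata.model_spec} took #{ms}ms " <>
        "(#{measurements.input_tokens} in / #{measurements.output_tokens} out tokens)"
    )
  end

  def handle_event([:aludel, :llm, :http, :exception], _measurements, metadata, _config) do
    Logger.warning("LLM call to #{metadata.model_spec} failed: #{inspect(metadata.error)}")
  end

  # Ignore :start and any other events.
  def handle_event(_event, _measurements, _metadata, _config), do: :ok
end
```

Call `MyApp.LLMTelemetry.attach/0` once at application start (e.g. in `Application.start/2`) so the handler is registered before any LLM calls are made.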

Summary

Functions

LLM-specific HTTP call using ReqLLM.

Functions

request(model_spec, messages, opts)

LLM-specific HTTP call using ReqLLM.

Parameters

  • model_spec: Provider and model (e.g., "openai:gpt-4o")
  • messages: Text prompt or message list
  • opts: LLM options (api_key, temperature, max_tokens, documents)

Returns

  • {:ok, %{content: String.t(), input_tokens: integer(), output_tokens: integer()}}
  • {:error, reason}
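A sketch of calling `request/3` and matching the normalized result. The model spec and prompt are examples, and the environment variable name is an assumption; the option keys follow the opts list above:

```elixir
# Illustrative: api_key sourced from an assumed env var name.
opts = [
  api_key: System.fetch_env!("OPENAI_API_KEY"),
  temperature: 0.2,
  max_tokens: 256
]

case Aludel.Interfaces.LLM.Adapters.Http.Default.request(
       "openai:gpt-4o",
       "Summarize this in one sentence: ...",
       opts
     ) do
  {:ok, %{content: content, input_tokens: in_tok, output_tokens: out_tok}} ->
    IO.puts("#{content} (#{in_tok} in / #{out_tok} out tokens)")

  {:error, reason} ->
    IO.warn("LLM request failed: #{inspect(reason)}")
end
```

Because the return value hides ReqLLM-specific types, callers can pattern-match on this generic map without depending on ReqLLM directly.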