ReqLLM.Providers.OpenRouter (ReqLLM v1.0.0)

OpenRouter provider – OpenAI Chat Completions compatible with OpenRouter's unified API.

Implementation

Uses built-in OpenAI-style encoding/decoding defaults. No custom wrapper modules – leverages the standard OpenAI-compatible implementations.

OpenRouter-Specific Extensions

Beyond standard OpenAI parameters, OpenRouter supports:

  • openrouter_models - Array of model IDs for routing/fallback preferences
  • openrouter_route - Routing strategy (e.g., "fallback")
  • openrouter_provider - Provider preferences object for routing decisions
  • openrouter_transforms - Array of prompt transforms to apply
  • openrouter_top_k - Top-k sampling (not available for OpenAI models)
  • openrouter_repetition_penalty - Repetition penalty for reducing repetitive text
  • openrouter_min_p - Minimum probability threshold for sampling
  • openrouter_top_a - Top-a sampling parameter
  • app_referer - HTTP-Referer header for app identification
  • app_title - X-Title header for app title in rankings
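
For illustration, here is a hedged sketch of passing these extensions as request options. The generate_text call shape and the "openrouter:<model>" spec string follow ReqLLM's general conventions rather than anything stated on this page, and the model IDs and values are illustrative.

# Sketch only: option names come from the list above; the call shape and
# model IDs are assumptions based on ReqLLM's top-level API.
{:ok, response} =
  ReqLLM.generate_text(
    "openrouter:openai/gpt-4o-mini",
    "Summarize Hamlet in two sentences.",
    openrouter_models: ["openai/gpt-4o-mini", "anthropic/claude-3.5-sonnet"],
    openrouter_route: "fallback",
    openrouter_top_k: 40,
    openrouter_min_p: 0.05
  )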

App Attribution Headers

OpenRouter supports optional headers for app discoverability:

  • Set the HTTP-Referer header (via the app_referer option) for app identification
  • Set the X-Title header (via the app_title option) for app title in rankings
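
A hedged sketch of setting both headers through the app_referer and app_title options listed above (the call shape is an assumption; the values are illustrative):

# app_referer/app_title are sent as the HTTP-Referer and X-Title headers.
ReqLLM.generate_text(
  "openrouter:openai/gpt-4o-mini",
  "Hello!",
  app_referer: "https://example.com",
  app_title: "Example App"
)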

See provider_schema/0 for the complete OpenRouter-specific schema and ReqLLM.Provider.Options for inherited OpenAI parameters.

Configuration

# Add to .env file (automatically loaded)
OPENROUTER_API_KEY=sk-or-...
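
With the key in place, a minimal usage sketch (the "provider:model" spec string and the generate_text call follow ReqLLM's general conventions, not this page; the model ID is illustrative):

# Assumes OPENROUTER_API_KEY is set, e.g. via the .env file above.
{:ok, response} = ReqLLM.generate_text("openrouter:openai/gpt-4o-mini", "Hello!")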

Summary

Functions

attach(request, model_input, user_opts)
Default implementation of attach/3.

attach_stream(model, context, opts, finch_name)
Default implementation of attach_stream/4.

decode_response(request_response)
Default implementation of decode_response/1.

decode_stream_event(event, model)
Default implementation of decode_stream_event/2.

encode_body(request)
Custom body encoding that adds OpenRouter-specific extensions to the default OpenAI-compatible format.

extract_usage(body, model)
Default implementation of extract_usage/2.

prepare_request(operation, model_spec, input, opts)
Custom prepare_request for :object operations to maintain OpenRouter-specific max_tokens handling.

translate_options(operation, model, opts)
Default implementation of translate_options/3.

Functions

attach(request, model_input, user_opts)

Default implementation of attach/3.

Sets up Bearer token authentication and standard pipeline steps.

attach_stream(model, context, opts, finch_name)

Default implementation of attach_stream/4.

Builds complete streaming requests using OpenAI-compatible format.

decode_response(request_response)

Default implementation of decode_response/1.

Handles success/error responses with standard ReqLLM.Response creation.

decode_stream_event(event, model)

Default implementation of decode_stream_event/2.

Decodes SSE events using OpenAI-compatible format.

default_base_url()

default_env_key()

Callback implementation for ReqLLM.Provider.default_env_key/0.

default_provider_opts()

encode_body(request)

Custom body encoding that adds OpenRouter-specific extensions to the default OpenAI-compatible format.

Adds support for OpenRouter routing and sampling parameters:

  • models (routing preferences)
  • route (routing strategy)
  • provider (provider preferences)
  • transforms (prompt transforms)
  • top_k, repetition_penalty, min_p, top_a (sampling parameters)
  • top_logprobs (log probabilities)

Also handles OpenRouter-specific app attribution headers:

  • HTTP-Referer header for app identification
  • X-Title header for app title in rankings
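
For illustration, the encoded body might take roughly this shape once the OpenRouter extensions are merged into the OpenAI-compatible payload (key names follow the list above; the values are examples, not defaults):

%{
  "model" => "openai/gpt-4o-mini",
  "messages" => [%{"role" => "user", "content" => "Hello!"}],
  "models" => ["openai/gpt-4o-mini", "anthropic/claude-3.5-sonnet"],
  "route" => "fallback",
  "transforms" => ["middle-out"],
  "top_k" => 40,
  "min_p" => 0.05
}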

extract_usage(body, model)

Default implementation of extract_usage/2.

Extracts usage data from the standard usage field in the response body.
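
For reference, the standard usage field in an OpenAI-compatible response body typically looks like this (token counts illustrative):

%{
  "usage" => %{
    "prompt_tokens" => 12,
    "completion_tokens" => 34,
    "total_tokens" => 46
  }
}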

metadata()

prepare_request(operation, model_spec, input, opts)

Custom prepare_request for :object operations to maintain OpenRouter-specific max_tokens handling.

Ensures that structured output requests have adequate token limits while delegating other operations to the default implementation.
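
A hedged sketch of a structured-output call where this path applies. The generate_object call and the keyword-schema shape are assumptions based on ReqLLM's general API, not this page; only max_tokens relates to the handling described above.

schema = [name: [type: :string, required: true], age: [type: :pos_integer]]

# prepare_request(:object, ...) runs under the hood; an explicit max_tokens is
# kept, otherwise an adequate token limit is ensured for the structured output.
{:ok, response} =
  ReqLLM.generate_object(
    "openrouter:openai/gpt-4o-mini",
    "Generate a person",
    schema,
    max_tokens: 2_000
  )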

provider_extended_generation_schema()

provider_id()

provider_schema()

supported_provider_options()

translate_options(operation, model, opts)

Default implementation of translate_options/3.

Pass-through implementation that returns options unchanged.