Built-in LLM adapter using `req_llm`.

Routes requests based on model prefix:

- `ollama:model-name` → Local Ollama server
- `openai-compat:base_url|model` → Any OpenAI-compatible API
- `*` → ReqLLM (OpenRouter, Anthropic, Google, Bedrock, etc.)

Requires `{:req_llm, "~> 1.2"}` as a dependency.
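As a sketch of how the prefixes above are used, assuming `PtcRunner.LLM.generate_text/3` takes the model string first and a list of message maps second (the message shape and model names here are illustrative assumptions, not confirmed by this extract):

```elixir
# Assumed message shape: maps with :role and :content keys.
messages = [%{role: "user", content: "Say hello"}]

# Local Ollama server (no API key required)
{:ok, resp} = PtcRunner.LLM.generate_text("ollama:llama3.2", messages)

# Any OpenAI-compatible API: base URL and model joined with "|"
{:ok, resp} =
  PtcRunner.LLM.generate_text("openai-compat:http://localhost:8000/v1|my-model", messages)

# Anything else falls through to ReqLLM (OpenRouter, Anthropic, Google, Bedrock, ...)
{:ok, resp} = PtcRunner.LLM.generate_text("anthropic:claude-sonnet-4", messages)
```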
Prompt Caching
When `cache: true` is set in the request, prompt caching is enabled for
supported providers (Anthropic direct, OpenRouter Anthropic, Bedrock Claude).
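A minimal sketch of opting in, assuming the same `generate_text/3` call shape (the model name and variables are illustrative; how unsupported providers treat the option is not stated here):

```elixir
# Mark the large, shared prefix of the prompt for provider-side caching.
{:ok, resp} =
  PtcRunner.LLM.generate_text(
    "anthropic:claude-sonnet-4",
    [
      %{role: "system", content: long_shared_prompt},
      %{role: "user", content: question}
    ],
    cache: true
  )
```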
Bedrock Region
For Bedrock models, the region is determined in this order:
1. `AWS_REGION` environment variable
2. `config :ptc_runner, :bedrock_region, "region-name"`
3. Default: `"eu-north-1"`
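For example, the application-config fallback (step 2) can be set in `config/runtime.exs`:

```elixir
import Config

# Used only when the AWS_REGION environment variable is not set;
# without this, the adapter defaults to "eu-north-1".
config :ptc_runner, :bedrock_region, "us-east-1"
```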
Summary
Functions
Check if a provider is available.
Generate embeddings for text input.
Generate embeddings, raising on error.
Generate a structured JSON object from an LLM.
Generate a structured JSON object, raising on error.
Generate text from an LLM.
Generate text, raising on error.
Generate text with tool definitions.
Check if the model requires an API key.
Functions
Check if a provider is available.
For Ollama, checks if the server is reachable. For ReqLLM providers, checks if the required API key is set.
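A sketch of gating on availability before dispatching; the function name `available?/1` is an assumption, since the signature is not shown in this extract:

```elixir
if PtcRunner.LLM.available?("ollama:llama3.2") do
  # Server reachable (Ollama) or API key present (ReqLLM providers)
  PtcRunner.LLM.generate_text("ollama:llama3.2", messages)
else
  {:error, :provider_unavailable}
end
```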
@spec embed(String.t(), String.t() | [String.t()], keyword()) :: {:ok, [float()] | [[float()]]} | {:error, term()}
Generate embeddings for text input.
Returns
- `{:ok, [float()]}` for single input
- `{:ok, [[float()]]}` for batch input
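A sketch of both shapes, assuming `embed/3` takes the model string first per the `@spec` above (the embedding model name is illustrative):

```elixir
# Single input -> one vector of floats
{:ok, vector} = PtcRunner.LLM.embed("ollama:nomic-embed-text", "hello world")

# Batch input -> one vector per string, in order
{:ok, [v1, v2]} = PtcRunner.LLM.embed("ollama:nomic-embed-text", ["hello", "world"])
```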
Generate embeddings, raising on error.
Generate a structured JSON object from an LLM.
Only supported for ReqLLM providers. Local providers return
`{:error, :structured_output_not_supported}`.
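A hedged sketch, assuming the function is named `generate_object/3` and accepts a JSON-Schema-style map (both are assumptions; check the actual signature):

```elixir
schema = %{
  "type" => "object",
  "properties" => %{
    "name" => %{"type" => "string"},
    "age" => %{"type" => "integer"}
  },
  "required" => ["name", "age"]
}

{:ok, person} =
  PtcRunner.LLM.generate_object("anthropic:claude-sonnet-4", messages, schema)

# Local providers (e.g. Ollama) do not support structured output:
{:error, :structured_output_not_supported} =
  PtcRunner.LLM.generate_object("ollama:llama3.2", messages, schema)
```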
Generate a structured JSON object, raising on error.
@spec generate_text(String.t(), [map()], keyword()) :: {:ok, PtcRunner.LLM.response()} | {:error, term()}
Generate text from an LLM.
Options
- `:receive_timeout` - Request timeout in ms (default: 120000)
- `:ollama_base_url` - Override Ollama server URL
- `:cache` - Enable prompt caching for supported providers (default: false)
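Combining these options in one call (the model name, host, and values are illustrative):

```elixir
{:ok, resp} =
  PtcRunner.LLM.generate_text(
    "ollama:llama3.2",
    [%{role: "user", content: "Summarize the release notes."}],
    # 5-minute timeout for a slow local model
    receive_timeout: 300_000,
    # Point at a non-default Ollama instance
    ollama_base_url: "http://gpu-box.local:11434"
  )
```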
@spec generate_text!(String.t(), [map()], keyword()) :: PtcRunner.LLM.response()
Generate text, raising on error.
@spec generate_with_tools(String.t(), [map()], [map()], keyword()) :: {:ok, map()} | {:error, term()}
Generate text with tool definitions.
Passes tools to the LLM provider. If the LLM returns tool calls,
they are included in the response as `tool_calls`.
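A sketch of the tool-calling flow. The tool-definition shape shown here is an assumption (an OpenAI-style function schema), and `handle_tool_calls/1` is a hypothetical helper:

```elixir
tools = [
  %{
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: %{
      "type" => "object",
      "properties" => %{"city" => %{"type" => "string"}},
      "required" => ["city"]
    }
  }
]

messages = [%{role: "user", content: "What's the weather in Oslo?"}]

case PtcRunner.LLM.generate_with_tools("anthropic:claude-sonnet-4", messages, tools) do
  # The model decided to call one or more tools
  {:ok, %{tool_calls: [_ | _] = calls}} -> handle_tool_calls(calls)
  # Plain text answer, no tool calls
  {:ok, response} -> response
  {:error, reason} -> {:error, reason}
end
```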
Check if the model requires an API key.