Translates between Planck's types and req_llm's call interface.
This is the only module that knows req_llm's input and output shapes.
Everything above this layer works exclusively with Planck.AI structs.
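A typical call site looks like this (a minimal sketch; it assumes ReqLLM.stream_text/3 returns an ok-tuple and elides how model and context are built):

    {model_spec, ctx, opts} =
      Planck.AI.Adapter.to_req_llm(model, context, temperature: 0.2)

    # Hand the translated arguments straight to req_llm.
    {:ok, stream_response} = ReqLLM.stream_text(model_spec, ctx, opts)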
Summary
Functions
to_req_llm(model, context, opts)
Converts a Planck.AI.Model, Planck.AI.Context, and call-site opts into the three arguments expected by req_llm's stream_text/3: {model_spec_string, req_llm_context, opts}.
Functions
to_req_llm(model, context, opts)

@spec to_req_llm(Planck.AI.Model.t(), Planck.AI.Context.t(), keyword()) ::
        {model_spec :: String.t() | map(), req_llm_context :: ReqLLM.Context.t(),
         opts :: keyword()}
Converts a Planck.AI.Model, Planck.AI.Context, and call-site opts into
the three arguments expected by req_llm's stream_text/3:
{model_spec_string, req_llm_context, opts}.
Tools from the context are added to opts as %ReqLLM.Tool{} structs.
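A sketch of that step, assuming tools live under context.tools and that a hypothetical to_req_llm_tool/1 helper does the per-tool struct conversion:

    # Convert each Planck tool and attach the list under the :tools opt.
    req_tools = Enum.map(context.tools, &to_req_llm_tool/1)

    opts =
      case req_tools do
        [] -> opts
        tools -> Keyword.put(opts, :tools, tools)
      end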
Inference params in opts (e.g. temperature:, max_tokens:) are forwarded to req_llm unchanged; req_llm handles per-provider translation. See the second example below.
Examples
iex> model = %Planck.AI.Model{id: "claude-sonnet-4-6", provider: :anthropic, context_window: 200_000, max_tokens: 8_192}
iex> context = %Planck.AI.Context{messages: []}
iex> {model_spec, _ctx, _opts} = Planck.AI.Adapter.to_req_llm(model, context, [])
iex> model_spec
"anthropic:claude-sonnet-4-6"