LlmCore.Tool.Codec (llm_core v0.3.0)


Translates between provider-neutral tool structs and provider-specific wire formats.

Three operations, three providers:

  • encode_definitions/2 — Converts [LlmToolkit.Tool.t()] into the provider's request payload format for tool declarations.
  • decode_tool_calls/2 — Extracts [LlmToolkit.Tool.Call.t()] from a raw provider response body.
  • encode_result/2 — Formats a LlmToolkit.Tool.Result.t() as the provider's expected tool-result message.

Provider Format Differences

| Aspect             | OpenAI                  | Anthropic             | Ollama                  |
|--------------------|-------------------------|-----------------------|-------------------------|
| Definition key     | `parameters`            | `input_schema`        | `parameters`            |
| Definition wrapper | `{type, function: {...}}` | `{name, desc, schema}` | `{type, function: {...}}` |
| Call arguments     | JSON string             | object                | object                  |
| Call ID            | present                 | present               | absent                  |
| Result role        | `"tool"`                | `"user"`              | `"tool"`                |
| Result ID ref      | `tool_call_id`          | `tool_use_id`         | (none)                  |

Summary

Functions

decode_tool_calls(response_body, atom)

Decodes tool call requests from a raw provider response body into provider-neutral LlmToolkit.Tool.Call structs.

encode_definitions(tools, atom)

Encodes a list of provider-neutral tool definitions into the wire format expected by the given provider.

encode_result(result, atom)

Encodes a tool result into the message format the provider expects when feeding results back into the conversation.

Types

provider()

@type provider() :: :openai | :anthropic | :ollama

Functions

decode_tool_calls(response_body, atom)

@spec decode_tool_calls(map(), provider()) :: [LlmToolkit.Tool.Call.t()]

Decodes tool call requests from a raw provider response body into provider-neutral LlmToolkit.Tool.Call structs.

OpenAI

Expects response_body["choices"][0]["message"]["tool_calls"] where each entry has "id", "function" => %{"name" => ..., "arguments" => json_string}.

Anthropic

Expects response_body["content"] to contain blocks with "type" => "tool_use", each having "id", "name", and "input" (object).

Ollama

Expects response_body["message"]["tool_calls"] where each entry has "function" => %{"name" => ..., "arguments" => object}. No IDs.

Returns an empty list when no tool calls are found.
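For example, given an OpenAI-shaped response body (a sketch; the fields of LlmToolkit.Tool.Call are not shown on this page, so `call` is left unpatterned):

response_body = %{
  "choices" => [
    %{
      "message" => %{
        "tool_calls" => [
          %{
            "id" => "call_abc",
            "function" => %{"name" => "ping", "arguments" => ~s({"host": "example.com"})}
          }
        ]
      }
    }
  ]
}

[call] = LlmCore.Tool.Codec.decode_tool_calls(response_body, :openai)

Note that OpenAI delivers "arguments" as a JSON string, while Anthropic and Ollama deliver an object; the codec normalizes both into the same Call struct.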

encode_definitions(tools, atom)

@spec encode_definitions([LlmToolkit.Tool.t()], provider()) :: [map()]

Encodes a list of provider-neutral tool definitions into the wire format expected by the given provider.

Examples

iex> tool = %LlmToolkit.Tool{name: "ping", description: "Ping", parameters: %{"type" => "object"}, metadata: %{}}
iex> [encoded] = LlmCore.Tool.Codec.encode_definitions([tool], :openai)
iex> encoded["type"]
"function"
iex> encoded["function"]["name"]
"ping"
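For :anthropic, the same tool is encoded flat, with the schema under "input_schema" rather than wrapped in a "function" object (a sketch, assuming the output keys follow the wire format in the table above):

iex> tool = %LlmToolkit.Tool{name: "ping", description: "Ping", parameters: %{"type" => "object"}, metadata: %{}}
iex> [encoded] = LlmCore.Tool.Codec.encode_definitions([tool], :anthropic)
iex> encoded["name"]
"ping"
iex> encoded["input_schema"]
%{"type" => "object"}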

encode_result(result, atom)

@spec encode_result(LlmToolkit.Tool.Result.t(), provider()) :: map()

Encodes a tool result into the message format the provider expects when feeding results back into the conversation.

OpenAI

%{"role" => "tool", "tool_call_id" => id, "content" => content}

Anthropic

%{"role" => "user", "content" => [%{"type" => "tool_result", "tool_use_id" => id, "content" => content}]}

Ollama

%{"role" => "tool", "content" => content}
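A minimal usage sketch for the OpenAI shape. The field names :id and :content on LlmToolkit.Tool.Result are hypothetical here, since this page does not document the struct:

# Hypothetical field names — Result's actual fields are not shown on this page.
result = %LlmToolkit.Tool.Result{id: "call_abc", content: "pong"}

LlmCore.Tool.Codec.encode_result(result, :openai)
# produces the OpenAI-shaped map shown above, with "tool_call_id" set to "call_abc"

Because Ollama assigns no call IDs, its result message carries only "role" and "content"; the ID from the Result struct is dropped.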