Main interface for LLM interactions.
The broker manages communication with LLM providers through gateways, handles tool calling with automatic recursion, and provides a clean API for text and structured output generation.
Examples
# Create a broker with Ollama
alias Mojentic.LLM.{Broker, Message}
alias Mojentic.LLM.Gateways.Ollama
broker = Broker.new("qwen3:32b", Ollama)
# Generate a simple response
messages = [Message.user("What is 2+2?")]
{:ok, response} = Broker.generate(broker, messages)
# Generate structured output
schema = %{
type: "object",
properties: %{answer: %{type: "number"}},
required: ["answer"]
}
{:ok, result} = Broker.generate_object(broker, messages, schema)
Tool Support
The broker automatically handles tool calls from the LLM. When the LLM requests a tool call, the broker will:
- Execute the tool with the provided arguments
- Add the tool result to the conversation
- Recursively call the LLM to generate the final response
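The recursion above can be sketched roughly as follows. This is a conceptual illustration only, not the broker's actual internals: `call_llm/2`, the `{:tool_call, ...}`/`{:text, ...}` result shapes, and the stub `EchoTool` are all stand-ins invented for the sketch.

```elixir
defmodule BrokerLoopSketch do
  # Conceptual sketch of the broker's tool-call loop. call_llm/2 and the
  # tuple shapes it returns are assumptions, not Mojentic's documented API.
  def complete(broker, messages, tools) do
    case call_llm(broker, messages) do
      {:tool_call, tool, args} ->
        # Execute the tool, append its result to the conversation,
        # then recurse so the LLM can produce the final response.
        {:ok, result} = tool.run(args)
        tool_msg = %{role: :tool, content: result}
        complete(broker, messages ++ [tool_msg], tools)

      {:text, response} ->
        {:ok, response}
    end
  end

  # Stub LLM: requests a tool call until a tool result is in the conversation.
  defp call_llm(_broker, messages) do
    if Enum.any?(messages, &(&1.role == :tool)) do
      {:text, "final answer using the tool result"}
    else
      {:tool_call, EchoTool, %{"q" => "hello"}}
    end
  end
end

defmodule EchoTool do
  # Stub tool used only by the sketch above.
  def run(args), do: {:ok, "echo: " <> args["q"]}
end
```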
Example with tools:
tools = [MyTool]
{:ok, response} = Broker.generate(broker, messages, tools)
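The examples pass tool modules such as `MyTool`, but the tool contract itself is not shown here. A tool module might look something like the sketch below; the callback names (`descriptor/0`, `run/1`) and descriptor shape are assumptions for illustration, not Mojentic's documented behaviour.

```elixir
defmodule WeatherTool do
  # Hypothetical tool module. The descriptor tells the LLM the tool's
  # name, purpose, and argument schema (JSON-schema style, as used by
  # generate_object/4 elsewhere in this module's docs).
  def descriptor do
    %{
      name: "get_weather",
      description: "Returns the current weather for a city",
      parameters: %{
        type: "object",
        properties: %{city: %{type: "string"}},
        required: ["city"]
      }
    }
  end

  # Executes the call with the arguments the LLM supplied.
  def run(%{"city" => city}) do
    {:ok, "Sunny and 18°C in #{city}"}
  end
end
```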
Summary
Functions
Generates a text response from the LLM.
Generates a structured object response from the LLM.
Generates a streaming text response from the LLM.
Creates a new LLM broker.
Types
@type t() :: %Mojentic.LLM.Broker{
        correlation_id: String.t() | nil,
        gateway: Mojentic.LLM.Gateway.gateway(),
        model: String.t(),
        tracer: pid() | :null_tracer
      }
Functions
Generates a text response from the LLM.
Handles tool calls automatically through recursion. When the LLM requests a tool call, the broker executes the tool and continues the conversation with the result.
Parameters
- broker: Broker instance
- messages: List of conversation messages
- tools: Optional list of tool modules (default: nil)
- config: Optional completion configuration (default: default config)
Returns
- {:ok, response_text} on success
- {:error, reason} on failure
Examples
broker = Broker.new("qwen3:32b", Ollama)
messages = [Message.user("What is the capital of France?")]
{:ok, response} = Broker.generate(broker, messages)
# => {:ok, "The capital of France is Paris."}
# With tools
tools = [WeatherTool]
messages = [Message.user("What's the weather in SF?")]
{:ok, response} = Broker.generate(broker, messages, tools)
Generates a structured object response from the LLM.
Uses a JSON schema to constrain the structure of the response. The LLM returns a JSON object conforming to the provided schema.
Parameters
- broker: Broker instance
- messages: List of conversation messages
- schema: JSON schema for the expected response structure
- config: Optional completion configuration (default: default config)
Returns
- {:ok, parsed_object} - map conforming to the schema on success
- {:error, reason} on failure
Examples
broker = Broker.new("qwen3:32b", Ollama)
schema = %{
type: "object",
properties: %{
sentiment: %{type: "string"},
confidence: %{type: "number"}
},
required: ["sentiment", "confidence"]
}
messages = [Message.user("I love this!")]
{:ok, result} = Broker.generate_object(broker, messages, schema)
# => {:ok, %{"sentiment" => "positive", "confidence" => 0.95}}
Generates a streaming text response from the LLM.
Yields content chunks as they arrive, and handles tool calls automatically through recursion. When tool calls are detected, the broker executes them and recursively streams the LLM's follow-up response.
Parameters
- broker: Broker instance
- messages: List of conversation messages
- tools: Optional list of tool modules (default: nil)
- config: Optional completion configuration (default: default config)
Returns
A stream that yields content strings as they arrive.
Examples
broker = Broker.new("qwen3:32b", Ollama)
messages = [Message.user("Tell me a story")]
broker
|> Broker.generate_stream(messages)
|> Stream.each(&IO.write/1)
|> Stream.run()
# With tools
tools = [DateTool]
messages = [Message.user("What's the date tomorrow?")]
broker
|> Broker.generate_stream(messages, tools)
|> Stream.each(&IO.write/1)
|> Stream.run()
Creates a new LLM broker.
Parameters
- model: Model identifier (e.g., "qwen3:32b", "gpt-4")
- gateway: Gateway module (e.g., Mojentic.LLM.Gateways.Ollama)
- opts: Optional keyword list:
  - :correlation_id - Correlation ID for request tracking (default: auto-generated)
  - :tracer - Tracer system for observability (default: null_tracer)
Examples
iex> Broker.new("qwen3:32b", Mojentic.LLM.Gateways.Ollama)
%Broker{model: "qwen3:32b", gateway: Mojentic.LLM.Gateways.Ollama, ...}
iex> Broker.new("qwen3:32b", Mojentic.LLM.Gateways.Ollama, correlation_id: "custom-id-123")
%Broker{model: "qwen3:32b", gateway: Mojentic.LLM.Gateways.Ollama, correlation_id: "custom-id-123"}
iex> {:ok, tracer} = TracerSystem.start_link()
iex> Broker.new("qwen3:32b", Mojentic.LLM.Gateways.Ollama, tracer: tracer)
%Broker{model: "qwen3:32b", gateway: Mojentic.LLM.Gateways.Ollama, tracer: tracer}