A framework for building AI agents in Elixir.
Condukt treats AI agents as first-class OTP processes that can reason, use tools, and orchestrate complex workflows.
## Defining an Agent

```elixir
defmodule MyApp.ResearchAgent do
  use Condukt

  @impl true
  def tools do
    [
      Condukt.Tools.Read,
      Condukt.Tools.Bash
    ]
  end
end
```

## Running an Agent
```elixir
{:ok, agent} = MyApp.ResearchAgent.start_link(
  api_key: "sk-...",
  system_prompt: """
  You are a research assistant that helps users find information.
  Be thorough and cite your sources.
  """
)

{:ok, response} = Condukt.run(agent, "What's new in Elixir 1.18?")
```

## Streaming Responses
```elixir
Condukt.stream(agent, "Explain OTP")
|> Stream.each(fn
  {:text, chunk} -> IO.write(chunk)
  {:tool_call, name, _id, _args} -> IO.puts("\nCalling: #{name}")
  {:tool_result, _id, result} -> IO.puts("Result: #{inspect(result)}")
  :done -> IO.puts("\nDone!")
  # Catch-all so other events ({:thinking, _}, :turn_start, etc.)
  # don't raise a FunctionClauseError.
  _event -> :ok
end)
|> Stream.run()
```

## Core Concepts
- Session - A GenServer managing conversation state and the agent loop
- Message - User, assistant, or tool result messages in the conversation
- Tool - A capability the agent can invoke (read files, run commands, etc.)
- Sub-agent - A delegated agent session that runs a task in isolation
- Provider - An LLM backend (Anthropic, OpenAI, Ollama, etc.)
- Event - Notifications during agent execution for streaming/UI
## Callbacks
Handles events during execution.
Initializes agent state from options.
```elixir
@callback model() :: String.t()
```

Returns the model identifier.

Uses ReqLLM format: `"provider:model"`, e.g. `"anthropic:claude-sonnet-4-20250514"`.
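As a minimal sketch (the module name is hypothetical), an agent pins its model by implementing this callback:

```elixir
defmodule MyApp.FastAgent do
  use Condukt

  # Pin this agent to a specific provider:model pair (ReqLLM format).
  @impl true
  def model, do: "anthropic:claude-haiku-4-5"
end
```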
Returns the default sandbox spec for this agent.

Accepts a module, `{module, opts}`, an already-built `Condukt.Sandbox` struct, or `nil` (the session will default to `Condukt.Sandbox.Local`). Can be overridden at `start_link/1` via the `:sandbox` option.
Returns the default session secret declarations for this agent.

Secrets are resolved when the session starts, kept out of model context and persisted snapshots, and exposed to built-in command tools as environment variables. Can be overridden at `start_link/1` via the `:secrets` option.
```elixir
@callback subagents() :: term()
```

Returns the sub-agents this agent can delegate work to.

Each entry maps a role atom to an agent module, to `{agent_module, opts}`, or to a keyword list of session options for an anonymous child agent. Registration opts are passed to the child session when the sub-agent runs, except `:input`/`:input_schema` and `:output`/`:output_schema`, which define optional structured contracts for the sub-agent tool boundary.
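A sketch of those registration shapes (the child module names are hypothetical, and the `term()` typespec doesn't pin down the container type; a map of role atoms is assumed here):

```elixir
@impl true
def subagents do
  %{
    # role atom -> agent module
    researcher: MyApp.ResearchAgent,
    # role atom -> {agent_module, opts}
    reviewer: {MyApp.ReviewAgent, system_prompt: "Review strictly."},
    # role atom -> session options for an anonymous child agent
    summarizer: [
      model: "anthropic:claude-haiku-4-5",
      system_prompt: "Summarize concisely."
    ]
  }
end
```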
```elixir
@callback system_prompt() :: String.t() | nil
```

Returns the default system prompt for this agent.

This can be overridden at `start_link/1` via the `:system_prompt` option. If neither is provided, the agent will have no system prompt.
```elixir
@callback thinking_level() :: :off | :minimal | :low | :medium | :high
```

Returns the default thinking level.
Returns the list of tools this agent can use.
## Functions
Aborts current operation.
Clears conversation history.
Runs the configured compactor against the conversation history.
See `Condukt.Compactor` for details and built-in strategies.
Queues a follow-up message.
This message will be delivered when the agent finishes its current work.
Returns the conversation history.
Runs a prompt and returns the final response.
Two call shapes are supported:
### Against a running agent

Pass an agent pid (or registered name) and a prompt. Forwards to the underlying `Condukt.Session.run/3`.
```elixir
{:ok, response} = Condukt.run(agent, "Hello!")
{:ok, response} = Condukt.run(agent, "Hello!", timeout: 60_000)
```

Per-run options:

- `:timeout` - Max time in ms (default: 300_000)
- `:max_turns` - Max tool use cycles (default: 50)
- `:images` - List of images to include
### Anonymous run (no agent module)
Pass a prompt as the first argument. A transient session is built from the options, the prompt is run, and the session is torn down. This is the scripting entry point: a single function call defines model, system prompt, tools, and (optionally) typed input/output.
```elixir
{:ok, text} =
  Condukt.run("Summarize the README.",
    model: "anthropic:claude-haiku-4-5",
    tools: [Condukt.Tools.Read]
  )

# Inline tools
ls =
  Condukt.tool(
    name: "ls",
    description: "List a directory.",
    parameters: %{
      type: "object",
      properties: %{path: %{type: "string"}},
      required: ["path"]
    },
    call: fn %{"path" => p}, ctx -> Condukt.Sandbox.glob(ctx.sandbox, p <> "/*") end
  )

{:ok, text} = Condukt.run("List lib/", tools: [ls])
```

### Structured I/O
Pass `:output` (a JSON Schema map) to switch into structured mode. The runtime appends a synthetic `submit_result` tool whose schema matches the output schema, runs the loop until the model calls it, validates the submitted value with JSV, and returns `{:ok, validated_map}`. Top-level keys are atomized when the schema's property keys are atoms.
```elixir
{:ok, %{verdict: "approve", summary: _}} =
  Condukt.run("Decide a verdict for this PR and summarize it.",
    input: %{repo: "tuist/condukt", pr_number: 42},
    input_schema: %{
      type: "object",
      properties: %{
        repo: %{type: "string"},
        pr_number: %{type: "integer"}
      },
      required: ["repo", "pr_number"]
    },
    output: %{
      type: "object",
      properties: %{
        verdict: %{type: "string", enum: ["approve", "request_changes", "comment"]},
        summary: %{type: "string"}
      },
      required: ["verdict", "summary"]
    },
    tools: [Condukt.Tools.Read]
  )
```

When `:input` is present, the prompt is treated as instructions and attached to the system prompt; the args are encoded as the user message. When `:input` is absent, the prompt is the user message as-is.
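For contrast, a sketch of structured output without `:input`, where the prompt is sent as the user message as-is (the schema and prompt are illustrative):

```elixir
{:ok, %{language: _lang}} =
  Condukt.run("Which language is this project written in? Check mix.exs.",
    # :output alone switches into structured mode; atom property keys
    # mean the returned map's top-level keys are atomized.
    output: %{
      type: "object",
      properties: %{language: %{type: "string"}},
      required: ["language"]
    },
    tools: [Condukt.Tools.Read]
  )
```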
Failure reasons:

- `{:invalid_input, %JSV.ValidationError{}}` - args did not match `:input_schema`
- `{:invalid_output, %JSV.ValidationError{}}` - submitted value failed validation
- `:no_result_submitted` - structured mode finished without a `submit_result` call
Anonymous runs accept all the per-run options above (`:timeout`, `:max_turns`, `:images`) plus the session options accepted by an agent's `start_link/1` (`:model`, `:system_prompt`, `:api_key`, `:base_url`, `:thinking_level`, `:tools`, `:sandbox`, `:cwd`, `:session_store`, `:subagents`, `:compactor`, `:redactor`, `:load_project_instructions`).

`:load_project_instructions` defaults to `false` for anonymous runs.
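Putting several of those session options together, an anonymous run might look like this (the prompt, path, and limits are illustrative):

```elixir
{:ok, text} =
  Condukt.run("Audit the scripts in bin/.",
    model: "anthropic:claude-sonnet-4-20250514",
    system_prompt: "You are a careful shell-script reviewer.",
    thinking_level: :low,
    tools: [Condukt.Tools.Read, Condukt.Tools.Bash],
    cwd: "/home/me/project",
    # per-run options mix freely with session options
    timeout: 120_000,
    max_turns: 20
  )
```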
Injects a message mid-execution (steering).
This message will be delivered after the current tool completes, and remaining tool calls will be skipped.
Streams a prompt, yielding events as they occur.
### Events

- `{:text, chunk}` - Text chunk from LLM
- `{:thinking, chunk}` - Thinking/reasoning chunk
- `{:tool_call, name, id, args}` - Tool being called
- `{:tool_result, id, result}` - Tool result
- `{:error, reason}` - Error occurred
- `:agent_start` - Agent started processing
- `:agent_end` - Agent finished
- `:turn_start` - New LLM turn starting
- `:turn_end` - Turn completed
- `:done` - Stream complete
Builds an inline tool spec usable in any place a tool module is accepted.

Returns a struct that `Condukt.Session` recognizes alongside module-based tools, so an inline tool can be added to an agent's `tools/0` callback or passed in `Condukt.run/2`'s `:tools` option.
### Required keys

- `:name` - tool name as the LLM will see it
- `:description` - human-readable description
- `:parameters` - JSON Schema map describing the arguments
- `:call` - 2-arity function `(args, context)` returning `{:ok, result}` or `{:error, reason}`
The context map passed to `:call` matches the context received by `Condukt.Tool` callbacks: `:agent`, `:sandbox`, `:cwd`, `:opts` (always `[]` for inline tools).
### Example

```elixir
weather =
  Condukt.tool(
    name: "get_weather",
    description: "Returns the current temperature for a city.",
    parameters: %{
      type: "object",
      properties: %{city: %{type: "string"}},
      required: ["city"]
    },
    call: fn %{"city" => city}, _ctx ->
      {:ok, "72F in #{city}"}
    end
  )

{:ok, _} = Condukt.run("What's the weather in Berlin?", tools: [weather])
```