# Puck

An AI framework for Elixir.
## Quick Start
```elixir
# Create a client (requires :req_llm dep)
client =
  Puck.Client.new({Puck.Backends.ReqLLM, "anthropic:claude-sonnet-4-5"},
    system_prompt: "You are a helpful assistant."
  )

# Simple call
{:ok, response, _ctx} = Puck.call(client, "Hello!")

# Multi-turn conversation
context = Puck.Context.new()
{:ok, response, context} = Puck.call(client, "Hello!", context)
{:ok, response, context} = Puck.call(client, "Follow-up question", context)

# Stream responses
{:ok, stream, _ctx} = Puck.stream(client, "Tell me a story")
Enum.each(stream, fn chunk -> IO.write(chunk.content) end)
```

## Core Concepts
- `Puck.Client` - Configuration struct for an LLM client (backend, system prompt, hooks)
- `Puck.Context` - Conversation history and metadata
- `Puck.Content` - Multi-modal content (text, images, files, audio, video)
- `Puck.Message` - Individual message in a conversation
- `Puck.Backend` - Behaviour for LLM backend implementations
- `Puck.Hooks` - Behaviour for lifecycle event hooks
- `Puck.Response` - Normalized response struct with content, finish_reason, usage
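To show how the response fields above are typically read, here is a small sketch. A plain map stands in for the real `%Puck.Response{}` struct so the snippet runs without the library; the field names come from the list above, while the values are invented for illustration.

```elixir
# Illustrative stand-in for %Puck.Response{}: a plain map with the
# documented fields (content, finish_reason, usage). Values are made up.
response = %{
  content: "Hola!",
  finish_reason: :stop,
  usage: %{input_tokens: 12, output_tokens: 4}
}

# Typical post-call inspection of a response
IO.puts(response.content)
IO.inspect(response.finish_reason)
IO.inspect(response.usage.output_tokens)
```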
## Optional Packages
- `:req_llm` - Multi-provider LLM backend (enables `Puck.Backends.ReqLLM`)
- `:solid` - Prompt templates with Liquid syntax (enables `Puck.Prompt.Solid`)
- `:telemetry` - Telemetry integration for observability
- `:zoi` - Schema validation for structured outputs
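In a Mix project, these optional dependencies would be declared in `mix.exs`. A hedged sketch follows; the version requirements are placeholders, not Puck's documented minimums, so check hex.pm for current releases before copying.

```elixir
# In mix.exs - version requirements are placeholders for illustration.
defp deps do
  [
    {:puck, "~> 0.1"},
    # Optional integrations from the list above:
    {:req_llm, "~> 1.0"},
    {:solid, "~> 0.15"},
    {:telemetry, "~> 1.2"},
    {:zoi, "~> 0.1"}
  ]
end
```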
## Functions
### `Puck.call`

Calls an LLM and returns the response.
#### Returns
- `{:ok, response, context}` on success
- `{:error, reason}` on failure
#### Examples
```elixir
# Simple call
client = Puck.Client.new({Puck.Backends.ReqLLM, "anthropic:claude-sonnet-4-5"})
{:ok, response, _ctx} = Puck.call(client, "Hello!")

# With system prompt
client =
  Puck.Client.new({Puck.Backends.ReqLLM, "anthropic:claude-sonnet-4-5"},
    system_prompt: "You are a translator."
  )
{:ok, response, _ctx} = Puck.call(client, "Translate to Spanish")

# Multi-turn conversation
context = Puck.Context.new()
{:ok, response, context} = Puck.call(client, "Hello!", context)
{:ok, response, context} = Puck.call(client, "Follow-up question", context)
```
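Since `Puck.call` returns either `{:ok, response, context}` or `{:error, reason}`, callers typically branch on the result. A minimal sketch: `handle_result` below is our own illustrative helper, not part of Puck's API, and it is exercised with literal tuples so the snippet runs without a live LLM call.

```elixir
# Branch on the two documented return shapes of Puck.call.
# handle_result is illustrative only; in real code you would pass it
# the value returned by Puck.call(client, input, context).
handle_result = fn
  {:ok, response, _context} -> {:reply, response}
  {:error, reason} -> {:failed, reason}
end

IO.inspect(handle_result.({:ok, "Hola", nil}))
IO.inspect(handle_result.({:error, :timeout}))
```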
### `Puck.stream`

Streams an LLM response.
#### Returns
- `{:ok, stream, context}` where `stream` is an `Enumerable` of chunks
- `{:error, reason}` on failure
#### Examples
```elixir
client = Puck.Client.new({Puck.Backends.ReqLLM, "anthropic:claude-sonnet-4-5"})
{:ok, stream, _ctx} = Puck.stream(client, "Tell me a story")
Enum.each(stream, fn chunk -> IO.write(chunk.content) end)
```
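Because the stream is an `Enumerable`, chunks can also be accumulated instead of printed. In the sketch below a plain list of maps stands in for the real stream; only the `chunk.content` field is taken from the docs above.

```elixir
# A fake chunk stream: a list stands in for the Enumerable returned by
# Puck.stream. Each element exposes .content, as in the example above.
chunks = [%{content: "Once"}, %{content: " upon"}, %{content: " a time"}]

# Accumulate the full text instead of writing chunks as they arrive
full_text =
  chunks
  |> Enum.map(& &1.content)
  |> Enum.join()

IO.puts(full_text)
```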