Generates the resources and views for a conversational UI backed by `ash_postgres` and `ash_oban`.

Generates a complete chat feature for your Ash & Phoenix application, including Ash resources for
conversations and messages, Oban background jobs for LLM responses, PubSub-based real-time streaming,
and optional Phoenix LiveView/LiveComponent UIs. Uses `AshAi.ToolLoop.stream/2` with ReqLLM for
incremental streaming responses.
This is primarily a tool to get started with chat features and is by no means intended to handle every case you can come up with. The generated code is yours to customize.
## Quick Start
From scratch with a new Phoenix app:
```sh
mix igniter.new my_app \
  --with phx.new \
  --install ash,ash_postgres,ash_phoenix \
  --install ash_authentication_phoenix,ash_oban \
  --install ash_ai@github:ash-project/ash_ai \
  --auth-strategy password
```
Then generate the chat feature:
```sh
mix ash_ai.gen.chat --live
```
Set your LLM API key (OpenAI by default):
```sh
export OPENAI_API_KEY=sk-...
```
Start the server and visit http://localhost:4000/chat.
## Examples
Resources only, no UI:
```sh
mix ash_ai.gen.chat --user MyApp.Accounts.User
# Creates: MyApp.Chat domain with Conversation and Message resources
```
Full-page LiveView with a named domain and Anthropic provider:
```sh
mix ash_ai.gen.chat --user MyApp.Accounts.User --live --provider anthropic --domain MyApp.SupportChat
# Creates: MyApp.SupportChat resources and SupportChatLive mounted at /chat
```
Embeddable LiveComponent with a custom domain and Gemini provider:
```sh
mix ash_ai.gen.chat --user MyApp.Accounts.User --live-component --domain MyApp.SupportChat --provider gemini
# Creates: MyApp.SupportChat resources and SupportChatComponent
```
Both LiveView and LiveComponent with all options:
```sh
mix ash_ai.gen.chat --user MyApp.Accounts.User --live --live-component --domain MyApp.SupportChat --route /support/chat --provider anthropic
# Creates: MyApp.SupportChat resources, SupportChatLive at /support/chat, and SupportChatComponent
```
## Options
- `--user` - The user resource module. If omitted, looks for `YourApp.Accounts.User` automatically. If no user resource is found, the generator still works but produces resources without user associations or actor-based filtering.
- `--domain` - The domain module to place the resources in. E.g., `--domain MyApp.SupportChat` generates `MyApp.SupportChat.Conversation` and `MyApp.SupportChat.Message`. Defaults to `YourApp.Chat`.
- `--route` - The URL path for the chat routes. Defaults to `/chat`. Mounts both `route` and `route/:conversation_id`.
- `--provider` - The LLM provider to use: `openai` (default), `anthropic`, or `gemini`. Sets the default model and configures the appropriate API key in `config/runtime.exs`.
- `--extend` - Extensions to apply to the generated resources, passed through to `mix ash.gen.resource`.
- `--live` - Generate a full-page Phoenix LiveView for the chat UI.
- `--live-component` - Generate a reusable Phoenix LiveComponent for embedding the chat UI in existing pages.
- `--yes` - Skip confirmation prompts.
## What Gets Generated

### Dependencies
The generator ensures the following dependencies are installed and configured:
- `ash_phoenix` - for forms and code interfaces
- `ash_oban` - for background job processing
- `mdex` - for Markdown rendering in the UI
### Domain Module (`YourApp.Chat`)
A domain with AshPhoenix and AshAi extensions, providing code interfaces:
- `create_conversation/1` - create a new conversation
- `get_conversation/1` - fetch a conversation by ID
- `my_conversations/0` (or `list_conversations/0` without a user) - list conversations for the current actor
- `create_message/1` - send a message (triggers LLM response via Oban)
- `message_history/1` - fetch messages for a conversation, sorted by `inserted_at` desc
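As an illustration, the generated interfaces can be called like this (a hedged sketch: exact function names and option handling depend on your generated domain, and `current_user` is an assumed variable):

```elixir
# Hypothetical calls against a generated MyApp.Chat domain.
{:ok, conversation} =
  MyApp.Chat.create_conversation(%{title: "Support request"}, actor: current_user)

{:ok, _conversations} = MyApp.Chat.my_conversations(actor: current_user)

# Creating a user message also enqueues the :respond Oban job for the LLM reply.
{:ok, _message} =
  MyApp.Chat.create_message(
    %{text: "Hello!", conversation_id: conversation.id},
    actor: current_user
  )
```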
### Conversation Resource
- Attributes: `id` (UUID v7), `title` (string), `inserted_at`, `updated_at`
- Relationships: `has_many :messages`, `belongs_to :user` (when user resource is provided)
- Actions:
  - `:create` - accepts `title`, relates actor as user
  - `:read` - default read
  - `:destroy` - default destroy
  - `:my_conversations` - filtered to `user_id == actor(:id)` (when user resource is provided)
  - `:generate_name` - uses the LLM to generate a 2-8 word title from the first 10 messages
- Calculations: `needs_title` - true when the title is nil and the conversation has more than 3 messages (or more than 1 message and is older than 10 minutes)
- Extensions: `postgres`, `AshOban`
### Message Resource
- Attributes:
  - `id` (UUID v7, writable)
  - `text` (string, required, allows empty, no trimming)
  - `tool_calls` (array of maps) - structured tool invocation data
  - `tool_results` (array of maps) - tool execution results
  - `source` (enum: `:user` | `:agent`, default `:user`)
  - `complete` (boolean, default `true`) - false while streaming
  - `inserted_at`, `updated_at`
- Relationships: `belongs_to :conversation` (required), `belongs_to :response_to` (self-referential), `has_one :response`
- Actions:
  - `:create` - accepts `text`, validates non-empty, optionally takes `conversation_id` (creates a new conversation if omitted), triggers the `:respond` Oban job
  - `:read` - default read
  - `:destroy` - default destroy
  - `:for_conversation` - keyset-paginated read filtered by `conversation_id`, sorted by `inserted_at` desc
  - `:respond` - update action that runs the `Respond` change (streams the LLM response via `AshAi.ToolLoop.stream/2`)
  - `:upsert_response` - creates or atomically updates the agent's response message, appending streamed text chunks and tool call/result data
- Calculations: `needs_response` - true when `source == :user` and no response message exists
- Extensions: `postgres`, `AshOban`
### Respond Change

The generated `Respond` change module:
- Loads the full message history for the conversation
- Builds a prompt chain with a system message ("You are a helpful chat bot...") followed by the message history
- Calls `AshAi.ToolLoop.stream/2` with `tools: true` (all AshAi domain tools available)
- Streams content chunks, upserting the response message as tokens arrive (enabling real-time UI updates via PubSub)
- Accumulates tool calls and tool results during the stream
- Finalizes the response message with the complete text, tool calls, and tool results
- Handles stream errors gracefully with user-facing error messages
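Conceptually, the streaming loop looks something like the following. This is an illustrative sketch only: the chunk shapes emitted by `AshAi.ToolLoop.stream/2` and the `upsert_response/3` helper are assumptions, not the generated code verbatim.

```elixir
# Accumulate streamed chunks, upserting the response message as tokens arrive.
AshAi.ToolLoop.stream(prompt_chain, tools: true)
|> Enum.reduce(%{text: "", tool_calls: [], tool_results: []}, fn
  {:content, chunk}, acc ->
    acc = %{acc | text: acc.text <> chunk}
    # Partial upserts with complete: false broadcast progress over PubSub.
    upsert_response(conversation, acc, complete: false)
    acc

  {:tool_call, call}, acc ->
    %{acc | tool_calls: acc.tool_calls ++ [call]}

  {:tool_result, result}, acc ->
    %{acc | tool_results: acc.tool_results ++ [result]}
end)
# The final upsert marks the message complete.
|> then(&upsert_response(conversation, &1, complete: true))
```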
### GenerateName Change

Automatically generates a conversation title by sending the first 10 messages to the LLM with
a system prompt requesting a 2-8 word name. Triggered by the `:name_conversation` Oban trigger
when `needs_title` is true.
### Oban Triggers

- `:respond` - runs on the Message resource when `needs_response` is true. Queue: `chat_responses` (limit 10).
- `:name_conversation` - runs on the Conversation resource when `needs_title` is true. Queue: `conversations` (limit 10).

Both triggers use `scheduler_cron false` (event-driven, not polled) and `lock_for_update? false`.
When a user resource is provided, an `AiAgentActorPersister` module is generated to serialize/deserialize
the actor for Oban jobs. The persisted user gets `chat_agent?: true` metadata so you can differentiate
agent-initiated actions in policies.
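The persister is roughly shaped like this (a hedged sketch based on the `AshOban.ActorPersister` behaviour; the module name, stored keys, and metadata handling in your generated code may differ):

```elixir
defmodule MyApp.Chat.AiAgentActorPersister do
  use AshOban.ActorPersister

  # Serialize the actor into plain job args.
  def store(%MyApp.Accounts.User{id: id}), do: %{"type" => "user", "id" => id}

  # Rehydrate the actor when the job runs, tagging it as the chat agent
  # so policies can treat agent-driven actions differently.
  def lookup(%{"type" => "user", "id" => id}) do
    with {:ok, user} <- Ash.get(MyApp.Accounts.User, id, authorize?: false) do
      {:ok, Ash.Resource.put_metadata(user, :chat_agent?, true)}
    end
  end

  def lookup(nil), do: {:ok, nil}
end
```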
## Configuration
The generator adds to your app config:
- `config/runtime.exs` - ReqLLM API key for the selected provider
- `config/config.exs` - Oban queue configuration (`chat_responses` and `conversations`, limit 10 each)
### Provider Models
The default model for each provider:
- `openai` → `openai:gpt-4o`
- `anthropic` → `anthropic:claude-sonnet-4-5`
- `gemini` → `google:gemini-1.5-pro`
Change the model string in the generated `Respond` and `GenerateName` change modules to use a different model.
Model strings follow the `"provider:model-name"` format from ReqLLM.
## LiveView (`--live`)
Generates a full-page Phoenix LiveView with:
- Conversation sidebar - lists conversations, "New Chat" button, highlights the active conversation
- Message stream - displays messages in a chat bubble layout with avatar icons, auto-scrolls to latest
- Message input - text input with send button, auto-focuses on mount
- Real-time streaming - subscribes to PubSub topics for the active conversation, updates messages as they stream in
- Agent responding indicator - shows a loading animation while the LLM is generating
- Markdown rendering - agent messages are rendered as HTML via MDEx with GitHub-flavored extensions (strikethrough, tables, autolinks, task lists, footnotes, code highlighting)
- Tool call/result badges - displays tool invocations and results inline with messages
- Responsive drawer - sidebar collapses on mobile with a hamburger toggle
Routes are added to your router inside the `ash_authentication_live_session` block:

```elixir
live "/chat", ChatLive
live "/chat/:conversation_id", ChatLive
```

### PubSub Topics
The LiveView subscribes to:
- `chat:messages:<conversation_id>` - new and updated messages for the active conversation
- `chat:conversations:<user_id>` - conversation creates/updates (for sidebar)
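If you need these broadcasts outside the generated LiveView, you can subscribe directly with `Phoenix.PubSub` (assuming your app's PubSub server is named `MyApp.PubSub`):

```elixir
# Subscribe to the generated topics for a given conversation and user.
Phoenix.PubSub.subscribe(MyApp.PubSub, "chat:messages:#{conversation_id}")
Phoenix.PubSub.subscribe(MyApp.PubSub, "chat:conversations:#{current_user.id}")
```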
### Prerequisites
The chat UI templates use Tailwind CSS and DaisyUI for styling. DaisyUI is included in Phoenix 1.8+. For older Phoenix apps, install DaisyUI first.
## LiveComponent (`--live-component`)
Generates a reusable Phoenix.LiveComponent with the same features as the LiveView, but embeddable
in existing pages. After generation, you'll see a notice with integration instructions.
Usage in your parent LiveView:

```heex
<.live_component
  module={YourAppWeb.ChatComponent}
  id="chat"
  current_user={@current_user}
  conversation_id={@conversation_id}
  hide_sidebar={false}
/>
```

Your parent LiveView must:
- Subscribe to PubSub and forward broadcasts:

  ```elixir
  def mount(_params, _session, socket) do
    if connected?(socket) do
      YourAppWeb.ChatComponent.subscribe(socket.assigns.current_user)
    end

    {:ok, socket}
  end

  def handle_info(%Phoenix.Socket.Broadcast{} = broadcast, socket) do
    send_update(YourAppWeb.ChatComponent, id: "chat", broadcast: broadcast)
    {:noreply, socket}
  end
  ```

- Handle navigation events from the component:
  ```elixir
  def handle_info({:chat_component_navigate, conversation_id}, socket) do
    {:noreply, assign(socket, :conversation_id, conversation_id)}
  end
  ```

## Tool Call/Result UI Extraction
Generated chat UI modules delegate tool call and tool result parsing to `AshAi.ChatUI.Tools.extract/1`.
This keeps generated modules small while preserving a stable override seam.

Override in generated modules if you need custom parsing:

```elixir
@chat_ui_tools MyApp.ChatUITools
```

Your custom module must implement `extract/1` returning `{:ok, %{tool_calls: [...], tool_results: [...]}}` or `{:error, reason}`.
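A custom module satisfying that contract might look like this (a hypothetical sketch; the exact shape of the message passed to `extract/1` depends on the generated UI code):

```elixir
defmodule MyApp.ChatUITools do
  # Return only the fields the UI badges need.
  def extract(%{tool_calls: calls, tool_results: results}) do
    {:ok,
     %{
       tool_calls: Enum.map(calls || [], &Map.take(&1, ["name", "arguments"])),
       tool_results: results || []
     }}
  end

  def extract(_other), do: {:error, :unrecognized_message}
end
```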
## Starter Tools
Generated chat domains include a small default tool set so tool calling works immediately:
- `:chat_list_conversations` - lists conversations visible to the actor.
- `:chat_message_history` - fetches messages for a specific conversation.
These are registered in the domain's `tools` block. The generated `Respond` change uses `tools: true`
to make all tools from all AshAi-enabled domains available to the LLM. To restrict which tools are
available, change `tools: true` to `tools: [:tool_name_1, :tool_name_2]` in the generated `Respond` module.
## Adding Your Own Tools
Expose Ash actions as tools in any domain:
```elixir
defmodule MyApp.Blog do
  use Ash.Domain, extensions: [AshAi]

  tools do
    tool :read_posts, MyApp.Blog.Post, :read
    tool :create_post, MyApp.Blog.Post, :create
  end
end
```

These tools become automatically available to the chat LLM when `tools: true` is set.
## Customizing the System Prompt

The generated `Respond` change module contains a default system prompt:

> You are a helpful chat bot. Your job is to use the tools at your disposal to assist the user.
Edit this directly in the generated change module to customize the LLM's behavior.