# AshAi.Actions.Prompt (ash_ai v0.2.12)

A generic action impl that returns structured outputs from an LLM matching the action return.

Typically used via `prompt/2`, for example:
```elixir
action :analyze_sentiment, :atom do
  constraints one_of: [:positive, :negative]

  description """
  Analyzes the sentiment of a given piece of text to determine if it is overall positive or negative.

  Does not consider swear words as inherently negative.
  """

  argument :text, :string do
    allow_nil? false
    description "The text for analysis."
  end

  run prompt(
    LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o"}),
    # setting `tools: true` allows it to use all exposed tools in your app
    tools: true
    # alternatively you can restrict it to only a set of tools
    # tools: [:list, :of, :tool, :names]
    # provide an optional prompt, which is an EEx template
    # prompt: "Analyze the sentiment of the following text: <%= @input.arguments.text %>"
  )
end
```
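
Such an action is invoked like any other generic action. A minimal sketch, assuming the action above is defined on a hypothetical `MyApp.Post` resource:

```elixir
# `MyApp.Post` is a hypothetical resource; any resource defining the action works the same way.
MyApp.Post
|> Ash.ActionInput.for_action(:analyze_sentiment, %{text: "I love this library!"})
|> Ash.run_action!()
#=> :positive (as returned by the model)
```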

The first argument to `prompt/2` is the LangChain model. It can also be a 2-arity function, which will be invoked with the input and the context; this is useful for dynamically selecting the model.

## Dynamic Configuration (using a 2-arity function)

For runtime configuration (like using environment variables), pass a function as the first argument to `prompt/2`:
```elixir
run prompt(
  fn _input, _context ->
    LangChain.ChatModels.ChatOpenAI.new!(%{
      model: "gpt-4o",
      # this can also be configured in application config, see langchain docs for more.
      api_key: System.get_env("OPENAI_API_KEY"),
      endpoint: System.get_env("OPENAI_ENDPOINT")
    })
  end,
  tools: false
)
```
This function will be executed just before the prompt is sent to the LLM.
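
Because the function receives the action input and context, it can also be used to select the model per call. A minimal sketch, assuming a hypothetical `:complex?` argument on the action:

```elixir
run prompt(
  fn input, _context ->
    # `:complex?` is a hypothetical argument, used here only to illustrate model selection
    model = if input.arguments[:complex?], do: "gpt-4o", else: "gpt-4o-mini"
    LangChain.ChatModels.ChatOpenAI.new!(%{model: model})
  end,
  tools: false
)
```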

## Options

- `:tools`: A list of tool names to expose to the agent call.
- `:verbose?`: Set to `true` for more output to be logged.
- `:prompt`: A custom prompt. Supports multiple formats - see the Prompt section below.
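
Putting these together, a minimal sketch (the tool names are hypothetical, and a `:text` argument is assumed as in the example above):

```elixir
run prompt(
  LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o"}),
  # expose only these named tools to the agent call
  tools: [:list_tickets, :create_ticket],
  # log more detail about the LLM interaction
  verbose?: true,
  # override the generated prompt with an EEx template
  prompt: "Summarize the request: <%= @input.arguments.text %>"
)
```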

## Prompt

By default, the prompt is generated using the action and input descriptions. You can provide your own prompt via the `prompt` option, which supports multiple formats based on the type of data provided:

### Supported Formats

- **String (EEx template)**: `"Analyze this: <%= @input.arguments.text %>"`
- **{System, User} tuple**: `{"You are an expert", "Analyze the sentiment"}`
- **Function**: `fn input, context -> {"Dynamic system", "Dynamic user"} end`
- **List of LangChain Messages**: `[Message.new_system!("..."), Message.new_user!("...")]`
- **Function returning Messages**: `fn input, context -> [Message.new_system!("...")] end`

### Examples
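
The examples below assume the relevant LangChain modules have been aliased:

```elixir
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message
alias LangChain.Message.ContentPart
alias LangChain.PromptTemplate
```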

#### Basic String Template

```elixir
run prompt(
  ChatOpenAI.new!(%{model: "gpt-4o"}),
  prompt: "Analyze the sentiment of: <%= @input.arguments.text %>"
)
```

#### System/User Tuple

```elixir
run prompt(
  ChatOpenAI.new!(%{model: "gpt-4o"}),
  prompt: {"You are a sentiment analyzer", "Analyze: <%= @input.arguments.text %>"}
)
```

#### LangChain Messages for Multi-turn Conversations

```elixir
run prompt(
  ChatOpenAI.new!(%{model: "gpt-4o"}),
  prompt: [
    Message.new_system!("You are an expert assistant"),
    Message.new_user!("Hello, how can you help me?"),
    Message.new_assistant!("I can help with various tasks"),
    Message.new_user!("Great! Please analyze this data")
  ]
)
```

#### Image Analysis with Templates

```elixir
run prompt(
  ChatOpenAI.new!(%{model: "gpt-4o"}),
  prompt: [
    Message.new_system!("You are an expert at image analysis"),
    Message.new_user!([
      PromptTemplate.from_template!("Extra context: <%= @input.arguments.context %>"),
      ContentPart.image!("<%= @input.arguments.image_data %>", media: :jpg, detail: "low")
    ])
  ]
)
```

#### Dynamic Messages via Function

```elixir
run prompt(
  ChatOpenAI.new!(%{model: "gpt-4o"}),
  prompt: fn input, _context ->
    base = [Message.new_system!("You are helpful")]

    history =
      input.arguments.conversation_history
      |> Enum.map(fn %{"role" => role, "content" => content} ->
        case role do
          "user" -> Message.new_user!(content)
          "assistant" -> Message.new_assistant!(content)
        end
      end)

    base ++ history
  end
)
```

### Template Processing

- **String prompts**: Processed as EEx templates with `@input` and `@context`
- **Messages with PromptTemplate**: Processed using LangChain's `apply_prompt_templates`
- **Functions**: Can return any supported format for dynamic generation
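
Since a function can return any of the supported formats, both the input and the context are available when building the prompt dynamically. A minimal sketch returning a `{system, user}` tuple, assuming a `:text` argument as in the earlier examples:

```elixir
run prompt(
  ChatOpenAI.new!(%{model: "gpt-4o"}),
  prompt: fn input, _context ->
    # build the {system, user} tuple directly from the action input
    system = "You are responsible for the `#{input.action.name}` action."
    user = "Analyze: #{input.arguments.text}"
    {system, user}
  end
)
```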
The default prompt template is:
{"You are responsible for performing the `<%= @input.action.name %>` action.\n\n<%= if @input.action.description do %>\n# Description\n<%= @input.action.description %>\n<% end %>\n\n## Inputs\n<%= for argument <- @input.action.arguments do %>\n- <%= argument.name %><%= if argument.description do %>: <%= argument.description %>\n<% end %>\n<% end %>\n",
"# Action Inputs\n\n<%= for argument <- @input.action.arguments,\n {:ok, value} = Ash.ActionInput.fetch_argument(@input, argument.name),\n {:ok, value} = Ash.Type.dump_to_embedded(argument.type, value, argument.constraints) do %>\n - <%= argument.name %>: <%= Jason.encode!(value) %>\n<% end %>\n"}

## Functions

- `run/3`: Callback implementation for `Ash.Resource.Actions.Implementation.run/3`.