Builtin generic action implementations.
## ReqLLM-based Prompt Actions
The `prompt/2` macro accepts ReqLLM-compatible model specifications and uses
ReqLLM for structured output generation.
## Examples
```elixir
action :analyze_sentiment, Sentiment do
  argument :text, :string, allow_nil?: false

  run prompt("openai:gpt-4o",
    prompt: [
      %{role: "system", content: "You analyze sentiment."},
      %{role: "user", content: "Analyze: <%= @input.arguments.text %>"}
    ]
  )
end
```

## Prompt Formats
The `:prompt` option supports multiple formats:
- String (EEx template): `"Analyze this: <%= @input.arguments.text %>"`
- `{system, user}` tuple: `{"You are an expert", "Analyze: <%= @input.arguments.text %>"}`
- `ReqLLM.Context`: pass a context directly (canonical format)
- List of messages: maps, `ReqLLM.Message` structs, or mixed
- Function returning any of the above: `fn input, context -> ... end`
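As a rough sketch of the two simplest formats, a bare EEx string and a `{system, user}` tuple might be used like this (the `:summarize` and `:review` action names and their `Summary`/`Review` result types are illustrative, not part of the library):

```elixir
# Bare EEx string: rendered against the action's input before sending.
# (:summarize and Summary are hypothetical names for illustration.)
action :summarize, Summary do
  argument :text, :string, allow_nil?: false

  run prompt("openai:gpt-4o",
    prompt: "Summarize this: <%= @input.arguments.text %>"
  )
end

# {system, user} tuple: expands to a system message plus a user message.
action :review, Review do
  argument :code, :string, allow_nil?: false

  run prompt("openai:gpt-4o",
    prompt: {"You are an expert reviewer", "Review: <%= @input.arguments.code %>"}
  )
end
```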
## Using ReqLLM.Context (Recommended)
```elixir
import ReqLLM.Context

run prompt("openai:gpt-4o",
  prompt: fn input, _ctx ->
    ReqLLM.Context.new([
      system("You are an OCR expert"),
      user([
        ReqLLM.Message.ContentPart.text("Extract text"),
        ReqLLM.Message.ContentPart.image_url(input.arguments.image_url)
      ])
    ])
  end
)
```