Jido.AI.Prompt (Jido AI v0.5.2)
A module that provides struct-based prompt generation.
The struct-based approach provides full visibility into the prompt's content when inspecting the agent state, making it easier to debug and understand the conversation flow.
Examples
# Create a simple prompt with a single message
prompt = Jido.AI.Prompt.new(%{
  messages: [
    %{role: :user, content: "Hello"}
  ]
})

# Create a prompt with EEx templates
prompt = Jido.AI.Prompt.new(%{
  messages: [
    %{role: :system, content: "You are a <%= @assistant_type %>", engine: :eex},
    %{role: :user, content: "Hello <%= @name %>", engine: :eex}
  ],
  params: %{
    assistant_type: "helpful assistant",
    name: "Alice"
  }
})

# Create a prompt with Liquid templates
prompt = Jido.AI.Prompt.new(%{
  messages: [
    %{role: :system, content: "You are a {{ assistant_type }}", engine: :liquid},
    %{role: :user, content: "Hello {{ name }}", engine: :liquid}
  ],
  params: %{
    assistant_type: "helpful assistant",
    name: "Alice"
  }
})

# Render the prompt to get the final messages
messages = Jido.AI.Prompt.render(prompt)
# => [
#   %{role: :system, content: "You are a helpful assistant"},
#   %{role: :user, content: "Hello Alice"}
# ]
Summary
Functions
Adds a new message to the prompt.
Compares two versions of a prompt and returns the differences.
Gets a specific version of the prompt.
Lists all available versions of the prompt.
Creates a new prompt struct from the given attributes.
Creates a new prompt with a single message.
Creates a new version of the prompt.
Renders the prompt into a list of messages with interpolated content.
Renders the prompt with options for an LLM API call.
Converts the prompt to a single text string.
Validates and converts the prompt option.
Sets the max_tokens option.
Creates a new output schema from the given schema specification and sets it on the prompt.
Adds LLM options to the prompt.
Sets the output schema for validating LLM responses.
Sets the stop sequences option.
Sets the temperature option.
Sets the timeout option in milliseconds.
Sets the top_p option.
Types
@type t() :: %Jido.AI.Prompt{
  history: [map()],
  id: String.t(),
  messages: [Jido.AI.Prompt.MessageItem.t()],
  metadata: map(),
  options: keyword(),
  output_schema: NimbleOptions.t(),
  params: map(),
  version: non_neg_integer()
}
A complete prompt with messages and parameters
Functions
Adds a new message to the prompt.
Enforces the rule that a system message may only appear first in the message list; adding a system message in any other position raises ArgumentError.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(%{
...>   messages: [
...>     %{role: :user, content: "Hello"}
...>   ]
...> })
iex> updated = Prompt.add_message(prompt, :assistant, "Hi there!")
iex> length(updated.messages)
2
iex> List.last(updated.messages).content
"Hi there!"
@spec compare_versions(t(), non_neg_integer(), non_neg_integer()) :: {:ok, %{added_messages: [map()], removed_messages: [map()]}} | {:error, String.t()}
Compares two versions of a prompt and returns the differences.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Hello")
iex> v2 = Prompt.new_version(prompt, fn p -> Prompt.add_message(p, :assistant, "Hi there!") end)
iex> {:ok, diff} = Prompt.compare_versions(v2, 2, 1)
iex> diff.added_messages
[%{role: :assistant, content: "Hi there!"}]
@spec get_version(t(), non_neg_integer()) :: {:ok, t()} | {:error, String.t()}
Gets a specific version of the prompt.
Returns the current prompt if the requested version matches the current version; otherwise reconstructs that version from the prompt's history.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Hello")
iex> updated = Prompt.new_version(prompt, fn p -> Prompt.add_message(p, :assistant, "Hi there!") end)
iex> {:ok, original} = Prompt.get_version(updated, 1)
iex> length(original.messages)
1
iex> hd(original.messages).content
"Hello"
@spec list_versions(t()) :: [non_neg_integer()]
Lists all available versions of the prompt.
Returns a list of version numbers, with the most recent first.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Hello")
iex> v2 = Prompt.new_version(prompt, fn p -> Prompt.add_message(p, :assistant, "Hi there!") end)
iex> v3 = Prompt.new_version(v2, fn p -> Prompt.add_message(p, :user, "How are you?") end)
iex> Prompt.list_versions(v3)
[3, 2, 1]
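list_versions/1 pairs naturally with get_version/2 to walk every stored snapshot. A sketch built on the example above:

# Resolve each version number to its snapshot and count its messages.
for v <- Prompt.list_versions(v3) do
  {:ok, snapshot} = Prompt.get_version(v3, v)
  {v, length(snapshot.messages)}
end
# => [{3, 3}, {2, 2}, {1, 1}]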
Creates a new prompt struct from the given attributes.
Rules
- Only one system message is allowed
- If present, the system message must be the first message
- Messages are rendered without the engine field
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(%{
...>   messages: [
...>     %{role: :user, content: "Hello"}
...>   ]
...> })
iex> prompt.messages |> length()
1
iex> hd(prompt.messages).role
:user
iex> hd(prompt.messages).content
"Hello"
Creates a new prompt with a single message.
This is a convenience function for creating a prompt with a single message.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Hello")
iex> prompt.messages |> length()
1
iex> hd(prompt.messages).role
:user
iex> hd(prompt.messages).content
"Hello"
Creates a new version of the prompt.
This function creates a new version of the prompt by:
- Storing the current state in the history
- Incrementing the version number
- Applying the changes to the prompt
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Hello")
iex> updated = Prompt.new_version(prompt, fn p -> Prompt.add_message(p, :assistant, "Hi there!") end)
iex> updated.version
2
iex> length(updated.history)
1
iex> length(updated.messages)
2
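Each call stores another snapshot, so chaining new_version/2 keeps growing the history. A sketch continuing the example above, with Prompt aliased as before:

v3 = Prompt.new_version(updated, fn p -> Prompt.add_message(p, :user, "How are you?") end)
v3.version
# => 3
length(v3.history)
# => 2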
Renders the prompt into a list of messages with interpolated content.
The rendered messages will only include the role and content fields, excluding the engine field to ensure compatibility with API requests.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(%{
...>   messages: [
...>     %{role: :user, content: "Hello <%= @name %>", engine: :eex}
...>   ],
...>   params: %{name: "Alice"}
...> })
iex> Prompt.render(prompt)
[%{role: :user, content: "Hello Alice"}]
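Liquid templates render the same way. A sketch mirroring the module-level example:

prompt = Jido.AI.Prompt.new(%{
  messages: [
    %{role: :user, content: "Hello {{ name }}", engine: :liquid}
  ],
  params: %{name: "Alice"}
})

Jido.AI.Prompt.render(prompt)
# => [%{role: :user, content: "Hello Alice"}]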
Renders the prompt with options for an LLM API call.
Returns a map containing both the rendered messages and any LLM-specific options.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Generate a story")
...> |> Prompt.with_temperature(0.7)
...> |> Prompt.with_max_tokens(500)
iex> result = Prompt.render_with_options(prompt)
iex> result[:messages]
[%{role: :user, content: "Generate a story"}]
iex> result[:temperature]
0.7
iex> result[:max_tokens]
500
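The rendered map is intended for an LLM API call; how it is sent is up to you. A minimal sketch continuing the example above, where MyClient.chat/1 is a hypothetical placeholder and not part of Jido:

request = Prompt.render_with_options(prompt)

# Hypothetical client call; substitute your own HTTP or SDK layer here.
MyClient.chat(request)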
Converts the prompt to a single text string.
This is useful for debugging or when the LLM API expects a single text prompt.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(%{
...>   messages: [
...>     %{role: :system, content: "You are an assistant"},
...>     %{role: :user, content: "Hello"}
...>   ]
...> })
iex> Prompt.to_text(prompt)
"[system] You are an assistant\n[user] Hello"
Validates and converts the prompt option.
Accepts either:
- A string, which is converted to a system message in a Prompt struct
- An existing Prompt struct, which is returned as-is
Examples
iex> Jido.AI.Prompt.validate_prompt_opts("You are a helpful assistant")
{:ok, %Jido.AI.Prompt{messages: [%Jido.AI.Prompt.MessageItem{role: :system, content: "You are a helpful assistant", engine: :none}], id: nil, version: 1, history: [], params: %{}, metadata: %{}}}
iex> prompt = Jido.AI.Prompt.new(:system, "Custom prompt")
iex> Jido.AI.Prompt.validate_prompt_opts(prompt)
{:ok, prompt}
@spec with_max_tokens(t(), non_neg_integer()) :: t()
Sets the max_tokens option.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Summarize this text")
iex> prompt = Prompt.with_max_tokens(prompt, 500)
iex> prompt.options[:max_tokens]
500
Creates a new output schema from the given schema specification and sets it on the prompt.
Examples
iex> alias Jido.AI.Prompt
iex> schema_spec = [
...>   name: [type: :string, required: true],
...>   age: [type: :integer, required: true]
...> ]
iex> prompt = Prompt.new(:user, "Generate a person")
iex> prompt = Prompt.with_new_output_schema(prompt, schema_spec)
iex> prompt.output_schema != nil
true
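Because output_schema holds a NimbleOptions schema, one way to check a structured response is NimbleOptions.validate/2. A sketch continuing the example above; Jido may apply the schema through its own validation path:

# Validate a decoded LLM reply against the prompt's output schema.
{:ok, validated} = NimbleOptions.validate([name: "Ada", age: 36], prompt.output_schema)
validated[:name]
# => "Ada"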
Adds LLM options to the prompt.
Options
- :temperature - Controls randomness. Higher values mean more random completions.
- :max_tokens - Maximum number of tokens to generate in the completion.
- :top_p - Controls diversity via nucleus sampling.
- :stop - Sequences where the API will stop generating tokens.
- :timeout - Request timeout in milliseconds.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Generate a creative story")
iex> prompt = Prompt.with_options(prompt, temperature: 0.8, max_tokens: 1000)
iex> prompt.options[:temperature]
0.8
@spec with_output_schema(t(), NimbleOptions.t()) :: t()
Sets the output schema for validating LLM responses.
Examples
iex> alias Jido.AI.Prompt
iex> schema = NimbleOptions.new!([
...>   name: [type: :string, required: true],
...>   age: [type: :integer, required: true]
...> ])
iex> prompt = Prompt.new(:user, "Generate a person")
iex> prompt = Prompt.with_output_schema(prompt, schema)
iex> prompt.output_schema == schema
true
Sets the stop sequences option.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Write until you reach the end")
iex> prompt = Prompt.with_stop(prompt, ["END", "STOP"])
iex> prompt.options[:stop]
["END", "STOP"]
Sets the temperature option.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Generate a creative story")
iex> prompt = Prompt.with_temperature(prompt, 0.8)
iex> prompt.options[:temperature]
0.8
@spec with_timeout(t(), non_neg_integer()) :: t()
Sets the timeout option in milliseconds.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Process this request")
iex> prompt = Prompt.with_timeout(prompt, 30000)
iex> prompt.options[:timeout]
30000
Sets the top_p option.
Examples
iex> alias Jido.AI.Prompt
iex> prompt = Prompt.new(:user, "Generate diverse responses")
iex> prompt = Prompt.with_top_p(prompt, 0.9)
iex> prompt.options[:top_p]
0.9