ReqLLM.Context (ReqLLM v1.0.0)
Context represents a conversation history as a collection of messages.
Provides canonical message constructor functions that can be imported
for clean, readable message creation. Supports standard roles:
:user, :assistant, :system, and :tool.
Example
import ReqLLM.Context
context = Context.new([
system("You are a helpful assistant"),
user("What's the weather like?"),
assistant("I'll check that for you")
])
Context.validate!(context)
Summary
Functions
Append a message to the context.
Create an assistant message with optional tool calls and metadata.
Build an assistant message with a tool call.
Build an assistant message with multiple tool calls.
Create an assistant message with tool calls.
Build a message from role and content parts (metadata optional).
Concatenate two contexts.
Execute a list of tool calls and append their results to the context.
Merges the original context with a response to create an updated context.
Create a new Context from a list of messages (defaults to empty).
Normalize any "prompt-ish" input into a validated ReqLLM.Context.
Bang version of normalize/2 that raises on error.
Prepend a message to the context.
Create a system message with optional metadata.
Build a text-only message for the given role.
Return the underlying message list.
Create a tool result message with tool_call_id and content.
Create a tool result message with tool_call_id, name, and content.
Build a tool result message.
Create a user message with optional metadata.
Validate context: ensures valid messages, at most one system message, and tool message constraints.
Bang version of validate/1; raises ReqLLM.Error.Validation.Error on invalid context.
Build a message with text and an image URL for the given role.
Wrap a context with provider-specific tagged struct.
Types
@type t() :: %ReqLLM.Context{messages: [ReqLLM.Message.t()], tools: [ReqLLM.Tool.t()]}
Functions
@spec append(t(), ReqLLM.Message.t()) :: t()
@spec append(t(), [ReqLLM.Message.t()]) :: t()
Append a message to the context.
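Per the two specs above, append/2 accepts either a single message or a list of messages. A minimal pipeline sketch, reusing the imported constructors from the module example:

```elixir
import ReqLLM.Context

context =
  ReqLLM.Context.new([system("You are a helpful assistant")])
  # Append a single message...
  |> ReqLLM.Context.append(user("Hello"))
  # ...or a list of messages at once.
  |> ReqLLM.Context.append([assistant("Hi!"), user("How are you?")])
```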
@spec assistant([ReqLLM.Message.ContentPart.t()] | String.t(), map() | keyword()) :: ReqLLM.Message.t()
Create an assistant message with optional tool calls and metadata.
Accepts a string or content parts list. Second argument can be a map (legacy) or keyword list with options including tool_calls.
Options
- :tool_calls - List of tool calls (ToolCall structs, tuples, or maps)
- :metadata - Map of metadata to attach to the message (default: %{})
Examples
assistant("Hello")
assistant("", tool_calls: [ToolCall.new("id", "get_weather", ~s({"location":"SF"}))])
assistant("Let me check", tool_calls: [{"get_weather", %{location: "SF"}}])
assistant([ContentPart.text("Hi")], metadata: %{})
@spec assistant_tool_call(String.t(), term(), keyword()) :: ReqLLM.Message.t()
Build an assistant message with a tool call.
@spec assistant_tool_calls( [%{id: String.t(), name: String.t(), input: term()}], map() ) :: ReqLLM.Message.t()
Build an assistant message with multiple tool calls.
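A sketch of assistant_tool_calls/2, assuming the map shape given in the spec above (the tool names and inputs here are hypothetical):

```elixir
# Each tool call is a plain map with :id, :name, and :input keys;
# the second argument is a metadata map.
msg =
  ReqLLM.Context.assistant_tool_calls(
    [
      %{id: "call_1", name: "get_weather", input: %{location: "SF"}},
      %{id: "call_2", name: "get_time", input: %{tz: "America/Los_Angeles"}}
    ],
    %{}
  )
```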
@spec assistant_with_tools([ReqLLM.ToolCall.t()], String.t() | nil) :: ReqLLM.Message.t()
Create an assistant message with tool calls.
@spec build(atom(), [ReqLLM.Message.ContentPart.t()], map()) :: ReqLLM.Message.t()
Build a message from role and content parts (metadata optional).
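For example, building a message from explicit content parts (a sketch based on the spec above; the metadata keys are illustrative):

```elixir
alias ReqLLM.Message.ContentPart

# Build a user message from content parts, attaching metadata.
msg = ReqLLM.Context.build(:user, [ContentPart.text("Hello")], %{source: "api"})
```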
Concatenate two contexts.
@spec execute_and_append_tools(t(), [map()], [ReqLLM.Tool.t()]) :: t()
Execute a list of tool calls and append their results to the context.
Takes a list of tool call maps (with :id, :name, :arguments keys) and a list of available tools, executes each call, and appends the results as tool messages.
Parameters
- context - The context to append results to
- tool_calls - List of tool call maps with :id, :name, :arguments
- available_tools - List of ReqLLM.Tool structs to execute against
Returns
Updated context with tool result messages appended.
Examples
tool_calls = [%{id: "call_1", name: "calculator", arguments: %{"operation" => "add", "a" => 2, "b" => 3}}]
context = Context.execute_and_append_tools(context, tool_calls, tools)
@spec merge_response(t(), ReqLLM.Response.t(), keyword()) :: ReqLLM.Response.t()
Merges the original context with a response to create an updated context.
Takes a context and a response, then creates a new context containing the original messages plus the assistant response message.
Parameters
- context - Original ReqLLM.Context
- response - ReqLLM.Response containing the assistant message
Returns
- Updated response with merged context
Examples
context = ReqLLM.Context.new([user("Hello")])
response = %ReqLLM.Response{message: assistant("Hi there!")}
updated_response = ReqLLM.Context.merge_response(context, response)
# updated_response.context now contains both user and assistant messages
@spec new([ReqLLM.Message.t()]) :: t()
Create a new Context from a list of messages (defaults to empty).
@spec normalize( String.t() | ReqLLM.Message.t() | t() | map() | [String.t() | ReqLLM.Message.t() | t() | map()], keyword() ) :: {:ok, t()} | {:error, term()}
Normalize any "prompt-ish" input into a validated ReqLLM.Context.
Accepts various input types and converts them to a proper Context struct:
- String: converts to user message
- Message struct: wraps in Context
- Context struct: passes through
- List: processes each item and creates Context from all messages
- Loose maps: converts to Message if they have role/content keys
Options
- :system_prompt - String to add as system message if none exists
- :validate - Boolean to run validation (default: true)
- :convert_loose - Boolean to allow loose maps with role/content (default: true)
Examples
# String to user message
Context.normalize("Hello")
#=> {:ok, %Context{messages: [%Message{role: :user, content: [%ContentPart{text: "Hello"}]}]}}
# Add system prompt
Context.normalize("Hello", system_prompt: "You are helpful")
#=> {:ok, %Context{messages: [%Message{role: :system}, %Message{role: :user}]}}
# List of mixed types
Context.normalize([%Message{role: :system}, "Hello"])
@spec normalize!( String.t() | ReqLLM.Message.t() | t() | map() | [String.t() | ReqLLM.Message.t() | t() | map()], keyword() ) :: t()
Bang version of normalize/2 that raises on error.
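A short sketch of the bang variant, which returns the context directly instead of a result tuple:

```elixir
# Returns a %ReqLLM.Context{} on success, raises on invalid input.
context = ReqLLM.Context.normalize!("Hello", system_prompt: "You are helpful")
```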
@spec prepend(t(), ReqLLM.Message.t()) :: t()
Prepend a message to the context.
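For instance, prepend/2 is a natural fit for adding a system prompt at the head of an existing conversation (a minimal sketch using the imported constructors):

```elixir
import ReqLLM.Context

context =
  ReqLLM.Context.new([user("What's the weather?")])
  # Prepend puts the message at the head of the message list.
  |> ReqLLM.Context.prepend(system("You are a weather bot"))
```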
@spec system([ReqLLM.Message.ContentPart.t()] | String.t(), map() | keyword()) :: ReqLLM.Message.t()
Create a system message with optional metadata.
Accepts a string or content parts list. Second argument can be a map (legacy) or keyword list with options.
Options
- :metadata - Map of metadata to attach to the message (default: %{})
Examples
system("You are helpful")
system("You are helpful", %{version: 1})
system("You are helpful", metadata: %{version: 1})
@spec text(atom(), String.t(), map()) :: ReqLLM.Message.t()
Build a text-only message for the given role.
@spec to_list(t()) :: [ReqLLM.Message.t()]
Return the underlying message list.
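A sketch combining text/3 and to_list/1, based on the specs above:

```elixir
# Build a text-only user message, wrap it in a context,
# then unwrap the context back into its message list.
msg = ReqLLM.Context.text(:user, "Hello", %{})
context = ReqLLM.Context.new([msg])
messages = ReqLLM.Context.to_list(context)
```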
@spec tool_result(String.t(), String.t()) :: ReqLLM.Message.t()
Create a tool result message with tool_call_id and content.
@spec tool_result(String.t(), String.t(), String.t()) :: ReqLLM.Message.t()
Create a tool result message with tool_call_id, name, and content.
@spec tool_result_message(String.t(), String.t(), term(), map()) :: ReqLLM.Message.t()
Build a tool result message.
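A sketch of the tool-result constructors, assuming the arities shown above (the ids and content are illustrative):

```elixir
# tool_call_id and content
msg = ReqLLM.Context.tool_result("call_1", "72 and sunny")

# tool_call_id, tool name, and content
msg = ReqLLM.Context.tool_result("call_1", "get_weather", "72 and sunny")

# tool_result_message/4 additionally takes a metadata map, per its spec.
msg = ReqLLM.Context.tool_result_message("call_1", "get_weather", "72 and sunny", %{})
```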
@spec user([ReqLLM.Message.ContentPart.t()] | String.t(), map() | keyword()) :: ReqLLM.Message.t()
Create a user message with optional metadata.
Accepts a string or content parts list. Second argument can be a map (legacy) or keyword list with options.
Options
- :metadata - Map of metadata to attach to the message (default: %{})
Examples
user("Hello")
user("Hello", %{source: "api"})
user("Hello", metadata: %{source: "api"})
user([ContentPart.text("Hello")], metadata: %{})
Validate context: ensures valid messages, at most one system message, and tool message constraints.
Bang version of validate/1; raises ReqLLM.Error.Validation.Error on invalid context.
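As a sketch, validate!/1 can sit at the end of a context-building pipeline; this assumes it returns the context on success, as Elixir bang functions conventionally do:

```elixir
import ReqLLM.Context

# A context with more than one system message would raise
# ReqLLM.Error.Validation.Error here.
context =
  ReqLLM.Context.new([system("Be brief"), user("Hi")])
  |> ReqLLM.Context.validate!()
```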
@spec with_image(atom(), String.t(), String.t(), map()) :: ReqLLM.Message.t()
Build a message with text and an image URL for the given role.
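For example, a user message pairing text with an image URL (a sketch per the spec above; the URL is hypothetical):

```elixir
msg =
  ReqLLM.Context.with_image(
    :user,
    "What's in this picture?",
    "https://example.com/cat.png",
    %{}
  )
```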
@spec wrap(t(), ReqLLM.Model.t()) :: term()
Wrap a context with provider-specific tagged struct.
Takes a ReqLLM.Context and ReqLLM.Model and wraps the context
in the appropriate provider-specific struct for encoding/decoding.
Parameters
- context - A ReqLLM.Context to wrap
- model - A ReqLLM.Model indicating the provider
Returns
- Provider-specific tagged struct ready for encoding
Examples
context = ReqLLM.Context.new([ReqLLM.Context.user("Hello")])
model = ReqLLM.Model.from("anthropic:claude-3-haiku-20240307")
tagged = ReqLLM.Context.wrap(context, model)
#=> %ReqLLM.Providers.Anthropic.Context{context: context}