ReqLLM.Context (ReqLLM v1.0.0-rc.5)
Context represents a conversation history as a collection of messages.
Provides canonical message constructor functions that can be imported for clean, readable message creation. Supports the standard roles :user, :assistant, :system, and :tool.
Example
import ReqLLM.Context
context = Context.new([
system("You are a helpful assistant"),
user("What's the weather like?"),
assistant("I'll check that for you")
])
Context.validate!(context)
Summary
Functions
Append a message to the context.
Shortcut for an assistant message; accepts a string or content parts list.
Build an assistant message with a tool call.
Build an assistant message with multiple tool calls.
Build an assistant message from collected text and tool calls.
Build a message from role and content parts (metadata optional).
Concatenate two contexts.
Execute a list of tool calls and append their results to the context.
Deserialize a JSON string or decoded map back into a Context struct.
Bang version of from_json/1 that raises on error.
Merges the original context with a response to create an updated context.
Create a new Context from a list of messages (defaults to empty).
Normalize any "prompt-ish" input into a validated ReqLLM.Context.
Bang version of normalize/2 that raises on error.
Prepend a message to the context.
Append an assistant message to the context.
Prepend a system message to the context.
Append a user message to the context.
Shortcut for a system message; accepts a string or content parts list.
Build a text-only message for the given role.
Return the underlying message list.
Shortcut for a tool message; accepts a string or content parts list.
Build a tool result message.
Shortcut for a user message; accepts a string or content parts list.
Validate context: ensures valid messages and at most one system message.
Bang version of validate/1; raises ReqLLM.Error.Validation.Error on invalid context.
Build a message with text and an image URL for the given role.
Wrap a context with provider-specific tagged struct.
Types
@type t() :: %ReqLLM.Context{messages: [ReqLLM.Message.t()]}
Functions
@spec append(t(), ReqLLM.Message.t()) :: t()
@spec append(t(), [ReqLLM.Message.t()]) :: t()
Append a message to the context.
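For example, appending a single message or a list of messages (a minimal sketch using the constructor shortcuts from this module):
context = Context.new([Context.system("You are a helpful assistant")])
context = Context.append(context, Context.user("What's the weather like?"))
context = Context.append(context, [Context.assistant("I'll check that for you"), Context.user("Thanks!")])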
@spec assistant([ReqLLM.Message.ContentPart.t()] | String.t(), map()) :: ReqLLM.Message.t()
Shortcut for an assistant message; accepts a string or content parts list.
@spec assistant_tool_call(String.t(), term(), keyword()) :: ReqLLM.Message.t()
Build an assistant message with a tool call.
@spec assistant_tool_calls( [%{id: String.t(), name: String.t(), input: term()}], map() ) :: ReqLLM.Message.t()
Build an assistant message with multiple tool calls.
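An illustrative sketch; the tool names and inputs below are made up, and the metadata map is passed explicitly as the second argument per the spec:
Context.assistant_tool_calls(
  [
    %{id: "call_1", name: "get_weather", input: %{"city" => "Paris"}},
    %{id: "call_2", name: "get_time", input: %{"timezone" => "Europe/Paris"}}
  ],
  %{}
)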
@spec assistant_with_tools(String.t(), [map()], map()) :: ReqLLM.Message.t()
Build an assistant message from collected text and tool calls.
Convenience function for creating assistant messages that may contain both text content and tool calls from streaming responses.
Parameters
text - Text content from the response
tool_calls - List of tool call maps with :id, :name, :arguments
meta - Optional metadata map
Returns
Assistant message with appropriate content parts.
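A minimal sketch, assuming tool call maps with :id, :name, and :arguments as described above (names and values are illustrative):
Context.assistant_with_tools(
  "I'll look that up for you.",
  [%{id: "call_1", name: "get_weather", arguments: %{"city" => "Paris"}}],
  %{}
)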
@spec build(atom(), [ReqLLM.Message.ContentPart.t()], map()) :: ReqLLM.Message.t()
Build a message from role and content parts (metadata optional).
Concatenate two contexts.
@spec execute_and_append_tools(t(), [map()], [ReqLLM.Tool.t()]) :: t()
Execute a list of tool calls and append their results to the context.
Takes a list of tool call maps (with :id, :name, :arguments keys) and a list of available tools, executes each call, and appends the results as tool messages.
Parameters
context - The context to append results to
tool_calls - List of tool call maps with :id, :name, :arguments
available_tools - List of ReqLLM.Tool structs to execute against
Returns
Updated context with tool result messages appended.
Examples
tool_calls = [%{id: "call_1", name: "calculator", arguments: %{"operation" => "add", "a" => 2, "b" => 3}}]
context = Context.execute_and_append_tools(context, tool_calls, tools)
Deserialize a JSON string or decoded map back into a Context struct.
Takes either a JSON string or a map (from Jason.decode!/1) and reconstructs a proper Context struct by leveraging the existing normalize/2 function.
Examples
# From JSON string
context = Context.new([Context.user("Hello")])
json_string = Jason.encode!(context)
{:ok, restored_context} = Context.from_json(json_string)
# From already decoded map
decoded_map = Jason.decode!(json_string)
{:ok, restored_context} = Context.from_json(decoded_map)
Bang version of from_json/1 that raises on error.
@spec merge_response(t(), ReqLLM.Response.t()) :: ReqLLM.Response.t()
Merges the original context with a response to create an updated context.
Takes a context and a response, then creates a new context containing the original messages plus the assistant response message.
Parameters
context - Original ReqLLM.Context
response - ReqLLM.Response containing the assistant message
Returns
- Updated response with merged context
Examples
context = ReqLLM.Context.new([user("Hello")])
response = %ReqLLM.Response{message: assistant("Hi there!")}
updated_response = ReqLLM.Context.merge_response(context, response)
# updated_response.context now contains both the user and assistant messages
@spec new([ReqLLM.Message.t()]) :: t()
Create a new Context from a list of messages (defaults to empty).
@spec normalize( String.t() | ReqLLM.Message.t() | t() | map() | [String.t() | ReqLLM.Message.t() | t() | map()], keyword() ) :: {:ok, t()} | {:error, term()}
Normalize any "prompt-ish" input into a validated ReqLLM.Context.
Accepts various input types and converts them to a proper Context struct:
- String: converts to user message
- Message struct: wraps in Context
- Context struct: passes through
- List: processes each item and creates Context from all messages
- Loose maps: converts to Message if they have role/content keys
Options
:system_prompt - String to add as a system message if none exists
:validate - Boolean to run validation (default: true)
:convert_loose - Boolean to allow loose maps with role/content (default: true)
Examples
# String to user message
Context.normalize("Hello")
#=> {:ok, %Context{messages: [%Message{role: :user, content: [%ContentPart{text: "Hello"}]}]}}
# Add system prompt
Context.normalize("Hello", system_prompt: "You are helpful")
#=> {:ok, %Context{messages: [%Message{role: :system}, %Message{role: :user}]}}
# List of mixed types
Context.normalize([%Message{role: :system}, "Hello"])
@spec normalize!( String.t() | ReqLLM.Message.t() | t() | map() | [String.t() | ReqLLM.Message.t() | t() | map()], keyword() ) :: t()
Bang version of normalize/2 that raises on error.
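A short sketch mirroring the normalize/2 examples; on invalid input this raises instead of returning {:error, reason}:
context = Context.normalize!("Hello", system_prompt: "You are helpful")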
@spec prepend(t(), ReqLLM.Message.t()) :: t()
Prepend a message to the context.
@spec push_assistant(t(), String.t() | [ReqLLM.Message.ContentPart.t()], map()) :: t()
Append an assistant message to the context.
@spec push_system(t(), String.t() | [ReqLLM.Message.ContentPart.t()], map()) :: t()
Prepend a system message to the context.
@spec push_user(t(), String.t() | [ReqLLM.Message.ContentPart.t()], map()) :: t()
Append a user message to the context.
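A sketch of building a context with the push_* helpers; the metadata map is passed explicitly here since the specs list it as a third argument:
context =
  Context.new()
  |> Context.push_system("You are a helpful assistant", %{})
  |> Context.push_user("What's the weather like?", %{})
  |> Context.push_assistant("I'll check that for you", %{})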
@spec system([ReqLLM.Message.ContentPart.t()] | String.t(), map()) :: ReqLLM.Message.t()
Shortcut for a system message; accepts a string or content parts list.
@spec text(atom(), String.t(), map()) :: ReqLLM.Message.t()
Build a text-only message for the given role.
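For example, assuming the arguments are role, text, and metadata per the spec:
Context.text(:assistant, "Sure, here is a short summary.", %{})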
@spec to_list(t()) :: [ReqLLM.Message.t()]
Return the underlying message list.
@spec tool([ReqLLM.Message.ContentPart.t()] | String.t(), map()) :: ReqLLM.Message.t()
Shortcut for a tool message; accepts a string or content parts list.
@spec tool_result_message(String.t(), String.t(), term(), map()) :: ReqLLM.Message.t()
Build a tool result message.
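An illustrative call; the argument order (tool call id, tool name, result term, metadata) is an assumption inferred from the spec, not confirmed by this page:
# assumed order: tool call id, tool name, result, metadata
Context.tool_result_message("call_1", "calculator", %{"result" => 5}, %{})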
@spec user([ReqLLM.Message.ContentPart.t()] | String.t(), map()) :: ReqLLM.Message.t()
Shortcut for a user message; accepts a string or content parts list.
Validate context: ensures valid messages and at most one system message.
Bang version of validate/1; raises ReqLLM.Error.Validation.Error on invalid context.
@spec with_image(atom(), String.t(), String.t(), map()) :: ReqLLM.Message.t()
Build a message with text and an image URL for the given role.
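For instance, assuming the arguments are role, text, image URL, and metadata in that order (inferred from the spec):
Context.with_image(:user, "What's in this picture?", "https://example.com/photo.png", %{})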
@spec wrap(t(), ReqLLM.Model.t()) :: term()
Wrap a context with provider-specific tagged struct.
Takes a ReqLLM.Context and a ReqLLM.Model and wraps the context in the appropriate provider-specific struct for encoding/decoding.
Parameters
context - A ReqLLM.Context to wrap
model - A ReqLLM.Model indicating the provider
Returns
- Provider-specific tagged struct ready for encoding
Examples
context = ReqLLM.Context.new([ReqLLM.Context.user("Hello")])
model = ReqLLM.Model.from("anthropic:claude-3-haiku-20240307")
tagged = ReqLLM.Context.wrap(context, model)
#=> %ReqLLM.Providers.Anthropic.Context{context: context}