ReqLLM.Context (ReqLLM v1.0.0-rc.3)


Context represents a conversation history as a collection of messages.

Provides canonical message constructor functions that can be imported for clean, readable message creation. Supports standard roles: :user, :assistant, :system, and :tool.

Example

import ReqLLM.Context

context = Context.new([
  system("You are a helpful assistant"),
  user("What's the weather like?"),
  assistant("I'll check that for you")
])

Context.validate!(context)

Summary

Functions

Shortcut for an assistant message; accepts a string or content parts list.

Build a message from role and content parts (metadata optional).

Merges the original context into a response, returning the response with an updated context.

Create a new Context from a list of messages (defaults to empty).

Normalize any "prompt-ish" input into a validated ReqLLM.Context.

Bang version of normalize/2 that raises on error.

Shortcut for a system message; accepts a string or content parts list.

Build a text-only message for the given role.

Return the underlying message list.

Shortcut for a user message; accepts a string or content parts list.

Validate context: ensures valid messages and at most one system message.

Bang version of validate/1; raises ReqLLM.Error.Validation.Error on invalid context.

Build a message with text and an image URL for the given role.

Wrap a context with provider-specific tagged struct.

Types

t()

@type t() :: %ReqLLM.Context{messages: [ReqLLM.Message.t()]}

Functions

assistant(content, meta \\ %{})

@spec assistant([ReqLLM.Message.ContentPart.t()] | String.t(), map()) ::
  ReqLLM.Message.t()

Shortcut for an assistant message; accepts a string or content parts list.
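For instance, with the constructors imported, an assistant message can be built from a plain string plus an optional metadata map (the `:source` key below is purely illustrative):

```elixir
import ReqLLM.Context

# String content; the optional second argument is arbitrary metadata
msg = assistant("I'll check that for you", %{source: :demo})
msg.role
#=> :assistant
```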

build(role, content, meta \\ %{})

Build a message from role and content parts (metadata optional).
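A sketch of `build/3` with an explicit content-parts list. `ReqLLM.Message.ContentPart.text/1` is assumed here as the part constructor (it is not documented on this page), so the exact call may differ:

```elixir
import ReqLLM.Context

# Assumes ContentPart.text/1 builds a text content part
parts = [ReqLLM.Message.ContentPart.text("Hello!")]
msg = build(:user, parts, %{timestamp: DateTime.utc_now()})
```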

merge_response(context, response)

@spec merge_response(t(), ReqLLM.Response.t()) :: ReqLLM.Response.t()

Merges the original context into a response, returning the response with an updated context.

Takes a context and a response, builds a new context containing the original messages plus the assistant message from the response, and stores that context on the returned response.

Parameters

  • context - Original ReqLLM.Context
  • response - ReqLLM.Response containing the assistant message

Returns

  • Updated response with merged context

Examples

context = ReqLLM.Context.new([user("Hello")])
response = %ReqLLM.Response{message: assistant("Hi there!")}
updated_response = ReqLLM.Context.merge_response(context, response)
# updated_response.context now contains both the user and assistant messages

new(list \\ [])

@spec new([ReqLLM.Message.t()]) :: t()

Create a new Context from a list of messages (defaults to empty).
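Both forms in practice, using the imported shortcut constructors:

```elixir
import ReqLLM.Context

# With no argument, the context starts empty
empty = ReqLLM.Context.new()

# Seeded with an initial message list
ctx = ReqLLM.Context.new([
  system("You are a helpful assistant"),
  user("What's the weather like?")
])
```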

normalize(prompt, opts \\ [])

@spec normalize(
  String.t()
  | ReqLLM.Message.t()
  | t()
  | map()
  | [String.t() | ReqLLM.Message.t() | t() | map()],
  keyword()
) :: {:ok, t()} | {:error, term()}

Normalize any "prompt-ish" input into a validated ReqLLM.Context.

Accepts various input types and converts them to a proper Context struct:

  • String: converts to user message
  • Message struct: wraps in Context
  • Context struct: passes through
  • List: processes each item and creates Context from all messages
  • Loose maps: converts to Message if they have role/content keys

Options

  • :system_prompt - String to add as system message if none exists
  • :validate - Boolean to run validation (default: true)
  • :convert_loose - Boolean to allow loose maps with role/content (default: true)

Examples

# String to user message
Context.normalize("Hello")
#=> {:ok, %Context{messages: [%Message{role: :user, content: [%ContentPart{text: "Hello"}]}]}}

# Add system prompt
Context.normalize("Hello", system_prompt: "You are helpful")
#=> {:ok, %Context{messages: [%Message{role: :system}, %Message{role: :user}]}}

# List of mixed types
Context.normalize([%Message{role: :system}, "Hello"])

normalize!(prompt, opts \\ [])

@spec normalize!(
  String.t()
  | ReqLLM.Message.t()
  | t()
  | map()
  | [String.t() | ReqLLM.Message.t() | t() | map()],
  keyword()
) :: t()

Bang version of normalize/2 that raises on error.
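Because `normalize!/2` raises instead of returning a tuple, it suits pipelines where invalid input is a programmer error rather than a recoverable condition:

```elixir
# Returns a ReqLLM.Context directly, or raises on invalid input
ctx = ReqLLM.Context.normalize!("Hello", system_prompt: "You are helpful")
```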

system(content, meta \\ %{})

Shortcut for a system message; accepts a string or content parts list.

text(role, content, meta \\ %{})

@spec text(atom(), String.t(), map()) :: ReqLLM.Message.t()

Build a text-only message for the given role.
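For plain strings this is the generalized form of the role shortcuts, with the role passed explicitly:

```elixir
msg = ReqLLM.Context.text(:user, "What's the weather like?")
# Equivalent to user("What's the weather like?") for plain text content
```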

to_list(context)

@spec to_list(t()) :: [ReqLLM.Message.t()]

Return the underlying message list.
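A quick round trip from messages to a context and back:

```elixir
ctx = ReqLLM.Context.new([ReqLLM.Context.user("Hi")])
[%ReqLLM.Message{role: :user} | _rest] = ReqLLM.Context.to_list(ctx)
```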

user(content, meta \\ %{})

Shortcut for a user message; accepts a string or content parts list.

validate(context)

@spec validate(t()) :: {:ok, t()} | {:error, String.t()}

Validate context: ensures valid messages and at most one system message.
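The at-most-one-system-message rule means a second system message fails validation:

```elixir
import ReqLLM.Context
alias ReqLLM.Context

{:ok, _ctx} = Context.validate(Context.new([system("Be terse"), user("Hi")]))

# Two system messages violate the at-most-one rule
{:error, _reason} = Context.validate(Context.new([system("A"), system("B")]))
```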

validate!(context)

@spec validate!(t()) :: t()

Bang version of validate/1; raises ReqLLM.Error.Validation.Error on invalid context.

with_image(role, text, url, meta \\ %{})

@spec with_image(atom(), String.t(), String.t(), map()) :: ReqLLM.Message.t()

Build a message with text and an image URL for the given role.
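A minimal sketch (the URL is a placeholder):

```elixir
msg = ReqLLM.Context.with_image(
  :user,
  "What's in this picture?",
  "https://example.com/photo.png"
)
```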

wrap(ctx, model)

@spec wrap(t(), ReqLLM.Model.t()) :: term()

Wrap a context with provider-specific tagged struct.

Takes a ReqLLM.Context and ReqLLM.Model and wraps the context in the appropriate provider-specific struct for encoding/decoding.

Parameters

  • ctx - ReqLLM.Context to wrap
  • model - ReqLLM.Model identifying the provider

Returns

  • Provider-specific tagged struct ready for encoding

Examples

context = ReqLLM.Context.new([ReqLLM.Context.user("Hello")])
model = ReqLLM.Model.from("anthropic:claude-3-haiku-20240307")
tagged = ReqLLM.Context.wrap(context, model)
#=> %ReqLLM.Providers.Anthropic.Context{context: context}