ReqLLM.Providers.Anthropic.Context (ReqLLM v1.0.0)


Anthropic-specific context encoding for the Messages API format.

Encodes ReqLLM contexts into the request shape expected by Anthropic's Messages API.

Key Differences from OpenAI

  • Uses content blocks instead of simple strings
  • System messages are extracted to top-level system parameter
  • Tool calls are represented as content blocks with type "tool_use"
  • Tool results must be in "user" role messages (Anthropic only accepts "user" or "assistant" roles)
  • Different parameter names (e.g. stop_sequences instead of OpenAI's stop)
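The two differences that most often trip up callers are system-message extraction and the "user"-role requirement for tool results. A minimal sketch over plain maps (not the library's actual implementation, which also handles content blocks, tools, and streaming) might look like:

```elixir
defmodule AnthropicEncodeSketch do
  # Anthropic takes system prompts as a top-level `system` string, not as
  # messages with a "system" role, so they are split out of the list.
  def split_system(messages) do
    {system, rest} = Enum.split_with(messages, &(&1.role == "system"))
    {Enum.map_join(system, "\n", & &1.content), rest}
  end

  # Tool results must ride inside a "user"-role message as a content block,
  # since Anthropic only accepts "user" and "assistant" roles.
  def tool_result_message(tool_use_id, content) do
    %{
      role: "user",
      content: [%{type: "tool_result", tool_use_id: tool_use_id, content: content}]
    }
  end
end
```

For example, `AnthropicEncodeSketch.split_system/1` applied to a list beginning with a "system" message returns that message's content as the string destined for the top-level system parameter, plus the remaining conversational turns.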

Message Format

%{
  model: "claude-3-5-sonnet-20241022",
  system: "You are a helpful assistant",
  messages: [
    %{role: "user", content: "What's the weather?"},
    %{role: "assistant", content: [
      %{type: "text", text: "I'll check that for you."},
      %{type: "tool_use", id: "toolu_123", name: "get_weather", input: %{location: "SF"}}
    ]},
    %{role: "user", content: [
      %{type: "tool_result", tool_use_id: "toolu_123", content: "72°F and sunny"}
    ]}
  ],
  max_tokens: 1000,
  temperature: 0.7
}

Summary

Functions

Encode context and model to Anthropic Messages API format.

Functions

encode_request(context, model)

@spec encode_request(ReqLLM.Context.t(), ReqLLM.Model.t() | map()) :: map()

Encode context and model to Anthropic Messages API format.
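To illustrate the shape of the result, here is a hedged sketch of the assembly step over plain maps; the real encode_request/2 operates on ReqLLM.Context and ReqLLM.Model structs and does considerably more (content-block conversion, tool encoding), so treat this only as a picture of the output format documented above:

```elixir
defmodule EncodeRequestSketch do
  # Assumption: context is a plain map with :system and :messages, and the
  # model map carries :model plus optional generation parameters.
  def encode_request(%{system: system, messages: messages}, %{model: model} = opts) do
    %{
      model: model,
      system: system,
      messages: messages,
      # Anthropic requires max_tokens; 1024 here is an arbitrary sketch default.
      max_tokens: Map.get(opts, :max_tokens, 1024)
    }
  end
end
```

The returned map matches the Message Format example shown earlier: model, top-level system, the messages list, and max_tokens.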