ReqLLM.Providers.Anthropic.Response (ReqLLM v1.0.0)
Anthropic-specific response decoding for the Messages API format.
Handles decoding Anthropic Messages API responses to ReqLLM structures.
Anthropic Response Format
%{
  "id" => "msg_01XFDUDYJgAACzvnptvVoYEL",
  "type" => "message",
  "role" => "assistant",
  "model" => "claude-3-5-sonnet-20241022",
  "content" => [
    %{"type" => "text", "text" => "Hello! How can I help you today?"}
  ],
  "stop_reason" => "end_turn",
  "stop_sequence" => nil,
  "usage" => %{
    "input_tokens" => 10,
    "output_tokens" => 20
  }
}
Streaming Format
Anthropic uses Server-Sent Events (SSE) with the following event types (a brief decoding sketch follows the list):
- message_start: Initial message metadata
- content_block_start: Start of content block
- content_block_delta: Incremental content
- content_block_stop: End of content block
- message_delta: Final message updates
- message_stop: End of message
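As an illustrative sketch (not part of this module), a consumer that has already parsed the SSE data payloads into maps could accumulate the streamed text by keeping only the content_block_delta events; the event shapes below follow Anthropic's documented streaming format:

events = [
  %{"type" => "message_start", "message" => %{"role" => "assistant"}},
  %{"type" => "content_block_start", "index" => 0},
  %{"type" => "content_block_delta", "index" => 0,
    "delta" => %{"type" => "text_delta", "text" => "Hello"}},
  %{"type" => "content_block_delta", "index" => 0,
    "delta" => %{"type" => "text_delta", "text" => "!"}},
  %{"type" => "content_block_stop", "index" => 0},
  %{"type" => "message_delta", "delta" => %{"stop_reason" => "end_turn"}},
  %{"type" => "message_stop"}
]

# Only content_block_delta events carry text; the other event types mark
# message/block boundaries or carry metadata.
text =
  Enum.reduce(events, "", fn
    %{"type" => "content_block_delta", "delta" => %{"text" => chunk}}, acc -> acc <> chunk
    _event, acc -> acc
  end)

# text == "Hello!"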
Summary
Functions
decode_response/2
Decode Anthropic response data to ReqLLM.Response.
decode_stream_event/2
Decode Anthropic SSE event data into StreamChunks.
Functions
decode_response/2
@spec decode_response(map(), ReqLLM.Model.t()) :: {:ok, ReqLLM.Response.t()} | {:error, term()}
Decode Anthropic response data to ReqLLM.Response.
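A hedged usage sketch: the raw map mirrors the response format shown above, and model stands in for a ReqLLM.Model.t() built elsewhere (its construction is not shown and is an assumption here).

raw = %{
  "id" => "msg_01XFDUDYJgAACzvnptvVoYEL",
  "type" => "message",
  "role" => "assistant",
  "model" => "claude-3-5-sonnet-20241022",
  "content" => [%{"type" => "text", "text" => "Hello! How can I help you today?"}],
  "stop_reason" => "end_turn",
  "stop_sequence" => nil,
  "usage" => %{"input_tokens" => 10, "output_tokens" => 20}
}

# `model` is assumed to be a ReqLLM.Model.t() for claude-3-5-sonnet-20241022.
case ReqLLM.Providers.Anthropic.Response.decode_response(raw, model) do
  {:ok, %ReqLLM.Response{} = response} -> response
  {:error, reason} -> {:error, reason}
end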
decode_stream_event/2
@spec decode_stream_event(map(), ReqLLM.Model.t()) :: [ReqLLM.StreamChunk.t()]
Decode Anthropic SSE event data into StreamChunks.
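A hedged sketch of decoding a single parsed SSE event: the content_block_delta shape follows Anthropic's documented streaming format, and model is again an assumed ReqLLM.Model.t().

event = %{
  "type" => "content_block_delta",
  "index" => 0,
  "delta" => %{"type" => "text_delta", "text" => "Hello"}
}

# Expected to return a list of ReqLLM.StreamChunk structs carrying the
# "Hello" text fragment; events with no decodable content may yield [].
chunks = ReqLLM.Providers.Anthropic.Response.decode_stream_event(event, model)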