ReqLLM.Providers.AmazonBedrock.OpenAI (ReqLLM v1.0.0)
OpenAI model family support for AWS Bedrock.
Handles OpenAI's OSS models (gpt-oss-120b, gpt-oss-20b) on AWS Bedrock.
This module acts as a thin adapter between Bedrock's AWS-specific wrapping and OpenAI's native Chat Completions format.
Summary
Functions
Extracts usage metadata from the response body.
Formats a ReqLLM context into OpenAI request format for Bedrock.
Parses OpenAI response from Bedrock into ReqLLM format.
Parses a streaming chunk for OpenAI models.
Returns whether this model family supports toolChoice in the Bedrock Converse API.
Functions
Extracts usage metadata from the response body.
Delegates to standard OpenAI usage extraction.
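As a rough illustration of what standard OpenAI usage extraction looks like, the sketch below reads the Chat Completions `"usage"` object out of a decoded response body. The module name `UsageSketch` and the helper `extract_usage/1` are illustrative, not this module's actual API:

```elixir
defmodule UsageSketch do
  # Field names follow OpenAI's Chat Completions "usage" object.
  def extract_usage(%{"usage" => usage}) do
    %{
      input_tokens: usage["prompt_tokens"],
      output_tokens: usage["completion_tokens"],
      total_tokens: usage["total_tokens"]
    }
  end

  # Bodies without a usage object yield no metadata.
  def extract_usage(_body), do: nil
end
```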
Formats a ReqLLM context into OpenAI request format for Bedrock.
Uses standard OpenAI Chat Completions format with Bedrock-specific restrictions:
- Tool response messages must NOT include the "name" field (Bedrock limitation)
For :object operations, creates a synthetic "structured_output" tool to leverage tool calling for structured JSON output.
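The two restrictions above can be sketched as plain request maps. The keys follow OpenAI's Chat Completions API and the tool name `"structured_output"` matches the synthetic tool described here, but the builder functions themselves are hypothetical:

```elixir
defmodule FormatSketch do
  # Tool-result messages omit the "name" field per the Bedrock restriction.
  def tool_message(tool_call_id, content) do
    %{"role" => "tool", "tool_call_id" => tool_call_id, "content" => content}
  end

  # For :object operations, a synthetic tool carries the JSON schema and
  # tool_choice forces the model to call it, yielding structured JSON output.
  def structured_output_request(messages, json_schema) do
    %{
      "messages" => messages,
      "tools" => [
        %{
          "type" => "function",
          "function" => %{"name" => "structured_output", "parameters" => json_schema}
        }
      ],
      "tool_choice" => %{
        "type" => "function",
        "function" => %{"name" => "structured_output"}
      }
    }
  end
end
```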
Parses OpenAI response from Bedrock into ReqLLM format.
Manually decodes the OpenAI Chat Completions format.
For :object operations, extracts the structured output from the tool call.
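A minimal sketch of that extraction step, assuming the standard Chat Completions response shape: find the `"structured_output"` tool call and return its raw JSON arguments (decoding with your JSON library is left out). `ParseSketch.extract_object/1` is illustrative, not this module's actual function:

```elixir
defmodule ParseSketch do
  # Pulls the structured-output arguments out of the first choice's tool calls.
  def extract_object(%{"choices" => [%{"message" => %{"tool_calls" => calls}} | _]}) do
    case Enum.find(calls, &(&1["function"]["name"] == "structured_output")) do
      %{"function" => %{"arguments" => args}} -> {:ok, args}
      nil -> {:error, :no_structured_output}
    end
  end
end
```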
Parses a streaming chunk for OpenAI models.
Unwraps the Bedrock-specific encoding, then delegates to standard OpenAI SSE event parsing. Also handles Bedrock-specific usage metrics.
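Bedrock's streaming responses wrap each payload as a JSON object whose `"bytes"` field holds the base64-encoded inner chunk; the decoded bytes carry the OpenAI-style SSE delta. The sketch below shows only that unwrapping step, as an assumption about the behavior described here rather than the module's actual implementation:

```elixir
defmodule StreamSketch do
  # Decode the base64 "bytes" field of a Bedrock stream chunk; the result is
  # the raw OpenAI-style event, ready for standard SSE parsing.
  def unwrap_chunk(%{"bytes" => b64}), do: Base.decode64!(b64)
end
```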
Returns whether this model family supports toolChoice in the Bedrock Converse API.