ReqLLM.Providers.AmazonBedrock.OpenAI (ReqLLM v1.0.0)

OpenAI model family support for AWS Bedrock.

Handles OpenAI's OSS models (gpt-oss-120b, gpt-oss-20b) on AWS Bedrock.

This module acts as a thin adapter between Bedrock's AWS-specific wrapping and OpenAI's native Chat Completions format.

Summary

Functions

Extracts usage metadata from the response body.

Formats a ReqLLM context into the OpenAI Chat Completions request format for Bedrock.

Parses an OpenAI response from Bedrock into ReqLLM format.

Parses a streaming chunk for OpenAI models.

Returns whether this model family supports toolChoice in the Bedrock Converse API.

Functions

extract_usage(body, arg2)

Extracts usage metadata from the response body.

Delegates to standard OpenAI usage extraction.
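To make the delegation concrete, the sketch below shows the idea in Python: read the standard OpenAI usage block and map it to normalized token counts. The output key names (input_tokens, output_tokens) are illustrative assumptions, not ReqLLM's actual struct; the real module does this in Elixir.

```python
def extract_usage(body: dict) -> dict:
    """Map the standard OpenAI `usage` block to normalized counts.

    Sketch only: the output key names are assumptions for illustration."""
    usage = body.get("usage", {})
    return {
        "input_tokens": usage.get("prompt_tokens", 0),
        "output_tokens": usage.get("completion_tokens", 0),
        "total_tokens": usage.get("total_tokens", 0),
    }

# An OpenAI Chat Completions response carries usage in this shape:
body = {"usage": {"prompt_tokens": 12, "completion_tokens": 34, "total_tokens": 46}}
print(extract_usage(body))
```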

format_request(model_id, context, opts)

Formats a ReqLLM context into the OpenAI Chat Completions request format for Bedrock.

Uses standard OpenAI Chat Completions format with Bedrock-specific restrictions:

  • Tool response messages must NOT include the "name" field (Bedrock limitation)

For :object operations, creates a synthetic "structured_output" tool to leverage tool calling for structured JSON output.
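A hypothetical request body illustrating the tool-message restriction (field values are made up for the example):

```json
{
  "messages": [
    {"role": "user", "content": "What is the weather in Paris?"},
    {
      "role": "assistant",
      "tool_calls": [
        {
          "id": "call_1",
          "type": "function",
          "function": {"name": "get_weather", "arguments": "{\"city\":\"Paris\"}"}
        }
      ]
    },
    {"role": "tool", "tool_call_id": "call_1", "content": "18°C and sunny"}
  ]
}
```

Note that the final "tool" message carries only tool_call_id and content; a "name" field, accepted by OpenAI's own API, would be rejected by Bedrock.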

parse_response(body, opts)

Parses an OpenAI response from Bedrock into ReqLLM format.

Manually decodes the OpenAI Chat Completions format.

For :object operations, extracts the structured output from the tool call.
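The extraction step can be sketched as follows (in Python for illustration; the real module does this in Elixir). It finds the synthetic "structured_output" tool call in the Chat Completions response and decodes its JSON-encoded arguments:

```python
import json

def extract_structured_output(body: dict) -> dict:
    """Pull the structured JSON out of the synthetic "structured_output"
    tool call in an OpenAI Chat Completions response body."""
    message = body["choices"][0]["message"]
    for call in message.get("tool_calls", []):
        if call["function"]["name"] == "structured_output":
            # Tool-call arguments arrive as a JSON-encoded string.
            return json.loads(call["function"]["arguments"])
    raise ValueError("no structured_output tool call in response")

# Example response body (shape follows the Chat Completions format;
# the payload itself is made up):
body = {
    "choices": [{"message": {"role": "assistant", "tool_calls": [
        {"id": "call_1", "type": "function",
         "function": {"name": "structured_output",
                      "arguments": '{"city": "Paris", "temp_c": 18}'}}
    ]}}]
}
print(extract_structured_output(body))  # → {'city': 'Paris', 'temp_c': 18}
```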

parse_stream_chunk(chunk, opts)

Parses a streaming chunk for OpenAI models.

Unwraps the Bedrock-specific encoding, then delegates to standard OpenAI SSE event parsing. Handles Bedrock-specific usage metrics.
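The unwrapping step can be sketched as below (in Python for illustration). This assumes the common Bedrock response-stream wrapping, where each event carries the inner payload base64-encoded under a "bytes" key; the decoded payload is then an ordinary OpenAI streaming chunk:

```python
import base64
import json

def unwrap_bedrock_chunk(chunk: dict) -> dict:
    """Decode one Bedrock stream event into the inner OpenAI chunk.

    Assumes the payload arrives base64-encoded under a "bytes" key;
    the real module performs this in Elixir before delegating to
    standard OpenAI SSE event parsing."""
    raw = base64.b64decode(chunk["bytes"])
    return json.loads(raw)

# Simulate a wrapped event around an OpenAI-style delta chunk:
inner = {"choices": [{"delta": {"content": "Hel"}}]}
wrapped = {"bytes": base64.b64encode(json.dumps(inner).encode()).decode()}
print(unwrap_bedrock_chunk(wrapped)["choices"][0]["delta"]["content"])  # → Hel
```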

supports_converse_tool_choice?()

Returns whether this model family supports toolChoice in the Bedrock Converse API.