ReqLLM.Providers.AmazonBedrock.Meta (ReqLLM v1.0.0-rc.8)
Meta Llama model family support for AWS Bedrock.
Handles Meta's Llama models (Llama 3, 3.1, 3.2, 3.3, 4) on AWS Bedrock.
This module acts as a thin adapter between Bedrock's AWS-specific wrapping
and Meta's native Llama format. It delegates to ReqLLM.Providers.Meta for
all format conversion and response parsing.
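As a rough sketch of that adapter pattern (the function names and arities below are illustrative, not the module's actual exports):

```elixir
# Illustrative only: the real module's function names and arities may differ.
defmodule MyApp.BedrockMetaAdapter do
  # Bedrock-specific concerns (AWS Event Stream unwrapping, request
  # envelopes) would live here; Llama prompt formatting and response
  # parsing are handed off to the generic Meta provider.
  defdelegate format_messages(messages), to: ReqLLM.Providers.Meta
  defdelegate extract_usage(body), to: ReqLLM.Providers.Meta
end
```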
Native Format vs OpenAI-Compatible
Unlike most cloud providers (Azure, Google Cloud Vertex AI) and self-hosted
deployments (vLLM, Ollama), which wrap Llama in OpenAI-compatible APIs,
AWS Bedrock uses Meta's native format with prompt, max_gen_len,
and generation fields.
This is why this module delegates to the generic ReqLLM.Providers.Meta
rather than ReqLLM.Providers.OpenAI.
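As a rough illustration of the two shapes (values invented; the native field names come from the paragraph above, the wrapped shape from OpenAI's chat format):

```elixir
# Meta's native Bedrock body, keyed by prompt/max_gen_len:
native_request = %{
  "prompt" => "<|begin_of_text|>...",
  "max_gen_len" => 512
}

# The OpenAI-compatible body used by vLLM, Ollama, Azure, Vertex AI, etc.:
openai_request = %{
  "messages" => [%{"role" => "user", "content" => "Hello!"}],
  "max_tokens" => 512
}
```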
Summary
Functions
Extracts usage metadata from the response body.
Formats messages into Llama 3 prompt format.
Formats a ReqLLM context into Meta Llama request format for Bedrock.
Parses Meta Llama response from Bedrock into ReqLLM format.
Parses a streaming chunk for Meta Llama models.
Returns whether this model family supports toolChoice in the Bedrock Converse API.
Functions
Extracts usage metadata from the response body.
Delegates to the generic Meta provider for usage extraction.
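A minimal sketch of what that extraction amounts to, assuming the token-count keys AWS documents for native Llama responses (the function and the returned map's shape are hypothetical, not ReqLLM's actual usage struct):

```elixir
# Hypothetical sketch: map Bedrock's native Llama token counts onto
# input/output token totals.
usage_from_body = fn body ->
  %{
    input_tokens: body["prompt_token_count"],
    output_tokens: body["generation_token_count"]
  }
end

usage_from_body.(%{"prompt_token_count" => 12, "generation_token_count" => 9})
#=> %{input_tokens: 12, output_tokens: 9}
```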
Formats messages into Llama 3 prompt format.
Delegates to the generic Meta provider. Exposed for testing.
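For reference, Meta's documented Llama 3 prompt template uses header tokens like the following (the exact string this function produces may differ in detail):

```elixir
# Llama 3 chat template per Meta's model card, shown as an Elixir string.
"""
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>

Hello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>

"""
```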
Formats a ReqLLM context into Meta Llama request format for Bedrock.
Delegates to the generic Meta provider and returns the formatted request.
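An illustrative formatted body (prompt and max_gen_len are the native fields described above; temperature and top_p are standard Bedrock Llama inference parameters; all values here are made up):

```elixir
%{
  "prompt" => "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\nHello!<|eot_id|>",
  "max_gen_len" => 512,
  "temperature" => 0.7,
  "top_p" => 0.9
}
```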
Parses Meta Llama response from Bedrock into ReqLLM format.
Delegates to the generic Meta provider for parsing.
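For context, a native Llama response body from Bedrock looks roughly like this (field names as documented by AWS; values invented):

```elixir
%{
  "generation" => "Hi! How can I help you today?",
  "prompt_token_count" => 12,
  "generation_token_count" => 9,
  "stop_reason" => "stop"
}
```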
Parses a streaming chunk for Meta Llama models.
Each chunk contains a "generation" field with the next text segment. Unwraps Bedrock's AWS Event Stream encoding before processing.
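A minimal sketch of the per-chunk step, assuming the AWS Event Stream framing has already been unwrapped and the payload JSON-decoded (the function is hypothetical):

```elixir
# Hypothetical: pull the next text segment out of an already-decoded chunk.
text_from_chunk = fn
  %{"generation" => text} when is_binary(text) -> text
  _other -> ""
end

text_from_chunk.(%{"generation" => "Hel"})
#=> "Hel"
```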
Returns whether this model family supports toolChoice in the Bedrock Converse API.