ReqLLM.Providers.OpenAI.ChatAPI (ReqLLM v1.0.0-rc.8)


OpenAI Chat Completions API driver.

Implements the ReqLLM.Providers.OpenAI.API behaviour for OpenAI's Chat Completions endpoint.

Endpoint

/v1/chat/completions

Supported Models

  • GPT-4 family: gpt-4o, gpt-4-turbo, gpt-4
  • GPT-3.5 family: gpt-3.5-turbo
  • Embedding models: text-embedding-3-small, text-embedding-3-large, text-embedding-ada-002
  • Any other model whose metadata declares "api": "chat"

Capabilities

  • Streaming: Full SSE support with usage tracking via stream_options
  • Tools: Function calling with tool_choice format conversion
  • Embeddings: Dimension and encoding format control
  • Multi-modal: Text and image inputs
  • Token limits: Automatic handling of max_tokens vs max_completion_tokens
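To illustrate the streaming capability above, a raw SSE stream from the endpoint looks roughly like the sketch below (chunk contents abbreviated and illustrative). Because the driver sets stream_options, the final data chunk before the terminator carries the usage metrics:

```text
data: {"id":"chatcmpl-...","choices":[{"index":0,"delta":{"role":"assistant","content":"Hel"}}]}

data: {"id":"chatcmpl-...","choices":[{"index":0,"delta":{"content":"lo"}}]}

data: {"id":"chatcmpl-...","choices":[],"usage":{"prompt_tokens":12,"completion_tokens":2,"total_tokens":14}}

data: [DONE]
```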

Encoding Specifics

  • Converts internal tool_choice format to OpenAI's function-based format
  • Adds stream_options: {include_usage: true} for streaming usage metrics
  • Handles reasoning model parameter requirements (max_completion_tokens)
  • Supports embedding-specific options (dimensions, encoding_format)
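Taken together, an encoded streaming request body with a tool might look like the following sketch (field values are illustrative, and get_weather is a hypothetical tool, not part of the library):

```json
{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "What is the weather in Paris?"}
  ],
  "stream": true,
  "stream_options": {"include_usage": true},
  "max_completion_tokens": 256,
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }
  ],
  "tool_choice": {"type": "function", "function": {"name": "get_weather"}}
}
```

Note the tool_choice shape: the internal format has been converted to OpenAI's function-based form, as described above.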

Decoding

Uses the standard OpenAI Chat Completions response format:

  • Standard message structure with role/content
  • Tool calls in OpenAI's native format
  • Usage metrics: input_tokens, output_tokens, total_tokens
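A native (non-streaming) response in this format looks roughly like the sketch below (IDs and values illustrative). On the wire, OpenAI reports prompt_tokens and completion_tokens; these correspond to the input_tokens and output_tokens metrics listed above:

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "gpt-4o",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {"name": "get_weather", "arguments": "{\"city\":\"Paris\"}"}
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 8, "total_tokens": 20}
}
```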