# `Agentic.LLM.Transport.OpenAIChatCompletions`

Transport for the OpenAI Chat Completions wire format
(`POST {base_url}/chat/completions`).

This is the lingua franca of the OpenAI-compatible provider zoo:
OpenAI itself, OpenRouter, Groq, Together, Fireworks, Cerebras,
Mistral, DeepSeek, LM Studio, vLLM, … all speak it. The transport
knows nothing about any of those providers individually — the base
URL and any provider-specific headers are supplied via `opts`.
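For example, pointing the same transport at two different providers is purely a matter of `opts`. The option names below (`:base_url`, `:headers`) are illustrative assumptions, not the module's documented API:

```elixir
# Hypothetical opts for two OpenAI-compatible providers — the transport
# itself is identical in both cases; only the shim-supplied opts differ.
openrouter_opts = [
  base_url: "https://openrouter.ai/api/v1",
  headers: [{"HTTP-Referer", "https://example.com"}, {"X-Title", "My App"}]
]

groq_opts = [
  base_url: "https://api.groq.com/openai/v1",
  headers: []
]
```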

## What lives here vs. in a shim

The transport owns:

  * canonical params -> wire request body translation
    (messages, tools, tool_choice)
  * wire response -> `Agentic.LLM.Response` translation
    (choices/message/content -> content blocks,
     tool_calls -> `:tool_use` blocks,
     finish_reason -> `:end_turn | :tool_use | :max_tokens`)
  * rate-limit header parsing
  * HTTP error parsing into `Agentic.LLM.Error` with phase-1
    classification
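The `finish_reason` mapping above can be sketched as a set of function clauses. This is an illustrative sketch, not the module's actual implementation; the function and module names are invented for the example:

```elixir
# Maps the wire-format finish_reason strings to the canonical
# stop-reason atoms described above.
defmodule FinishReasonSketch do
  def to_stop_reason("stop"), do: :end_turn
  def to_stop_reason("tool_calls"), do: :tool_use
  def to_stop_reason("length"), do: :max_tokens
end

FinishReasonSketch.to_stop_reason("tool_calls")
#=> :tool_use
```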

The shim owns the API key, base URL, and any provider-specific
headers (`HTTP-Referer`, `X-Title`, …). It also performs the
actual `Req.post` call.
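A minimal shim might look like the sketch below, assuming the transport exposes a request-body builder (here called `build_request/1` — an assumed name, not a confirmed function) and targeting an OpenRouter-style provider:

```elixir
# Illustrative shim sketch: owns the API key, base URL, and
# provider-specific headers, and performs the Req.post call.
defmodule MyApp.OpenRouterShim do
  @base_url "https://openrouter.ai/api/v1"

  def post(params, api_key) do
    # build_request/1 is assumed; the transport owns the
    # canonical-params -> wire-body translation.
    body = Agentic.LLM.Transport.OpenAIChatCompletions.build_request(params)

    Req.post("#{@base_url}/chat/completions",
      json: body,
      headers: [
        {"authorization", "Bearer #{api_key}"},
        {"HTTP-Referer", "https://example.com"},
        {"X-Title", "My App"}
      ]
    )
  end
end
```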

---

*Consult [api-reference.md](api-reference.md) for a complete listing.*
