# `LlamaCppEx.ChatCompletion`
[🔗](https://github.com/nyo16/llama_cpp_ex/blob/main/lib/llama_cpp_ex/chat_completion.ex#L1)

An OpenAI-compatible chat completion response struct.

It mirrors the shape of `POST /v1/chat/completions` responses.

# `choice`

```elixir
@type choice() :: %{
  index: integer(),
  message: %{
    role: String.t(),
    content: String.t(),
    reasoning_content: String.t() | nil
  },
  finish_reason: String.t()
}
```
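As an illustration, a choice is a plain map, so `finish_reason` can be branched on with an ordinary `case`. The values below are hypothetical, shaped per `choice()` above:

```elixir
# Hypothetical choice map matching the choice() shape above.
choice = %{
  index: 0,
  message: %{role: "assistant", content: "Hello!", reasoning_content: nil},
  finish_reason: "stop"
}

# Distinguish a normal stop from a length-capped (truncated) reply.
case choice.finish_reason do
  "stop"   -> {:ok, choice.message.content}
  "length" -> {:truncated, choice.message.content}
  other    -> {:error, other}
end
# => {:ok, "Hello!"}
```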

# `t`

```elixir
@type t() :: %LlamaCppEx.ChatCompletion{
  choices: [choice()],
  created: integer(),
  id: String.t(),
  model: String.t(),
  object: String.t(),
  usage: usage()
}
```
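To pull the assistant's reply out of a full response, the struct can be pattern-matched directly. This is a sketch: it assumes `llama_cpp_ex` is available so the `LlamaCppEx.ChatCompletion` struct is defined, and the helper module and function names are hypothetical.

```elixir
defmodule ChatCompletionHelpers do
  # Hypothetical helper: returns the content of the first choice's
  # message, following the t() shape documented above.
  def first_content(%LlamaCppEx.ChatCompletion{
        choices: [%{message: %{content: content}} | _]
      }) do
    content
  end
end
```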

# `usage`

```elixir
@type usage() :: %{
  prompt_tokens: integer(),
  completion_tokens: integer(),
  total_tokens: integer()
}
```
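Since `usage()` is a plain map with atom keys, dot access works. In OpenAI-style responses `total_tokens` is the sum of the other two fields (assumed here; the typespec itself does not guarantee it):

```elixir
# Hypothetical usage map matching the usage() shape above.
usage = %{prompt_tokens: 12, completion_tokens: 34, total_tokens: 46}

usage.total_tokens == usage.prompt_tokens + usage.completion_tokens
# => true
```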

---

*Consult [api-reference.md](api-reference.md) for the complete listing.*
