# `Planck.AI.Model`
[🔗](https://github.com/alexdesousa/planck/blob/v0.1.0/lib/planck/ai/model.ex#L1)

Represents an LLM model and its metadata.

The `base_url` field is used for self-hosted or other OpenAI-compatible endpoints
(e.g. llama.cpp, vLLM, LM Studio). When it is `nil`, the provider's default endpoint is used.

## Examples

    iex> %Planck.AI.Model{
    ...>   id: "claude-sonnet-4-6",
    ...>   name: "Claude Sonnet 4.6",
    ...>   provider: :anthropic,
    ...>   context_window: 200_000,
    ...>   max_tokens: 8192,
    ...>   supports_thinking: true,
    ...>   input_types: [:text, :image]
    ...> }
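For a self-hosted, OpenAI-compatible backend, the struct might be built as in the sketch below; the model id, name, URL, and limits are illustrative assumptions, not defaults shipped by the library:

```elixir
# Hypothetical local llama.cpp server; every value here is an assumption.
iex> %Planck.AI.Model{
...>   id: "qwen2.5-coder",
...>   name: "Qwen 2.5 Coder (local)",
...>   provider: :llama_cpp,
...>   base_url: "http://localhost:8080/v1",
...>   context_window: 32_768,
...>   max_tokens: 4_096,
...>   supports_thinking: false,
...>   input_types: [:text]
...> }
```

Leaving `base_url` as `nil` keeps the provider's default endpoint, so hosted providers such as `:anthropic` typically omit it.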

# `cost`

```elixir
@type cost() :: %{
  input: float(),
  output: float(),
  cache_read: float(),
  cache_write: float()
}
```
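A `cost()` value is a plain Elixir map. The unit convention (e.g. USD per million tokens) is an assumption here, and the rates below are purely illustrative:

```elixir
# Hypothetical rates; assuming USD per million tokens (the unit is not
# specified by the type above).
cost = %{
  input: 3.0,
  output: 15.0,
  cache_read: 0.3,
  cache_write: 3.75
}

# A rough request-cost estimate under that unit assumption:
# 120k input tokens and 4k output tokens.
estimate = (120_000 * cost.input + 4_000 * cost.output) / 1_000_000
# => 0.42
```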

# `provider`

```elixir
@type provider() :: :anthropic | :openai | :google | :ollama | :llama_cpp
```

# `t`

```elixir
@type t() :: %Planck.AI.Model{
  api_key: String.t() | nil,
  base_url: String.t() | nil,
  context_window: pos_integer(),
  cost: cost(),
  default_opts: keyword(),
  id: String.t(),
  input_types: [:text | :image | :image_url | :file | :video_url],
  max_tokens: pos_integer(),
  name: String.t(),
  provider: provider(),
  supports_thinking: boolean()
}
```

# `providers`

```elixir
@spec providers() :: [provider()]
```

Returns the list of supported provider atoms, i.e. a list of `provider()` values.
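Assuming `providers/0` enumerates every atom in the `provider()` type (a reasonable reading of the spec, but an assumption), it can be used for membership checks when validating configuration:

```elixir
# Sketch: reject unknown providers before building a model struct.
iex> :anthropic in Planck.AI.Model.providers()
true
iex> :unknown_provider in Planck.AI.Model.providers()
false
```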

---

*Consult [api-reference.md](api-reference.md) for the complete listing.*
