# `Arcana.LLM`
[🔗](https://github.com/georgeguimaraes/arcana/blob/main/lib/arcana/llm.ex#L1)

Protocol for LLM adapters used by Arcana.

Arcana accepts any LLM that implements this protocol. Built-in implementations:

- Model strings via ReqLLM (e.g., `"openai:gpt-4o-mini"`, `"zai:glm-4.5-flash"`)
- Tuples of `{model_string, opts}` for passing options like `:api_key`
- Anonymous functions (for testing)

## Examples

    # Model string (requires req_llm)
    Arcana.ask("question", llm: "openai:gpt-4o-mini", repo: MyApp.Repo)

    # With options
    Arcana.ask("question", llm: {"zai:glm-4.7", api_key: "key"}, repo: MyApp.Repo)

    # Function (for testing)
    Arcana.ask("question", llm: fn _prompt -> {:ok, "answer"} end, repo: MyApp.Repo)

# `t`

```elixir
@type t() :: term()
```

All the types that implement this protocol.

# `complete`

Completes a prompt with the given context and options.

Returns `{:ok, response}` or `{:error, reason}`.
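
Because this is a protocol, you can also implement it for your own struct to plug in a custom adapter. A minimal sketch, assuming the callback receives the adapter struct, the prompt, and an options keyword list (the exact callback signature is an assumption here; check the protocol definition linked above — `MyApp.CannedLLM` is a hypothetical module):

```elixir
defmodule MyApp.CannedLLM do
  # Hypothetical test double: always returns a fixed answer.
  defstruct answer: "stub answer"
end

defimpl Arcana.LLM, for: MyApp.CannedLLM do
  # Assumed argument order: adapter struct, prompt, options.
  def complete(%MyApp.CannedLLM{answer: answer}, _prompt, _opts) do
    {:ok, answer}
  end
end
```

With an implementation like this in place, the struct can be passed anywhere an LLM is accepted, e.g. `Arcana.ask("question", llm: %MyApp.CannedLLM{}, repo: MyApp.Repo)`.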

---

*Consult [api-reference.md](api-reference.md) for complete listing*
