# `Agentic.LLM.Provider.Ollama`

Ollama provider — local-first chat and embeddings.

Uses `Agentic.LLM.Transport.Ollama`. The base URL defaults to
`http://localhost:11434` and may be overridden via the `OLLAMA_HOST`
environment variable. No API key is required; `Credentials.resolve/1`
returns a credential with `api_key: nil` for `:ollama`.
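
The base-URL resolution described above can be sketched as follows. This is a minimal illustration, not the transport's actual implementation; the module and function names here are assumptions for the example only.

```elixir
defmodule BaseUrlSketch do
  @moduledoc false
  # Hypothetical sketch of how the transport's default base URL could
  # be resolved: prefer OLLAMA_HOST when set, else the local default.
  @default "http://localhost:11434"

  def base_url do
    System.get_env("OLLAMA_HOST", @default)
  end
end
```

With no `OLLAMA_HOST` set, `BaseUrlSketch.base_url/0` returns `"http://localhost:11434"`; setting the variable overrides it.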

---

*Consult [api-reference.md](api-reference.md) for the complete listing.*
