# `Agentic.Protocol.LLM`

LLM protocol implementation that wraps callback-based LLM calls.

Rather than calling a provider API directly, this protocol delegates to
the existing `llm_chat` callback pattern. It exposes the same interface
as the other protocols while remaining compatible with current Agentic
integrations.

# `estimate_cost`

# `get_usage`

# `stream_message`

---

*Consult [api-reference.md](api-reference.md) for the complete listing.*
