# llm_core v0.3.0 - API Reference

## Modules

- [LlmCore.Agent.Components.BudgetGuard](LlmCore.Agent.Components.BudgetGuard.md): Enforces iteration budget limits.
- [LlmCore.Agent.Components.DispatchTools](LlmCore.Agent.Components.DispatchTools.md): Executes validated tool calls via the resolver function.
- [LlmCore.Agent.Components.InjectResults](LlmCore.Agent.Components.InjectResults.md): Builds the messages to append for the next LLM turn.
- [LlmCore.Agent.Components.LoopDecision](LlmCore.Agent.Components.LoopDecision.md): Determines whether the outer loop should continue or stop.
- [LlmCore.Agent.Components.ParseToolCalls](LlmCore.Agent.Components.ParseToolCalls.md): Extracts tool calls from the LLM response.
- [LlmCore.Agent.Components.ValidateCalls](LlmCore.Agent.Components.ValidateCalls.md): Validates parsed tool calls against available tool definitions.
- [LlmCore.Agent.Pipeline.Iteration](LlmCore.Agent.Pipeline.Iteration.md): ALF pipeline for processing a single agentic loop iteration.
- [LlmCore.Agent.Pipeline.ToolDispatch](LlmCore.Agent.Pipeline.ToolDispatch.md): ALF pipeline for orchestrated tool dispatch.
- [LlmCore.Agent.ToolDispatch.Components.BuildPlan](LlmCore.Agent.ToolDispatch.Components.BuildPlan.md): Evaluates the dispatch recipe to produce an execution plan.
- [LlmCore.Agent.ToolDispatch.Components.CollectResults](LlmCore.Agent.ToolDispatch.Components.CollectResults.md): Composer that collects parallel execution results back into a single event.
- [LlmCore.Agent.ToolDispatch.Components.ComposeOutput](LlmCore.Agent.ToolDispatch.Components.ComposeOutput.md): Formats collected results into a single consolidated output string.
- [LlmCore.Agent.ToolDispatch.Components.DirectResolve](LlmCore.Agent.ToolDispatch.Components.DirectResolve.md): Executes a tool call directly via the resolver function.
- [LlmCore.Agent.ToolDispatch.Components.ExecuteOneCall](LlmCore.Agent.ToolDispatch.Components.ExecuteOneCall.md): Executes a single parallel sub-tool call.
- [LlmCore.Agent.ToolDispatch.Components.ExecuteSerial](LlmCore.Agent.ToolDispatch.Components.ExecuteSerial.md): Executes serial tool call steps sequentially.
- [LlmCore.Agent.ToolDispatch.Components.FanOutParallel](LlmCore.Agent.ToolDispatch.Components.FanOutParallel.md): Composer that fans out one dispatch event into N parallel call events.
- [LlmCore.Agent.ToolDispatch.Components.ResolveStrategy](LlmCore.Agent.ToolDispatch.Components.ResolveStrategy.md): Determines the dispatch strategy for a tool call.
- [LlmCore.Agent.ToolDispatch.Event](LlmCore.Agent.ToolDispatch.Event.md): Event struct flowing through the ToolDispatch pipeline.
- [LlmCore.Config.Editor](LlmCore.Config.Editor.md): Helpers for reading and mutating `llm_core.toml` files.
- [LlmCore.Executor.Control](LlmCore.Executor.Control.md): Minimal execution control registry used by CLI providers to support HALT semantics.
- [LlmCore.LLM.Anthropic](LlmCore.LLM.Anthropic.md): Anthropic Claude API provider implementing `LlmCore.LLM.Provider`.
- [LlmCore.LLM.Appliance](LlmCore.LLM.Appliance.md): Generic local inference appliance provider (DGX Spark, future devices).
- [LlmCore.LLM.CLIPort](LlmCore.LLM.CLIPort.md): Helpers for running CLI-based LLM providers via `Port`.
- [LlmCore.LLM.CLIProvider.Config](LlmCore.LLM.CLIProvider.Config.md): Configuration for a CLI-based LLM provider.
- [LlmCore.LLM.Messages](LlmCore.LLM.Messages.md): Normalizes prompts into the chat message format used by API providers.
- [LlmCore.LLM.Native](LlmCore.LLM.Native.md): In-process agentic provider — runs the agent loop inside the BEAM VM.
- [LlmCore.LLM.Native.Router](LlmCore.LLM.Native.Router.md): Config-driven provider resolution for the Native agentic loop.
- [LlmCore.LLM.Ollama](LlmCore.LLM.Ollama.md): Ollama provider implementing the `LlmCore.LLM.Provider` behaviour.
- [LlmCore.LLM.OpenAI](LlmCore.LLM.OpenAI.md): OpenAI-compatible API provider implementing the Provider behaviour.
- [LlmCore.LLM.SSEParser](LlmCore.LLM.SSEParser.md): Helper module for parsing Server-Sent Events (SSE) from LLM APIs.
- [LlmCore.Memory.Hindsight.Cache](LlmCore.Memory.Hindsight.Cache.md): Smart caching layer for Hindsight operations.
- [LlmCore.Memory.Hindsight.CircuitBreaker](LlmCore.Memory.Hindsight.CircuitBreaker.md): Circuit breaker pattern for Hindsight MCP connections.
- [LlmCore.Memory.Hindsight.Config](LlmCore.Memory.Hindsight.Config.md): Hindsight-specific configuration with multi-level precedence.
- [LlmCore.Memory.Hindsight.Discovery](LlmCore.Memory.Hindsight.Discovery.md): Auto-discovery of Hindsight API endpoints.
- [LlmCore.Memory.Hindsight.Retry](LlmCore.Memory.Hindsight.Retry.md): Retry logic with exponential backoff for Hindsight operations.
- [LlmCore.Memory.Hindsight.Supervisor](LlmCore.Memory.Hindsight.Supervisor.md): Supervisor for Hindsight MCP integration components.
- [LlmCore.Memory.Hindsight.WriteBuffer](LlmCore.Memory.Hindsight.WriteBuffer.md): Write-behind buffer for Hindsight retain operations.
- [LlmCore.Paths](LlmCore.Paths.md): Cross-project path helpers for llm_core.
- [LlmCore.Pipelines.InferencePipeline.Context](LlmCore.Pipelines.InferencePipeline.Context.md): Carries inference data through the ALF inference pipeline.
- [LlmCore.Pipelines.RoutingPipeline.Context](LlmCore.Pipelines.RoutingPipeline.Context.md): Internal state passed through the routing pipeline stages.
- [LlmCore.Structured.InstructorAdapter](LlmCore.Structured.InstructorAdapter.md): Placeholder adapter returned when the optional `Instructor` dependency is not available.
- [LlmCore.Structured.JsonMode](LlmCore.Structured.JsonMode.md): Helpers for working with providers that support JSON-mode outputs.
- [LlmCore.Structured.Validator](LlmCore.Structured.Validator.md): Normalizes schema declarations and validates decoded structured data.
- [LlmCore.Telemetry](LlmCore.Telemetry.md): Helper utilities for emitting telemetry events from llm_core.
- [LlmCore.Telemetry.Logger](LlmCore.Telemetry.Logger.md): Simple telemetry handler that logs llm_core pipeline events.
- [LlmCore.Tool.Codec](LlmCore.Tool.Codec.md): Translates between provider-neutral tool structs and provider-specific wire formats.
- [LlmCore.Tool.Validator](LlmCore.Tool.Validator.md): Validates tool call arguments against a tool's JSON Schema parameters.

- Public API
  - [LlmCore](LlmCore.md): Public facade for LlmCore capabilities.
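
  As a hedged sketch of how the facade might be called — the function name `complete/2`, the option keys, and the `:text` field are illustrative assumptions, not confirmed by this index; see the [LlmCore](LlmCore.md) module docs for the real API:

  ```elixir
  # Hypothetical facade usage. `complete/2`, the `:provider` option, and
  # `response.text` are assumptions for illustration only.
  {:ok, response} = LlmCore.complete("Summarize the release notes", provider: :ollama)
  IO.puts(response.text)
  ```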

- Providers
  - [LlmCore.LLM.CLIProvider](LlmCore.LLM.CLIProvider.md): Universal CLI-based LLM provider.
  - [LlmCore.LLM.Error](LlmCore.LLM.Error.md): Standardized error struct for all LLM providers.
  - [LlmCore.LLM.Provider](LlmCore.LLM.Provider.md): Behaviour module defining the contract for LLM providers.
  - [LlmCore.LLM.Response](LlmCore.LLM.Response.md): Standardized response struct for all LLM providers.
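
  Because `LlmCore.LLM.Provider` is a behaviour, a custom provider implements its callbacks and returns the standardized `LlmCore.LLM.Response` or `LlmCore.LLM.Error` structs. The skeleton below is a sketch: the callback name/arity and the struct fields are assumptions — only the module names come from this reference.

  ```elixir
  # Hypothetical provider implementing the LlmCore.LLM.Provider behaviour.
  # Callback name/arity and struct fields are illustrative assumptions.
  defmodule MyApp.EchoProvider do
    @behaviour LlmCore.LLM.Provider

    @impl true
    def complete(prompt, _opts) when is_binary(prompt) do
      # Echo the prompt back as a standardized response struct.
      {:ok, %LlmCore.LLM.Response{text: prompt}}
    end

    def complete(_prompt, _opts) do
      {:error, %LlmCore.LLM.Error{reason: :invalid_prompt}}
    end
  end
  ```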

- Routing
  - [LlmCore.Router](LlmCore.Router.md): GenServer that resolves task types to full LLM agent configurations.
  - [LlmCore.Router.ResolvedRoute](LlmCore.Router.ResolvedRoute.md): Fully realized routing decision containing the agent metadata.
  - [LlmCore.Router.RouteEntry](LlmCore.Router.RouteEntry.md): Represents a routing rule entry mapping a task type to an agent alias.
  - [LlmCore.Router.RoutingTable](LlmCore.Router.RoutingTable.md): In-memory representation of the routing rules loaded from YAML.
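
  A routing lookup presumably maps a task type to a `ResolvedRoute`; the call below is a hypothetical sketch (the function name `resolve/1` and the task-type atom are assumptions):

  ```elixir
  # Hypothetical lookup against the Router GenServer — resolve/1 is an
  # assumed function name; the return shape mirrors the structs listed above.
  {:ok, %LlmCore.Router.ResolvedRoute{} = route} = LlmCore.Router.resolve(:code_review)
  ```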

- Configuration
  - [LlmCore.Config.Loader](LlmCore.Config.Loader.md): Loads llm_core configuration files (routing, providers, etc.) from disk and updates the store.
  - [LlmCore.Config.Store](LlmCore.Config.Store.md): Lightweight ETS-backed storage for runtime configuration.
  - [LlmCore.Config.Watcher](LlmCore.Config.Watcher.md): Watches configuration directories for changes and triggers reloads.
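
  The configuration flow described above (Loader reads files from disk, Store holds them in ETS) might be exercised like this — every function name and argument here is an assumption, shown only to convey the division of responsibilities:

  ```elixir
  # Hypothetical sketch: Loader populates the ETS-backed Store from disk;
  # load_all/1 and get/2 are assumed names, not confirmed API.
  :ok = LlmCore.Config.Loader.load_all("config/llm_core")
  providers = LlmCore.Config.Store.get(:providers, %{})
  ```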

- Agent Loop
  - [LlmCore.Agent](LlmCore.Agent.md): Agent struct representing a registered LLM provider with a human-friendly name.
  - [LlmCore.Agent.Context](LlmCore.Agent.Context.md): Carries data through the agentic iteration pipeline.
  - [LlmCore.Agent.Loop](LlmCore.Agent.Loop.md): Agentic tool-calling loop.
  - [LlmCore.Agent.Registry](LlmCore.Agent.Registry.md): GenServer for runtime agent management and discovery.

- Structured Output
  - [LlmCore.Structured](LlmCore.Structured.md): Utilities for extracting structured data from LLM responses.
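
  A hedged sketch of structured extraction — the function name `extract/2` and the `:schema` option are assumptions; only the module name and its purpose come from this reference:

  ```elixir
  # Hypothetical structured-output extraction from a raw LLM response string.
  # extract/2 and the :schema option are illustrative assumptions.
  schema = %{"type" => "object", "properties" => %{"title" => %{"type" => "string"}}}
  raw = ~s({"title": "Release 0.3.0"})
  {:ok, %{"title" => _title}} = LlmCore.Structured.extract(raw, schema: schema)
  ```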

- Memory
  - [LlmCore.Memory.Hindsight](LlmCore.Memory.Hindsight.md): Hindsight 0.4+ integration for semantic memory capabilities.

- Registries
  - [LlmCore.CLIProvider.Registry](LlmCore.CLIProvider.Registry.md): Query surface for CLI-based LLM providers.
  - [LlmCore.Provider.Definition](LlmCore.Provider.Definition.md): Normalized provider metadata loaded from TOML configuration and runtime discovery.
  - [LlmCore.Provider.Registry](LlmCore.Provider.Registry.md): In-memory accessors for provider definitions loaded from TOML configuration.

- Pipelines
  - [LlmCore.Pipelines.InferencePipeline](LlmCore.Pipelines.InferencePipeline.md): ALF pipeline that normalizes a request, resolves routing, and dispatches it to the selected provider using either blocking or streaming mode.
  - [LlmCore.Pipelines.MemoryPipeline](LlmCore.Pipelines.MemoryPipeline.md): ALF pipeline orchestrating all Hindsight memory operations (retain, recall, reflect). It centralizes caching, circuit breaker gating, retries, and async buffering to match the architecture requirements of llm_core.
  - [LlmCore.Pipelines.RoutingPipeline](LlmCore.Pipelines.RoutingPipeline.md): ALF pipeline that resolves task types to provider/agent configurations.

## Mix Tasks

- [mix llm_core.bench](Mix.Tasks.LlmCore.Bench.md): Runs ALF routing and inference pipeline benchmarks.
- [mix llm_core.config.set](Mix.Tasks.LlmCore.Config.Set.md): Mutates `llm_core.toml` entries and reloads the runtime configuration.
- [mix llm_core.config.show](Mix.Tasks.LlmCore.Config.Show.md): Displays llm_core configuration loaded from `llm_core.toml` (merged with overrides).
- [mix llm_core.config.validate](Mix.Tasks.LlmCore.Config.Validate.md): Validates `llm_core.toml` configuration and prints provider availability.
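
The task names above can be invoked directly; the argument syntax shown for `config.set` is a guess, since this index does not document flags:

```
# Inspect and validate the merged llm_core.toml configuration.
mix llm_core.config.show
mix llm_core.config.validate

# Hypothetical key/value syntax for mutating an entry (illustrative only):
mix llm_core.config.set providers.ollama.base_url http://localhost:11434
```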

