Arcana.LLM protocol (Arcana v1.3.3)


Protocol for LLM adapters used by Arcana.

Arcana accepts any LLM that implements this protocol. Built-in implementations:

  • Model strings via Req.LLM (e.g., "openai:gpt-4o-mini", "zai:glm-4.5-flash")
  • Tuples of {model_string, opts} for passing options like :api_key
  • Anonymous functions (for testing)

Examples

# Model string (requires req_llm)
Arcana.ask("question", llm: "openai:gpt-4o-mini", repo: MyApp.Repo)

# With options
Arcana.ask("question", llm: {"zai:glm-4.7", api_key: "key"}, repo: MyApp.Repo)

# Function (for testing)
Arcana.ask("question", llm: fn _prompt -> {:ok, "answer"} end, repo: MyApp.Repo)

Summary

Types

t()

All the types that implement this protocol.

Functions

complete(llm, prompt, context, opts)

Completes a prompt with the given context and options.

Types

t()

@type t() :: term()

All the types that implement this protocol.

Functions

complete(llm, prompt, context, opts)

Completes a prompt with the given context and options.

Returns {:ok, response} or {:error, reason}.
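
Beyond the built-in implementations, any term can become an adapter by implementing the protocol directly. A minimal sketch, assuming the `arcana` package is a dependency and using the `complete/4` signature above; the `MyApp.EchoLLM` struct and its `:prefix` field are hypothetical, invented for illustration:

```elixir
defmodule MyApp.EchoLLM do
  # Hypothetical adapter struct; holds any config the adapter needs.
  defstruct prefix: "echo: "
end

defimpl Arcana.LLM, for: MyApp.EchoLLM do
  # Completes a prompt by echoing it back with a prefix,
  # ignoring the retrieval context and options.
  def complete(%MyApp.EchoLLM{prefix: prefix}, prompt, _context, _opts) do
    {:ok, prefix <> prompt}
  end
end
```

With this in place, passing `llm: %MyApp.EchoLLM{}` to `Arcana.ask/2` would route completions through this `complete/4`, returning `{:ok, response}` on success.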