LLMDB (LLM DB v2025.12.4)
Fast, persistent_term-backed LLM model metadata catalog.
Provides a simple, capability-aware API for querying LLM model metadata.
All queries are backed by :persistent_term for O(1), lock-free access.
Model Specs
Model specifications can be expressed in multiple formats:
"provider:model"(e.g.,"openai:gpt-4o-mini") - Traditional colon format"model@provider"(e.g.,"gpt-4o-mini@openai") - Filesystem-safe @ format{:provider, "model"}(e.g.,{:openai, "gpt-4o-mini"}) - Tuple format
See the Model Spec Formats guide for detailed information on when to use each format.
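To illustrate, all three formats resolve to the same {provider, model_id} tuple via parse/1 (a sketch using the API documented on this page):

```elixir
# Each spec format below refers to the same model:
{:ok, {:openai, "gpt-4o-mini"}} = LLMDB.parse("openai:gpt-4o-mini")
{:ok, {:openai, "gpt-4o-mini"}} = LLMDB.parse("gpt-4o-mini@openai")
{:ok, {:openai, "gpt-4o-mini"}} = LLMDB.parse({:openai, "gpt-4o-mini"})
```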
Two Phases
Phase 1 - Build Time (Mix tasks):
mix llm_db.pull - Pull sources and run the ETL pipeline to generate snapshot.json
- This is a development/CI operation that builds the complete catalog
Phase 2 - Runtime (Consumer library):
load/1 - Load the packaged snapshot into the Store with optional filtering
- Query functions to select models by capabilities
- All queries operate on the filtered catalog loaded in Store
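As a sketch, a consumer application (MyApp here is hypothetical) would typically run the Phase 2 load once at startup:

```elixir
defmodule MyApp.Application do
  use Application

  # MyApp is a hypothetical consumer; the load/1 options mirror the
  # filtering configuration described under load/1 below.
  def start(_type, _args) do
    {:ok, _snapshot} = LLMDB.load(allow: [:openai, :anthropic])
    Supervisor.start_link([], strategy: :one_for_one, name: MyApp.Supervisor)
  end
end
```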
Providers
providers/0 - Get all providers as a list of Provider structs
provider/1 - Get a specific provider by ID
Models
models/0 - Get all models as a list of Model structs
models/1 - Get all models for a provider
model/1 - Parse a "provider:model" spec and get the model
model/2 - Get a specific model by provider and ID
Selection and Policy
select/1 - Select the first model matching capability requirements
candidates/1 - Get all models matching capability requirements
allowed?/1 - Check if a model is in the filtered catalog
capabilities/1 - Get the capabilities map for a model
Utilities
parse/1,2 - Parse a model spec string (colon or @ format) into a {provider, model_id} tuple
parse!/1,2 - Parse a model spec string, raising on error
format/1,2 - Format a {provider, model_id} tuple as a string
build/1,2 - Build a spec string from various inputs, converting between formats
Examples
# Get all providers
providers = LLMDB.providers()
# Get a specific provider
{:ok, provider} = LLMDB.provider(:openai)
# Get all models for a provider
models = LLMDB.models(:openai)
# Get a specific model
{:ok, model} = LLMDB.model(:openai, "gpt-4o-mini")
# Parse spec and get model
{:ok, model} = LLMDB.model("openai:gpt-4o-mini")
# Select a model matching requirements
{:ok, {:openai, "gpt-4o-mini"}} = LLMDB.select(
require: [chat: true, tools: true, json_native: true],
prefer: [:openai, :anthropic]
)
# Check if a model is allowed
true = LLMDB.allowed?({:openai, "gpt-4o-mini"})
Summary
Functions
Returns true if the model is allowed by current filters.
Builds a model specification string from various inputs.
Returns all models matching capability requirements.
Gets capabilities for a model spec.
Formats a model spec tuple as a string.
Loads or reloads the LLM model catalog.
Loads an empty catalog with no providers or models.
Parses model spec string and returns the model.
Returns a specific model by provider and ID (filtered).
Returns all models across all providers (filtered).
Returns all models for a specific provider (filtered).
Parses a model spec string into a {provider, model_id} tuple.
Parses a model spec string, raising on error.
Returns a specific provider by ID.
Returns all providers as a list of Provider structs.
Selects the first model matching capability requirements.
Types
Functions
@spec allowed?(model_spec()) :: boolean()
Returns true if the model is allowed by current filters.
Checks if the model is present in the filtered snapshot loaded in Store.
Parameters
spec - Either a %Model{} struct, a {provider, model_id} tuple, or a "provider:model" string
Returns
true if model is in filtered catalog, false otherwise
Examples
true = LLMDB.allowed?({:openai, "gpt-4o-mini"})
true = LLMDB.allowed?("openai:gpt-4o-mini")
{:ok, model} = LLMDB.model(:openai, "gpt-4o-mini")
true = LLMDB.allowed?(model)
Builds a model specification string from various inputs.
Accepts strings (in any supported format) or tuples and outputs a string in the desired format. Useful for converting between formats.
Parameters
input - Model spec as a string or tuple
opts - Keyword list with optional :format for the output format
Examples
"gpt-4@openai" = LLMDB.build("openai:gpt-4", format: :filename_safe)
"openai:gpt-4" = LLMDB.build("gpt-4@openai", format: :provider_colon_model)
"gpt-4@openai" = LLMDB.build({:openai, "gpt-4"}, format: :model_at_provider)
Returns all models matching capability requirements.
Delegates to LLMDB.Query.candidates/1.
Options
:require - Keyword list of required capabilities
:forbid - Keyword list of forbidden capabilities
:prefer - List of provider atoms in preference order
:scope - Either :all (default) or a specific provider atom
Returns
List of {provider, model_id} tuples matching the criteria.
Examples
candidates = LLMDB.candidates(
require: [chat: true, tools: true],
prefer: [:openai, :anthropic]
)
@spec capabilities(model_spec()) :: map() | nil
Gets capabilities for a model spec.
Delegates to LLMDB.Query.capabilities/1.
Parameters
spec - Either a {provider, model_id} tuple, a "provider:model" string, or a %Model{} struct
Examples
caps = LLMDB.capabilities({:openai, "gpt-4o-mini"})
#=> %{chat: true, tools: %{enabled: true, ...}, ...}
Formats a model spec tuple as a string.
Converts a {provider, model_id} tuple to string format. The output format can be
controlled via the format parameter or falls back to the application config
:llm_db, :model_spec_format (default: :provider_colon_model).
Parameters
spec - {provider, model_id} tuple
format - Optional format override (atom)
Supported Formats
:provider_colon_model - "provider:model" (default)
:model_at_provider - "model@provider" (filename-safe)
:filename_safe - alias for :model_at_provider
Examples
"openai:gpt-4o-mini" = LLMDB.format({:openai, "gpt-4o-mini"})
"gpt-4o-mini@openai" = LLMDB.format({:openai, "gpt-4o-mini"}, :filename_safe)
"gpt-4o-mini@openai" = LLMDB.format({:openai, "gpt-4o-mini"}, :model_at_provider)
Loads or reloads the LLM model catalog.
Phase 2 operation: Loads the packaged snapshot into runtime Store with optional filtering and customization based on consumer configuration.
This function is idempotent - calling it multiple times with the same configuration will not reload the catalog unnecessarily.
Options
Consumer configuration options (override config :llm_db, ... settings):
:allow - :all, a list of providers ([:openai]), or a map (%{openai: :all | [patterns]})
:deny - List of providers ([:provider]) or a map (%{provider: [patterns]})
:prefer - List of provider atoms in preference order
:custom - Map with provider IDs as keys and provider configs (with models) as values
Returns
{:ok, snapshot} - Successfully loaded the catalog
{:error, :no_snapshot} - No packaged snapshot available
{:error, term} - Other loading errors
Examples
# Load with default configuration from app env
{:ok, _snapshot} = LLMDB.load()
# Load with provider filter
{:ok, _snapshot} = LLMDB.load(allow: [:openai, :anthropic])
# Load with model pattern filters
{:ok, _snapshot} = LLMDB.load(
allow: %{openai: ["gpt-4*"], anthropic: :all},
deny: %{openai: ["gpt-4-0613"]},
prefer: [:anthropic, :openai]
)
# Load with custom providers/models
{:ok, _snapshot} = LLMDB.load(
custom: %{
local: [
name: "Local Provider",
base_url: "http://localhost:8080",
models: %{
"llama-3" => %{capabilities: %{chat: true}},
"mistral-7b" => %{capabilities: %{chat: true, tools: %{enabled: true}}}
}
]
}
)
Loads an empty catalog with no providers or models.
Used as a fallback when no packaged snapshot is available,
allowing the application to start successfully. The catalog can
later be populated via load/1 once a snapshot is available.
Examples
LLMDB.load_empty()
#=> {:ok, %{providers: [], models: %{}, ...}}
@spec model(String.t()) :: {:ok, LLMDB.Model.t()} | {:error, term()}
Parses model spec string and returns the model.
Supports both "provider:model" and "model@provider" formats.
Parameters
spec - Model spec string like "openai:gpt-4o-mini" or "gpt-4o-mini@openai"
Returns
{:ok, model} - Model found
{:error, term} - Parse error or model not found
Examples
{:ok, model} = LLMDB.model("openai:gpt-4o-mini")
{:ok, model} = LLMDB.model("gpt-4o-mini@openai")
{:ok, model} = LLMDB.model("anthropic:claude-3-5-sonnet-20241022")
@spec model(provider(), model_id()) :: {:ok, LLMDB.Model.t()} | {:error, term()}
Returns a specific model by provider and ID (filtered).
Parameters
provider - Provider atom (e.g., :openai)
model_id - Model ID string (e.g., "gpt-4o-mini")
Returns
{:ok, model} - Model found
{:error, term} - Model not found
Examples
{:ok, model} = LLMDB.model(:openai, "gpt-4o-mini")
@spec models() :: [LLMDB.Model.t()]
Returns all models across all providers (filtered).
Examples
models = LLMDB.models()
#=> [%LLMDB.Model{}, ...]
@spec models(provider()) :: [LLMDB.Model.t()]
Returns all models for a specific provider (filtered).
Parameters
provider - Provider atom (e.g., :openai, :anthropic)
Returns
List of Model structs for the provider, or empty list if provider not found.
Examples
models = LLMDB.models(:openai)
#=> [%LLMDB.Model{id: "gpt-4o", ...}, ...]
@spec parse(String.t() | {provider(), model_id()}, keyword()) :: {:ok, {provider(), model_id()}} | {:error, term()}
Parses a model spec string into a {provider, model_id} tuple.
Supports both "provider:model" (default) and "model@provider" (filename-safe) formats. Automatically detects the format based on separator present.
Parameters
spec - String like "openai:gpt-4o-mini" or "gpt-4o-mini@openai", or a tuple {:openai, "gpt-4o-mini"}
opts - Keyword list with optional :format to explicitly specify :colon or :at
Returns
{:ok, {provider, model_id}} - Successfully parsed spec
{:error, term} - Invalid spec format
Examples
{:ok, {:openai, "gpt-4o-mini"}} = LLMDB.parse("openai:gpt-4o-mini")
{:ok, {:openai, "gpt-4o-mini"}} = LLMDB.parse("gpt-4o-mini@openai")
{:ok, {:anthropic, "claude-3-5-sonnet-20241022"}} = LLMDB.parse("anthropic:claude-3-5-sonnet-20241022")
{:ok, {:openai, "gpt-4o"}} = LLMDB.parse({:openai, "gpt-4o"})
# With explicit format when ambiguous
{:ok, {:openai, "model@test"}} = LLMDB.parse("openai:model@test", format: :colon)
Parses a model spec string, raising on error.
Same as parse/2 but raises ArgumentError instead of returning error tuple.
Examples
{:openai, "gpt-4o-mini"} = LLMDB.parse!("openai:gpt-4o-mini")
{:openai, "gpt-4o-mini"} = LLMDB.parse!("gpt-4o-mini@openai")
@spec provider(provider()) :: {:ok, LLMDB.Provider.t()} | {:error, term()}
Returns a specific provider by ID.
Parameters
provider - Provider atom (e.g., :openai, :anthropic)
Returns
{:ok, provider} - Provider found
{:error, term} - Provider not found
Examples
{:ok, provider} = LLMDB.provider(:openai)
@spec providers() :: [LLMDB.Provider.t()]
Returns all providers as a list of Provider structs.
Examples
providers = LLMDB.providers()
#=> [%LLMDB.Provider{id: :anthropic, ...}, ...]
Selects the first model matching capability requirements.
Delegates to LLMDB.Query.select/1.
Options
:require - Keyword list of required capabilities
:forbid - Keyword list of forbidden capabilities
:prefer - List of provider atoms in preference order
:scope - Either :all (default) or a specific provider atom
Returns
{:ok, {provider, model_id}} - First matching model
{:error, :no_match} - No models match the criteria
Examples
{:ok, {provider, model_id}} = LLMDB.select(
require: [chat: true, tools: true],
prefer: [:openai, :anthropic]
)