LLMDB.Sources.OpenAI (LLM DB v2026.3.0)


Remote source for OpenAI models (https://api.openai.com/v1/models).

  • pull/1 fetches model data from the OpenAI API and caches it locally
  • load/1 reads from the cached file (no network call)

Options

  • :url - API endpoint (default: "https://api.openai.com/v1/models")
  • :api_key - OpenAI API key (required, or set OPENAI_API_KEY env var)
  • :organization - Optional OpenAI organization ID
  • :project - Optional OpenAI project ID
  • :req_opts - Additional Req options for testing
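The options above can be assembled as a keyword list. The sketch below is illustrative only: the exact shape pull/1 expects and returns is an assumption, so the call itself is left commented out.

```elixir
# Options for LLMDB.Sources.OpenAI.pull/1, built as a keyword list.
# The :organization value here is a hypothetical placeholder.
opts = [
  api_key: System.get_env("OPENAI_API_KEY"),
  # organization: "org-example",  # optional OpenAI organization ID
  req_opts: []                    # extra Req options, useful in tests
]

# {:ok, data} = LLMDB.Sources.OpenAI.pull(opts)  # requires network + valid key
```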

Configuration

The cache directory can be configured in application config:

config :llm_db,
  openai_cache_dir: "priv/llm_db/remote"

Default: "priv/llm_db/remote"

Usage

# Pull remote data and cache (requires API key)
mix llm_db.pull --source openai

# Load from cache
{:ok, data} = OpenAI.load(%{})

Summary

Functions

transform(content)

Transforms an OpenAI API response into the canonical Zoi format.

Functions

transform(content)

Transforms an OpenAI API response into the canonical Zoi format.

Input Format (OpenAI)

{
  "object": "list",
  "data": [
    {
      "id": "gpt-4",
      "object": "model",
      "created": 1686935002,
      "owned_by": "openai"
    }
  ]
}

Output Format (Canonical Zoi)

%{
  "openai" => %{
    id: :openai,
    name: "OpenAI",
    models: [
      %{
        id: "gpt-4",
        provider: :openai,
        extra: %{
          created: 1686935002,
          owned_by: "openai"
        }
      }
    ]
  }
}
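The mapping between the two formats above can be sketched as a small pure function. Note this is an illustrative reimplementation of the documented behavior, not the module's actual transform/1:

```elixir
# Sketch of the OpenAI -> canonical Zoi transformation described above.
# Hypothetical reimplementation; LLMDB.Sources.OpenAI.transform/1 may differ.
defmodule TransformSketch do
  def transform(%{"object" => "list", "data" => models}) do
    %{
      "openai" => %{
        id: :openai,
        name: "OpenAI",
        models:
          Enum.map(models, fn %{"id" => id, "created" => created, "owned_by" => owner} ->
            %{id: id, provider: :openai, extra: %{created: created, owned_by: owner}}
          end)
      }
    }
  end
end
```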