Remote source for OpenAI models (https://api.openai.com/v1/models).
- `pull/1` - fetches data from the OpenAI API and caches it locally
- `load/1` - reads from the cached file (no network call)
## Options

- `:url` - API endpoint (default: `"https://api.openai.com/v1/models"`)
- `:api_key` - OpenAI API key (required, or set the `OPENAI_API_KEY` env var)
- `:organization` - optional OpenAI organization ID
- `:project` - optional OpenAI project ID
- `:req_opts` - additional Req options for testing
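As a rough sketch, a `pull/1` call passing these options might look like the following. The `OpenAI` alias mirrors the `load/1` example in the Usage section; the exact option handling inside `pull/1` is an assumption, not shown in these docs:

```elixir
# Hypothetical pull/1 invocation using the documented options.
# :api_key is required; :organization and :req_opts are optional.
{:ok, data} =
  OpenAI.pull(%{
    api_key: System.fetch_env!("OPENAI_API_KEY"),
    organization: "org-placeholder",  # optional OpenAI organization ID
    req_opts: [retry: false]          # extra Req options, e.g. for testing
  })
```

If `:api_key` is omitted, the `OPENAI_API_KEY` environment variable is consulted instead, per the option description above.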
## Configuration

The cache directory can be configured in application config:

```elixir
config :llm_db,
  openai_cache_dir: "priv/llm_db/remote"
```

Default: `"priv/llm_db/remote"`
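A minimal sketch of how the library might read this setting at runtime, using the standard `Application.get_env/3` with the documented default (the lookup itself is an assumption about the implementation):

```elixir
# Falls back to the documented default when :openai_cache_dir is unset.
cache_dir = Application.get_env(:llm_db, :openai_cache_dir, "priv/llm_db/remote")
```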
## Usage

```shell
# Pull remote data and cache it (requires an API key)
mix llm_db.pull --source openai
```

```elixir
# Load from cache
{:ok, data} = OpenAI.load(%{})
```
## Functions
Transforms OpenAI API response to canonical Zoi format.
### Input Format (OpenAI)

```json
{
  "object": "list",
  "data": [
    {
      "id": "gpt-4",
      "object": "model",
      "created": 1686935002,
      "owned_by": "openai"
    }
  ]
}
```

### Output Format (Canonical Zoi)
```elixir
%{
  "openai" => %{
    id: :openai,
    name: "OpenAI",
    models: [
      %{
        id: "gpt-4",
        provider: :openai,
        extra: %{
          created: 1686935002,
          owned_by: "openai"
        }
      }
    ]
  }
}
```
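The mapping between the two formats above can be sketched as a pure function. The module and function names here are assumptions for illustration; the docs describe the behavior but do not show the implementation:

```elixir
defmodule OpenAITransformSketch do
  @moduledoc false

  # Transforms the OpenAI list response into the canonical Zoi map:
  # provider keyed by "openai", with each raw model reshaped so that
  # "created" and "owned_by" land under :extra.
  def transform(%{"object" => "list", "data" => models}) do
    %{
      "openai" => %{
        id: :openai,
        name: "OpenAI",
        models: Enum.map(models, &to_model/1)
      }
    }
  end

  defp to_model(%{"id" => id} = raw) do
    %{
      id: id,
      provider: :openai,
      extra: %{
        created: raw["created"],
        owned_by: raw["owned_by"]
      }
    }
  end
end
```

Applied to the Input Format example, this sketch produces a map shaped like the Output Format example above.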