Generic OpenAI-compatible provider.
Works with any API that implements the OpenAI chat completions format: DeepSeek, Mistral, xAI/Grok, Ollama, OpenRouter, Together, Groq, etc.
Config
Required:

  * `:api_url` - Base URL (e.g., "https://api.deepseek.com", "https://api.mistral.ai", "https://api.x.ai", "http://localhost:11434")
  * `:model` - Model name
Optional:

  * `:api_key` - API key (omit for local providers like Ollama)
  * `:max_tokens` - Max output tokens (default: 4096)
  * `:system_prompt` - System prompt string
  * `:chat_path` - Path to the completions endpoint (default: "/v1/chat/completions")
  * `:extra_headers` - Additional headers as `[{name, value}]`
  * `:req_options` - Additional options passed to Req
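As a sketch of how the optional keys combine, here is a hypothetical OpenRouter configuration. The URL, model name, and header name are illustrative assumptions, not tested values:

```elixir
# Hypothetical OpenRouter setup exercising the optional keys above.
# OpenRouter serves chat completions under /api/v1/chat/completions,
# so the default :chat_path works with this base URL (assumption).
Alloy.run("Hello",
  provider:
    {Alloy.Provider.OpenAICompat,
     api_key: System.get_env("OPENROUTER_API_KEY"),
     api_url: "https://openrouter.ai/api",
     model: "deepseek/deepseek-chat",
     max_tokens: 2048,
     system_prompt: "You are a concise assistant.",
     # Extra header names here are illustrative
     extra_headers: [{"HTTP-Referer", "https://example.com"}],
     # :req_options is forwarded to Req, e.g. to raise the receive timeout
     req_options: [receive_timeout: 60_000]}
)
```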
Examples
    # DeepSeek
    Alloy.run("Hello",
      provider:
        {Alloy.Provider.OpenAICompat,
         api_key: System.get_env("DEEPSEEK_API_KEY"),
         api_url: "https://api.deepseek.com",
         model: "deepseek-chat"}
    )
    # Ollama (no API key)
    Alloy.run("Hello",
      provider:
        {Alloy.Provider.OpenAICompat,
         api_url: "http://localhost:11434",
         model: "llama4"}
    )
    # xAI (chat completions compatibility)
    Alloy.run("Hello",
      provider:
        {Alloy.Provider.OpenAICompat,
         api_key: System.get_env("XAI_API_KEY"),
         api_url: "https://api.x.ai",
         model: "grok-code-fast-1"}
    )