Alloy.Provider.OpenAICompat (alloy v0.10.1)


Generic OpenAI-compatible provider.

Works with any API that implements the OpenAI chat completions format: DeepSeek, Mistral, xAI/Grok, Ollama, OpenRouter, Together, Groq, etc.

Config

Required:

  • :api_url - Base URL of the API
  • :model - Model identifier string

Optional:

  • :api_key - API key (omit for local providers like Ollama)
  • :max_tokens - Max output tokens (default: 4096)
  • :system_prompt - System prompt string
  • :chat_path - Path to completions endpoint (default: "/v1/chat/completions")
  • :extra_headers - Additional headers as [{name, value}]
  • :req_options - Additional options passed to Req

Examples

# DeepSeek
Alloy.run("Hello",
  provider: {Alloy.Provider.OpenAICompat,
    api_key: System.get_env("DEEPSEEK_API_KEY"),
    api_url: "https://api.deepseek.com",
    model: "deepseek-chat"
  }
)

# Ollama (no API key)
Alloy.run("Hello",
  provider: {Alloy.Provider.OpenAICompat,
    api_url: "http://localhost:11434",
    model: "llama4"
  }
)

# xAI chat completions compatibility
Alloy.run("Hello",
  provider: {Alloy.Provider.OpenAICompat,
    api_key: System.get_env("XAI_API_KEY"),
    api_url: "https://api.x.ai",
    model: "grok-code-fast-1"
  }
)
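
# The optional keys can be combined in one config. This is an illustrative
# sketch, not a required setup: the OpenRouter URL, the model name, the
# header, and the Req timeout value are examples, not defaults.
Alloy.run("Hello",
  provider: {Alloy.Provider.OpenAICompat,
    api_key: System.get_env("OPENROUTER_API_KEY"),
    api_url: "https://openrouter.ai/api",
    model: "deepseek/deepseek-chat",
    max_tokens: 2048,
    system_prompt: "You are a terse assistant.",
    extra_headers: [{"HTTP-Referer", "https://example.com"}],
    req_options: [receive_timeout: 60_000]
  }
)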