# Nexlm.Providers.Groq (Nexlm v0.1.5)

Provider implementation for Groq's Chat Completion API. The Groq API is OpenAI-compatible, with some limitations (listed below).
## Model Names
Models should be prefixed with "groq/", for example:
- "groq/mixtral-8x7b-32768"
- "groq/llama2-70b-4096"
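As a sketch of how the prefix is used (the `Nexlm.complete/3` entry point and the message shape here are assumptions based on this doc, not a verified API reference):

```elixir
# Hypothetical usage sketch; adjust to the actual Nexlm entry point you use.
messages = [
  %{"role" => "user", "content" => "Summarize OTP supervision in one sentence."}
]

# The "groq/" prefix routes the request to the Groq provider.
{:ok, reply} = Nexlm.complete("groq/mixtral-8x7b-32768", messages, max_tokens: 128)
```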
## Message Formats
Supports the following message types:
- Text messages: Simple string content
- System messages: Special instructions for model behavior
Note: Image messages are not supported.
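The two supported message types might look like the following (the exact `"role"`/`"content"` keys are an assumption based on the OpenAI-compatible format, not a verified schema):

```elixir
messages = [
  # System message: special instructions for model behavior
  %{"role" => "system", "content" => "You are a terse assistant."},
  # Text message: simple string content
  %{"role" => "user", "content" => "What is Groq?"}
  # Image messages are not supported by this provider, so content
  # must be a plain string rather than a list of content parts.
]
```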
## Configuration
Required:
- API key in runtime config: `config :nexlm, Nexlm.Providers.Groq, api_key: "key"`
- Model name in request
Optional:
- temperature: Float between 0 and 2 (default: 0.0; a value of exactly 0 is converted to 1e-8)
- max_tokens: Integer for response length limit
- top_p: Float between 0 and 1 for nucleus sampling
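A minimal configuration sketch (the `GROQ_API_KEY` environment variable name is illustrative, and the option names follow the list above):

```elixir
# config/runtime.exs
import Config

config :nexlm, Nexlm.Providers.Groq,
  api_key: System.fetch_env!("GROQ_API_KEY")

# Optional per-request parameters, passed alongside the model name:
#   temperature: 0.0   # converted to 1.0e-8 before the request is sent
#   max_tokens: 256    # response length limit
#   top_p: 0.9         # nucleus sampling
```

Reading the key from the environment at runtime (rather than hardcoding it in compile-time config) keeps the credential out of the compiled release.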
## Limitations
The following OpenAI features are not supported:
- logprobs
- logit_bias
- top_logprobs
- messages[].name
- n > 1 (only a single completion per request is supported)