# `GenAI.Provider.DeepSeek`

Module for interacting with the DeepSeek API, an OpenAI-compatible chat-completions interface.

# `chat`

# `config_key`

Return the config key under which this inference provider's application config is stored in the `:genai` entry.

# `default_encoder`

# `effective_settings`

Obtain the map of effective settings (settings, model_settings, provider_settings, config_settings, etc.).

# `endpoint`

Prepare the endpoint and HTTP method for the inference call.
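For an OpenAI-compatible provider like DeepSeek, the chat inference endpoint is `https://api.deepseek.com/chat/completions`, called via POST. The tuple shape below is a hypothetical illustration of what `endpoint` might return, not the documented return type:

```elixir
# Hypothetical sketch: method/URL pair for a chat inference call.
# The actual return shape of endpoint is not documented in this listing.
{:post, "https://api.deepseek.com/chat/completions"}
```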

# `headers`

Prepare the request headers for the API call.

# `models`

Retrieves the list of models supported by the DeepSeek API for the given user.

# `request_body`

Prepare the request body to be passed to the inference call.
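Since DeepSeek follows the OpenAI chat-completions wire format, the prepared body typically resembles the map below. The field values are illustrative examples, not defaults taken from this module:

```elixir
# Illustrative OpenAI-compatible chat-completions payload (values are examples).
%{
  model: "deepseek-chat",
  messages: [
    %{role: "system", content: "You are a helpful assistant."},
    %{role: "user", content: "Hello"}
  ],
  temperature: 0.7,
  stream: false
}
```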

# `run`

Build and run an inference thread.

# `standardize_model`

# `stream`

Build and run an inference thread in streaming mode, yielding partial responses as they arrive.
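In streaming mode, OpenAI-compatible APIs such as DeepSeek's return Server-Sent Events, each carrying a `chat.completion.chunk` whose `delta` holds an incremental piece of the response; the stream ends with a `[DONE]` sentinel. An illustrative event sequence (fields abbreviated):

```
data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hel"}}]}

data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"lo"}}]}

data: [DONE]
```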

---

*Consult [api-reference.md](api-reference.md) for complete listing*
