LangChain.ChatModels.ChatDeepSeek (LangChain v0.4.1)

Module for interacting with DeepSeek models.

DeepSeek provides an API that is compatible with OpenAI's API format, making it easy to integrate with existing OpenAI-based code.

Model Options

DeepSeek supports the following models:

  • deepseek-chat - Non-thinking mode of DeepSeek-V3.2-Exp
  • deepseek-reasoner - Thinking mode of DeepSeek-V3.2-Exp

API Configuration

The DeepSeek API uses the following configuration:

  • Base URL: https://api.deepseek.com (or https://api.deepseek.com/v1 for OpenAI compatibility)
  • Authentication: Bearer token (API key)
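
Both values can be supplied when building the model struct. A small sketch, assuming the API key is kept in a DEEPSEEK_API_KEY environment variable; the endpoint value shown is illustrative, since the module ships with a working default:

model = ChatDeepSeek.new!(%{
  model: "deepseek-chat",
  api_key: System.get_env("DEEPSEEK_API_KEY"),
  # Only needed when overriding the default endpoint (assumed value)
  endpoint: "https://api.deepseek.com/v1/chat/completions"
})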

Example Usage

# Basic usage
model = ChatDeepSeek.new!(%{
  model: "deepseek-chat",
  api_key: "your-api-key-here"
})

# Using with LLMChain
chain =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_message(Message.new_user!("Hello!"))

{:ok, updated_chain} = LLMChain.run(chain)

Tool Support

DeepSeek supports function calling through the OpenAI-compatible API format. You can use tools in the same way as with OpenAI:

model = ChatDeepSeek.new!(%{
  model: "deepseek-chat",
  api_key: "your-api-key-here"
})

function = Function.new!(%{
  name: "get_weather",
  description: "Get current weather for a location",
  parameters_schema: %{
    "type" => "object",
    "properties" => %{
      "location" => %{
        "type" => "string",
        "description" => "The city and state, e.g. San Francisco, CA"
      }
    },
    "required" => ["location"]
  },
  # The callback executed when the model requests the tool
  function: fn %{"location" => location}, _context ->
    {:ok, "The weather in #{location} is 72°F and sunny."}
  end
})
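
To make the tool available to the model, add it to an LLMChain along with the model. A minimal sketch, assuming the model and function defined above; the mode option asks the chain to keep running until the tool response has been handled:

# Give the model access to the tool and run until it produces an answer
{:ok, updated_chain} =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_tools([function])
  |> LLMChain.add_message(Message.new_user!("What's the weather in San Francisco, CA?"))
  |> LLMChain.run(mode: :while_needs_response)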

Callbacks

See the set of available callbacks: LangChain.Chains.ChainCallbacks
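
A handler map can be registered on a chain to receive these events. A sketch, assuming the on_llm_new_delta callback shape from LangChain.Chains.ChainCallbacks:

handler = %{
  on_llm_new_delta: fn _chain, deltas ->
    # Fires as streamed LangChain.MessageDelta structs arrive
    IO.inspect(deltas, label: "DELTA")
  end
}

chain =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_callback(handler)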

Token Usage

DeepSeek returns token usage information as part of the response body. The LangChain.TokenUsage is added to the metadata of the processed LangChain.Message and LangChain.MessageDelta structs under the :usage key.

The TokenUsage data is accumulated across MessageDelta structs, and the final usage information appears on the completed LangChain.Message.
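
A sketch of reading the accumulated usage off a completed message, assuming a chain that has already been run:

# TokenUsage lives in the message metadata under the :usage key
usage = updated_chain.last_message.metadata[:usage]
IO.inspect(usage.input, label: "input tokens")
IO.inspect(usage.output, label: "output tokens")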

Summary

Functions

call(deepseek, prompt, tools \\ [])
Calls the DeepSeek API passing the ChatDeepSeek struct with configuration, plus either a simple message or the list of messages to act as the prompt.

decode_stream(arg, done \\ [], depth \\ 0)
Decode a streamed response from a DeepSeek server. This is the same as the OpenAI implementation since DeepSeek uses an OpenAI-compatible API.

for_api(deepseek, messages, tools)
Convert a LangChain Message-based structure to the expected map of data for the DeepSeek API and return the params formatted for an API request.

new(attrs \\ %{})
Setup a ChatDeepSeek client configuration.

new!(attrs \\ %{})
Setup a ChatDeepSeek client configuration and return it or raise an error if invalid.

restore_from_map(map)
Restores the model from the config.

retry_on_fallback?(error)
Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be more fundamental with the request rather than a service issue and it should not be retried or fall back to another service.

serialize_config(deepseek)
Generate a config map that can later restore the model's configuration.

Types

@type t() :: %LangChain.ChatModels.ChatDeepSeek{
  api_key: term(),
  callbacks: term(),
  endpoint: term(),
  frequency_penalty: term(),
  json_response: term(),
  json_schema: term(),
  logprobs: term(),
  max_tokens: term(),
  model: term(),
  n: term(),
  parallel_tool_calls: term(),
  receive_timeout: term(),
  req_config: term(),
  seed: term(),
  stream: term(),
  stream_options: term(),
  temperature: term(),
  tool_choice: term(),
  top_logprobs: term(),
  user: term(),
  verbose_api: term()
}

Functions

call(deepseek, prompt, tools \\ [])

Calls the DeepSeek API passing the ChatDeepSeek struct with configuration, plus either a simple message or the list of messages to act as the prompt.

Optionally pass in a list of tools available to the LLM for requesting execution in response.

Optionally pass in a callback function that can be executed as data is received from the API.

NOTE: This function can be used directly, but the primary interface should be through LangChain.Chains.LLMChain. The ChatDeepSeek module is more focused on translating the LangChain data structures to and from the DeepSeek API.

Another benefit of using LangChain.Chains.LLMChain is that it combines the storage of messages, adding tools, adding custom context that should be passed to tools, and automatically applying LangChain.MessageDelta structs as they are received, then converting those to the full LangChain.Message once fully complete.
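
For direct experimentation outside of a chain, a call might look like the following sketch; the exact response shape depends on the model's streaming settings:

# Calling the model directly with a simple string prompt
model = ChatDeepSeek.new!(%{model: "deepseek-chat"})
{:ok, response} = ChatDeepSeek.call(model, "Write a haiku about Elixir.")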

decode_stream(arg, done \\ [], depth \\ 0)

@spec decode_stream({String.t(), String.t()}, list(), non_neg_integer()) ::
  {%{required(String.t()) => any()}} | {:error, LangChain.LangChainError.t()}

Decode a streamed response from a DeepSeek server. This is the same as the OpenAI implementation since DeepSeek uses an OpenAI-compatible API.
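
A sketch of decoding a raw chunk, assuming the function mirrors the OpenAI implementation's contract of returning the parsed event maps along with any incomplete trailing data to buffer for the next call:

# raw_chunk is a binary received from the streaming HTTP response
{parsed_events, buffer} = ChatDeepSeek.decode_stream({raw_chunk, ""})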

for_api(deepseek, messages, tools)

@spec for_api(
  t() | LangChain.Message.t() | LangChain.Function.t(),
  message :: [map()],
  LangChain.ChatModels.ChatModel.tools()
) :: %{required(atom()) => any()}

Convert a LangChain Message-based structure to the expected map of data for the DeepSeek API and return the params formatted for an API request.

new(attrs \\ %{})

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Setup a ChatDeepSeek client configuration.

new!(attrs \\ %{})

@spec new!(attrs :: map()) :: t() | no_return()

Setup a ChatDeepSeek client configuration and return it or raise an error if invalid.

restore_from_map(map)

Restores the model from the config.

retry_on_fallback?(arg1)

@spec retry_on_fallback?(LangChain.LangChainError.t()) :: boolean()

Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be more fundamental with the request rather than a service issue and it should not be retried or fall back to another service.
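
A sketch of how this interacts with chain-level fallbacks, assuming a second configured model named fallback_model:

# If the DeepSeek request fails with a retryable error, try the fallback
{:ok, updated_chain} =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_message(Message.new_user!("Hello!"))
  |> LLMChain.run(with_fallbacks: [fallback_model])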

serialize_config(deepseek)

@spec serialize_config(t()) :: %{required(String.t()) => any()}

Generate a config map that can later restore the model's configuration.
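
A round-trip sketch, serializing a model's configuration to a plain map and later restoring it:

# Serialize the configuration (e.g. for persistence)
config = ChatDeepSeek.serialize_config(model)

# Later, rebuild an equivalent model struct from the saved map
{:ok, restored_model} = ChatDeepSeek.restore_from_map(config)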