LangChain.ChatModels.ChatDeepSeek (LangChain v0.4.1)
Module for interacting with DeepSeek models.
DeepSeek provides an API that is compatible with OpenAI's API format, making it easy to integrate with existing OpenAI-based code.
Model Options
DeepSeek supports the following models:
- deepseek-chat - Non-thinking mode of DeepSeek-V3.2-Exp
- deepseek-reasoner - Thinking mode of DeepSeek-V3.2-Exp
API Configuration
The DeepSeek API uses the following configuration:
- Base URL: https://api.deepseek.com (or https://api.deepseek.com/v1 for OpenAI compatibility)
- Authentication: Bearer token (API key)
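The endpoint and key can also be set explicitly when building the model. A minimal sketch, assuming the struct's endpoint field (shown in the Types section below) expects the full chat completions URL; the exact default is defined by the module:

model = ChatDeepSeek.new!(%{
  model: "deepseek-chat",
  api_key: System.get_env("DEEPSEEK_API_KEY"),
  endpoint: "https://api.deepseek.com/v1/chat/completions"
})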
Example Usage
alias LangChain.ChatModels.ChatDeepSeek
alias LangChain.Chains.LLMChain
alias LangChain.Message

# Basic usage
model = ChatDeepSeek.new!(%{
  model: "deepseek-chat",
  api_key: "your-api-key-here"
})

# Using with LLMChain
chain =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_message(Message.new_user!("Hello!"))

{:ok, updated_chain} = LLMChain.run(chain)
# The assistant's reply is available as updated_chain.last_message

Tool Support
DeepSeek supports function calling through the OpenAI-compatible API format. You can use tools in the same way as with OpenAI:
alias LangChain.Function

model = ChatDeepSeek.new!(%{
  model: "deepseek-chat",
  api_key: "your-api-key-here"
})

function = Function.new!(%{
  name: "get_weather",
  description: "Get current weather for a location",
  parameters_schema: %{
    "type" => "object",
    "properties" => %{
      "location" => %{
        "type" => "string",
        "description" => "The city and state, e.g. San Francisco, CA"
      }
    },
    "required" => ["location"]
  },
  # The callback executed when the model calls the tool
  function: fn %{"location" => location}, _context ->
    {:ok, "It is currently 72F and sunny in #{location}."}
  end
})
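The function can then be attached to a chain and executed. A minimal sketch, assuming LLMChain's add_tools/2 and the :while_needs_response run mode, which keeps calling the model until any requested tool calls have been resolved:

{:ok, updated_chain} =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_tools([function])
  |> LLMChain.add_message(Message.new_user!("What's the weather in San Francisco, CA?"))
  |> LLMChain.run(mode: :while_needs_response)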
Callbacks

See the set of available callbacks: LangChain.Chains.ChainCallbacks
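For example, a handler map can be attached to a chain to observe streamed deltas. A sketch, assuming the on_llm_new_delta callback from ChainCallbacks and a model created with stream: true:

handler = %{
  on_llm_new_delta: fn _chain, deltas ->
    # Inspect each partial MessageDelta as it arrives
    IO.inspect(deltas, label: "DELTA")
  end
}

chain
|> LLMChain.add_callback(handler)
|> LLMChain.run()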
Token Usage
DeepSeek returns token usage information as part of the response body. The LangChain.TokenUsage data is added under the :usage key in the metadata of the LangChain.Message and LangChain.MessageDelta structs as they are processed.

The TokenUsage data is accumulated across MessageDelta structs, and the final usage information is available on the completed LangChain.Message.
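As a sketch, after a successful run the accumulated usage can be read from the last message's metadata (assuming the chain from the examples above):

{:ok, updated_chain} = LLMChain.run(chain)
usage = updated_chain.last_message.metadata[:usage]
# => %LangChain.TokenUsage{input: ..., output: ...}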
Summary
Functions
- call - Calls the DeepSeek API, passing the ChatDeepSeek struct with configuration, plus either a simple message or a list of messages to act as the prompt.
- decode_stream - Decode a streamed response from a DeepSeek server. This is the same as the OpenAI implementation, since DeepSeek uses an OpenAI-compatible API.
- for_api - Convert a LangChain Message-based structure to the expected map of data for the DeepSeek API.
- for_api - Return the params formatted for an API request.
- new - Set up a ChatDeepSeek client configuration.
- new! - Set up a ChatDeepSeek client configuration and return it, or raise an error if invalid.
- restore_from_map - Restores the model from the config.
- retry_on_fallback? - Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be more fundamental to the request rather than a service issue, and it should not be retried or fall back to another service.
- serialize_config - Generate a config map that can later restore the model's configuration.
Types
@type t() :: %LangChain.ChatModels.ChatDeepSeek{
        api_key: term(),
        callbacks: term(),
        endpoint: term(),
        frequency_penalty: term(),
        json_response: term(),
        json_schema: term(),
        logprobs: term(),
        max_tokens: term(),
        model: term(),
        n: term(),
        parallel_tool_calls: term(),
        receive_timeout: term(),
        req_config: term(),
        seed: term(),
        stream: term(),
        stream_options: term(),
        temperature: term(),
        tool_choice: term(),
        top_logprobs: term(),
        user: term(),
        verbose_api: term()
      }
Functions
call

Calls the DeepSeek API, passing the ChatDeepSeek struct with configuration, plus either a simple message or a list of messages to act as the prompt.
Optionally pass in a list of tools available to the LLM for requesting execution in response.
Optionally pass in a callback function that can be executed as data is received from the API.
NOTE: This function can be used directly, but the primary interface
should be through LangChain.Chains.LLMChain. The ChatDeepSeek module is more
focused on translating the LangChain data structures to and from the DeepSeek
API.
Another benefit of using LangChain.Chains.LLMChain is that it combines the
storage of messages, adding tools, adding custom context that should be
passed to tools, and automatically applying LangChain.MessageDelta
structs as they are received, then converting those to the full
LangChain.Message once fully complete.
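Calling the model directly looks roughly like this. A sketch; the exact shape of the success value (a message, a list of messages, or deltas when streaming) depends on the configuration:

{:ok, response} = ChatDeepSeek.call(model, [Message.new_user!("Hello!")], [])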
decode_stream

@spec decode_stream({String.t(), String.t()}, list(), non_neg_integer()) ::
        {%{required(String.t()) => any()}} | {:error, LangChain.LangChainError.t()}

Decode a streamed response from a DeepSeek server. This is the same as the OpenAI implementation, since DeepSeek uses an OpenAI-compatible API.
for_api

@spec for_api(
        struct(),
        LangChain.Message.t()
        | LangChain.PromptTemplate.t()
        | LangChain.Message.ToolCall.t()
        | LangChain.Message.ToolResult.t()
        | LangChain.Message.ContentPart.t()
        | LangChain.Function.t()
      ) :: %{required(String.t()) => any()} | [%{required(String.t()) => any()}]

Convert a LangChain Message-based structure to the expected map of data for the DeepSeek API.
for_api

@spec for_api(
        t() | LangChain.Message.t() | LangChain.Function.t(),
        message :: [map()],
        LangChain.ChatModels.ChatModel.tools()
      ) :: %{required(atom()) => any()}

Return the params formatted for an API request.
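As a sketch, the two for_api clauses above combine like this: individual messages are converted first, then wrapped into the request params (the second argument is the list of already-converted message maps, per the spec):

message_data = ChatDeepSeek.for_api(model, Message.new_user!("Hello!"))
params = ChatDeepSeek.for_api(model, [message_data], [])
# params is the request body map sent to the chat completions endpoint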
new

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Set up a ChatDeepSeek client configuration.
new!

Set up a ChatDeepSeek client configuration and return it, or raise an error if invalid.
restore_from_map

Restores the model from the config.
retry_on_fallback?

@spec retry_on_fallback?(LangChain.LangChainError.t()) :: boolean()

Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be more fundamental to the request rather than a service issue, and it should not be retried or fall back to another service.
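At the chain level this feeds into fallbacks. A sketch, assuming LLMChain's with_fallbacks run option and ChatOpenAI as the alternative model:

alias LangChain.ChatModels.ChatOpenAI

fallback = ChatOpenAI.new!(%{model: "gpt-4o-mini"})

{:ok, updated_chain} =
  LLMChain.new!(%{llm: model})
  |> LLMChain.add_message(Message.new_user!("Hello!"))
  |> LLMChain.run(with_fallbacks: [fallback])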
serialize_config

Generate a config map that can later restore the model's configuration.
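A sketch of the round trip between serialize_config and restore_from_map, assuming restore_from_map follows the usual {:ok, struct} convention for these callbacks:

config = ChatDeepSeek.serialize_config(model)
{:ok, restored} = ChatDeepSeek.restore_from_map(config)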