LangChain.ChatModels.ChatMistralAI (LangChain v0.6.0)


Summary

Functions

Calls the Mistral API, passing the ChatMistralAI struct plus either a simple string prompt or a list of messages as the prompt. Optionally pass in a list of tools.

Converts a LangChain Message-based structure into the expected map of data for Mistral. We also include any tool_calls stored on the message.

Formats this struct plus the given messages and tools as a request payload.

Restores the model from the config map.

Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be a fundamental problem with the request rather than a service issue, and it should not be retried or fall back to another service.

Generate a config map that can later restore the model's configuration.

Types

t()

@type t() :: %LangChain.ChatModels.ChatMistralAI{
  api_key: term(),
  callbacks: term(),
  endpoint: term(),
  json_response: term(),
  json_schema: term(),
  max_tokens: term(),
  model: term(),
  parallel_tool_calls: term(),
  random_seed: term(),
  receive_timeout: term(),
  safe_prompt: term(),
  stream: term(),
  temperature: term(),
  tool_choice: term(),
  top_p: term(),
  verbose_api: term()
}

Functions

call(mistralai, prompt, tools)

Calls the Mistral API, passing the ChatMistralAI struct plus either a simple string prompt or a list of messages as the prompt. Optionally pass in a list of tools.
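A minimal usage sketch. The model name and option values here are assumptions for illustration, and calling this function hits the live Mistral API, so a valid Mistral API key must be configured:

```elixir
alias LangChain.ChatModels.ChatMistralAI
alias LangChain.Message

# "mistral-small-latest" is an example model name; adjust to your account.
{:ok, chat} = ChatMistralAI.new(%{model: "mistral-small-latest", temperature: 0.7})

# A plain string prompt...
{:ok, _response} = ChatMistralAI.call(chat, "Say hello in one word.", [])

# ...or a list of messages.
messages = [
  Message.new_system!("You are a terse assistant."),
  Message.new_user!("Say hello in one word.")
]

{:ok, _response} = ChatMistralAI.call(chat, messages, [])
```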

for_api(model, msg)

Converts a LangChain Message-based structure into the expected map of data for Mistral. We also include any tool_calls stored on the message.

for_api(mistral, messages, tools)

@spec for_api(t(), [LangChain.Message.t()], LangChain.ChatModels.ChatModel.tools()) ::
  %{
    required(atom()) => any()
  }

Formats this struct plus the given messages and tools as a request payload.
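A sketch of building the request payload without sending it. The exact keys in the returned map are assumptions based on the Mistral chat-completions API shape:

```elixir
alias LangChain.ChatModels.ChatMistralAI
alias LangChain.Message

{:ok, chat} = ChatMistralAI.new(%{model: "mistral-small-latest"})
messages = [Message.new_user!("Hello!")]

# Returns a map of request data, e.g. the model name and formatted messages.
payload = ChatMistralAI.for_api(chat, messages, [])
```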

new(attrs \\ %{})

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

new!(attrs \\ %{})

@spec new!(attrs :: map()) :: t() | no_return()
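new/1 validates the given attributes and returns an ok/error tuple, while new!/1 returns the struct directly and raises on invalid attributes. A sketch, with example attribute values:

```elixir
alias LangChain.ChatModels.ChatMistralAI

# new/1 returns {:ok, struct} or {:error, changeset}
{:ok, chat} = ChatMistralAI.new(%{model: "mistral-small-latest", temperature: 0.2})

# new!/1 raises on validation errors instead of returning a tuple
chat = ChatMistralAI.new!(%{model: "mistral-small-latest", stream: true})
```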

restore_from_map(data)

Restores the model from the config map.

retry_on_fallback?(arg1)

@spec retry_on_fallback?(LangChain.LangChainError.t()) :: boolean()

Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be a fundamental problem with the request rather than a service issue, and it should not be retried or fall back to another service.

serialize_config(model)

@spec serialize_config(t()) :: %{required(String.t()) => any()}

Generate a config map that can later restore the model's configuration.
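serialize_config/1 pairs with restore_from_map/1 for persisting a model's settings and rebuilding it later. A round-trip sketch, assuming restore_from_map/1 returns an ok tuple as the summary above suggests:

```elixir
alias LangChain.ChatModels.ChatMistralAI

chat = ChatMistralAI.new!(%{model: "mistral-small-latest", temperature: 0.5})

# A plain map with string keys, suitable for storing as JSON
config = ChatMistralAI.serialize_config(chat)

# Later, rebuild an equivalent model from the stored config
{:ok, _restored} = ChatMistralAI.restore_from_map(config)
```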