LangChain.ChatModels.ChatGrok (LangChain v0.4.0-rc.2)

Module for interacting with xAI's Grok models.

Parses and validates inputs for making requests to xAI's chat completions API.

Converts responses into more specialized LangChain data structures.

Tested with models

  • grok-4 - The latest and most advanced reasoning model with 130K+ context window
  • grok-3-mini - Faster, lightweight model optimized for speed and efficiency

Other models may work as well; see the tests (run with --include live_grok) for more.

OpenAI API Compatibility

Grok's API is fully compatible with OpenAI's format, making integration straightforward. The main differences are:

  • Base URL: https://api.x.ai/v1/chat/completions
  • Model names: grok-4, grok-3-mini, etc.
  • Enhanced context window and reasoning capabilities

Usage Example

# Basic usage with Grok-4
{:ok, chat} = ChatGrok.new(%{
  model: "grok-4",
  temperature: 0.7,
  max_tokens: 1000
})

# Fast and efficient with Grok-3-mini
{:ok, grok_mini} = ChatGrok.new(%{
  model: "grok-3-mini",
  temperature: 0.8,
  max_tokens: 5000,
  api_key: System.get_env("XAI_API_KEY"),
  callbacks: [handlers]
})

Callbacks

See the set of available callbacks: LangChain.Chains.ChainCallbacks

Rate Limit API Response Headers

xAI returns rate limit information in the response headers. Those can be accessed using the LLM callback on_llm_ratelimit_info like this:

handlers = %{
  on_llm_ratelimit_info: fn _model, headers ->
    IO.inspect(headers, label: "RATELIMIT INFO")
  end
}

{:ok, grok_mini} = ChatGrok.new(%{callbacks: [handlers]})

Token Usage

xAI returns token usage information as part of the response body. A LangChain.TokenUsage struct is added under the :usage key in the metadata of the processed LangChain.Message and LangChain.MessageDelta structs.
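A minimal sketch of reading the usage off a completed message. It assumes `message` is a completed %LangChain.Message{} from a Grok response, and that the TokenUsage struct exposes input/output token counts as it does elsewhere in LangChain:

```elixir
alias LangChain.TokenUsage

# `message` is a %LangChain.Message{} from a completed Grok response.
%TokenUsage{} = usage = message.metadata[:usage]

IO.inspect(usage.input, label: "input tokens")
IO.inspect(usage.output, label: "output tokens")
```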

Tool Choice

Grok supports forcing a tool to be used, following OpenAI's format:

ChatGrok.new(%{
  model: "grok-4",
  tool_choice: %{"type" => "function", "function" => %{"name" => "get_weather"}}
})

Summary

Functions

Calls the xAI API with the given messages and tools.

Return the params formatted for an API request.

Convert a LangChain structure to the expected xAI API format.

Set up a ChatGrok client configuration.

Set up a ChatGrok client configuration and return it, or raise an error if invalid.

Restore a ChatGrok struct from a serialized configuration map.

Serialize the configuration of a ChatGrok struct to a map for saving.

Types

@type t() :: %LangChain.ChatModels.ChatGrok{
  api_key: term(),
  callbacks: term(),
  endpoint: term(),
  frequency_penalty: term(),
  large_context: term(),
  max_tokens: term(),
  model: term(),
  multi_agent: term(),
  n: term(),
  presence_penalty: term(),
  reasoning_mode: term(),
  receive_timeout: term(),
  response_format: term(),
  seed: term(),
  stream: term(),
  stream_options: term(),
  temperature: term(),
  tool_choice: term(),
  top_p: term(),
  verbose_api: term()
}

Functions

call(grok, prompt, tools \\ [])

Calls the xAI API with the given messages and tools.
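A sketch of a direct call, assuming XAI_API_KEY is set in the environment. Note that this performs a live request against xAI's chat completions endpoint; the {:ok, result} success shape follows LangChain's chat model convention:

```elixir
alias LangChain.ChatModels.ChatGrok

chat = ChatGrok.new!(%{model: "grok-3-mini", max_tokens: 100})

# Performs a live request to xAI; `result` contains the assistant response.
{:ok, result} = ChatGrok.call(chat, "Reply with a single-word greeting.")
```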

for_api(grok, messages, tools \\ [])

Return the params formatted for an API request.
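For example, the request body can be inspected without making a call. This sketch assumes Message.new_user!/1 from LangChain's own message helpers:

```elixir
alias LangChain.ChatModels.ChatGrok
alias LangChain.Message

chat = ChatGrok.new!(%{model: "grok-3-mini", temperature: 0.5})

# Returns a plain map shaped for xAI's chat completions API; no request is made.
params = ChatGrok.for_api(chat, [Message.new_user!("Hello!")], [])
```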

for_api_message(message)

Convert a LangChain structure to the expected xAI API format.

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Set up a ChatGrok client configuration.

@spec new!(attrs :: map()) :: t() | no_return()

Set up a ChatGrok client configuration and return it, or raise an error if invalid.

Restore a ChatGrok struct from a serialized configuration map.

Serialize the configuration of a ChatGrok struct to a map for saving.
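A round-trip sketch of the two functions above. The names serialize_config/1 and restore_from_map/1 are assumed to follow the convention of LangChain's other chat models, and the keys of the serialized map are an implementation detail:

```elixir
alias LangChain.ChatModels.ChatGrok

chat = ChatGrok.new!(%{model: "grok-4", temperature: 0.7})

# Serialize the configuration to a plain map (e.g. for persisting)...
config = ChatGrok.serialize_config(chat)

# ...and later restore a working struct from that map.
{:ok, restored} = ChatGrok.restore_from_map(config)
```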