LangChain.ChatModels.ChatOpenAI (LangChain v0.2.0)

Represents the OpenAI ChatModel.

Parses and validates inputs for making requests to the OpenAI Chat API.

Converts responses into more specialized LangChain data structures.

Summary

Functions

Calls the OpenAI API passing the ChatOpenAI struct with configuration, plus either a simple message or the list of messages to act as the prompt.

Decode a streamed response from an OpenAI-compatible server. Parses a string of received content into an Elixir map data structure using string keys.

Convert a LangChain structure to the expected map of data for the OpenAI API.

Return the params formatted for an API request.

Setup a ChatOpenAI client configuration.

Setup a ChatOpenAI client configuration and return it or raise an error if invalid.

Types

@type t() :: %LangChain.ChatModels.ChatOpenAI{
  api_key: term(),
  endpoint: term(),
  frequency_penalty: term(),
  json_response: term(),
  max_tokens: term(),
  model: term(),
  n: term(),
  receive_timeout: term(),
  seed: term(),
  stream: term(),
  temperature: term(),
  user: term()
}

Functions

call(openai, prompt, tools \\ [], callback_fn \\ nil)

Calls the OpenAI API passing the ChatOpenAI struct with configuration, plus either a simple message or the list of messages to act as the prompt.

Optionally pass in a list of tools available to the LLM for requesting execution in response.

Optionally pass in a callback function that can be executed as data is received from the API.

NOTE: This function can be used directly, but the primary interface should be through LangChain.Chains.LLMChain. The ChatOpenAI module is more focused on translating the LangChain data structures to and from the OpenAI API.

Another benefit of using LangChain.Chains.LLMChain is that it combines the storage of messages, adding tools, adding custom context that should be passed to tools, and automatically applying LangChain.MessageDelta structs as they are received, then converting those to the full LangChain.Message once fully complete.
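A minimal direct-call sketch (the model name and temperature are illustrative, and the API key is assumed to be set in the application config):

```elixir
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message

# Build a configured model struct. The option values here are illustrative.
{:ok, chat} = ChatOpenAI.new(%{model: "gpt-4", temperature: 0})

# Pass a simple string prompt, or a list of LangChain.Message structs.
{:ok, response} =
  ChatOpenAI.call(chat, [
    Message.new_system!("You are a helpful assistant."),
    Message.new_user!("Hello!")
  ])
```

In practice, prefer driving the model through LangChain.Chains.LLMChain as noted above; calling ChatOpenAI directly is mainly useful for experimentation.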

@spec decode_stream({String.t(), String.t()}) :: {%{required(String.t()) => any()}}

Decode a streamed response from an OpenAI-compatible server. Parses a string of received content into an Elixir map data structure using string keys.

If a partial response was received, meaning the JSON text is split across multiple data frames, then the incomplete portion is returned as-is in the buffer. The function will be successively called, receiving the incomplete buffer data from a previous call, and assembling it to parse.
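A sketch of the successive-call pattern described above, assuming the return shape is `{decoded_messages, incomplete_buffer}` (the SSE frame contents shown are illustrative):

```elixir
alias LangChain.ChatModels.ChatOpenAI

# A data frame arrives with its JSON cut off mid-object; start with an empty buffer.
partial = ~s|data: {"choices":[{"delta":{"content":"Hel|
{_decoded, buffer} = ChatOpenAI.decode_stream({partial, ""})

# The next frame carries the rest. Passing the saved buffer back in lets the
# function assemble and parse the complete JSON into maps with string keys.
rest = ~s|lo"}}]}\n\n|
{decoded, _buffer} = ChatOpenAI.decode_stream({rest, buffer})
```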

@spec for_api(
  LangChain.Message.t()
  | LangChain.Message.ContentPart.t()
  | LangChain.Function.t()
) ::
  %{required(String.t()) => any()} | [%{required(String.t()) => any()}]

Convert a LangChain structure to the expected map of data for the OpenAI API.
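For example, converting a single message (a sketch; the exact keys follow OpenAI's chat-completion format):

```elixir
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message

message = Message.new_user!("Hello!")

# Returns a map shaped for the OpenAI API, e.g. with "role" and "content" keys.
ChatOpenAI.for_api(message)
```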

for_api(openai, messages, tools)

@spec for_api(
  t() | LangChain.Message.t() | LangChain.Function.t(),
  message :: [map()],
  LangChain.ChatModels.ChatModel.tools()
) :: %{required(atom()) => any()}

Return the params formatted for an API request.
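A sketch of assembling a request body (per the spec above, the messages are passed as already-converted maps; the values shown are illustrative):

```elixir
alias LangChain.ChatModels.ChatOpenAI

chat = ChatOpenAI.new!(%{model: "gpt-4"})
messages = [%{"role" => "user", "content" => "Hello!"}]

# Returns the atom-keyed params map (model, messages, temperature, etc.)
# that becomes the JSON body of the API request.
ChatOpenAI.for_api(chat, messages, [])
```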

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Setup a ChatOpenAI client configuration.
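For example (the option values are illustrative):

```elixir
alias LangChain.ChatModels.ChatOpenAI

# Valid attributes return {:ok, struct}; invalid attributes return
# {:error, changeset} rather than raising.
{:ok, chat} = ChatOpenAI.new(%{model: "gpt-4", temperature: 0, stream: false})
```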

@spec new!(attrs :: map()) :: t() | no_return()

Setup a ChatOpenAI client configuration and return it or raise an error if invalid.
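The bang variant is convenient when configuration is static (model name illustrative):

```elixir
alias LangChain.ChatModels.ChatOpenAI

# Returns the struct directly; invalid attributes raise instead of
# returning an {:error, changeset} tuple.
chat = ChatOpenAI.new!(%{model: "gpt-4"})
```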