LangChain.Chains.LLMChain (LangChain v0.2.0)

Summary

Functions

  • %LangChain.Chains.LLMChain{} - Define an LLMChain. This is the heart of the LangChain library.
  • add_message(chain, new_message) - Add a received Message struct to the chain.
  • add_messages(chain, messages) - Add a set of Message structs to the chain.
  • add_tools(chain, tools) - Add a tool to an LLMChain.
  • apply_delta(chain, new_delta) - Apply a received MessageDelta struct to the chain.
  • apply_deltas(chain, deltas) - Apply a list of deltas to the chain.
  • apply_prompt_templates(chain, templates, inputs) - Apply a set of PromptTemplates to the chain.
  • cancel_delta(chain, message_status) - Remove an incomplete MessageDelta and add a Message with the desired status to the chain.
  • delta_to_message_when_complete(chain) - Convert any hanging delta of the chain to a message and append it to the chain.
  • execute_tool_call(call, function, opts \\ []) - Execute the tool call with the tool. Returns the tool's message response.
  • execute_tool_calls(chain, context \\ nil) - Execute any ToolCalls on the last_message. Safe to call at any time.
  • new(attrs) - Start a new LLMChain configuration.
  • new!(attrs) - Start a new LLMChain configuration and return it or raise an error if invalid.
  • quick_prompt(chain, text) - Convenience function for setting the prompt text for the LLMChain using prepared text.
  • run(chain, opts \\ []) - Run the chain on the LLM using messages and any registered functions.
  • update_custom_context(chain, context_update, opts \\ []) - Update the LLMChain's custom_context map.

Types

@type t() :: %LangChain.Chains.LLMChain{
  _tool_map: term(),
  callback_fn: term(),
  custom_context: term(),
  delta: term(),
  last_message: term(),
  llm: term(),
  messages: term(),
  needs_response: term(),
  tools: term(),
  verbose: term(),
  verbose_deltas: term()
}

Functions

%LangChain.Chains.LLMChain{} (struct)

Define an LLMChain. This is the heart of the LangChain library.

The chain deals with tools, a tool map, delta tracking, last_message tracking, conversation messages, and verbose logging. Separating these responsibilities from the LLM makes it easier to support additional LLMs, because each LLM implementation can focus on communication and formats instead of all the extra logic.

add_message(chain, new_message)
@spec add_message(t(), LangChain.Message.t()) :: t()

Add a received Message struct to the chain. The LLMChain tracks the last_message received and the complete list of messages exchanged. Depending on the message role, the chain may be in a pending or incomplete state where a response from the LLM is anticipated.
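A minimal usage sketch, assuming the `Message.new_user!/1` constructor from `LangChain.Message` and an existing `chain`:

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.Message

# Append a user message; the chain records it as `last_message` and,
# because the role is :user, marks that an LLM response is expected.
chain = LLMChain.add_message(chain, Message.new_user!("What is the capital of France?"))
```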

add_messages(chain, messages)
@spec add_messages(t(), [LangChain.Message.t()]) :: t()

Add a set of Message structs to the chain. This enables quickly building a chain for submitting to an LLM.
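A sketch of building up a conversation in one call, assuming the `Message.new_system!/1` and `Message.new_user!/1` constructors:

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.Message

# Add several messages at once before submitting to the LLM.
chain =
  LLMChain.add_messages(chain, [
    Message.new_system!("You are a helpful assistant."),
    Message.new_user!("Summarize this article for me.")
  ])
```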

add_tools(chain, tools)

@spec add_tools(t(), LangChain.Function.t() | [LangChain.Function.t()]) ::
  t() | no_return()

Add a tool to an LLMChain. As the spec shows, a single Function or a list of Functions may be given.
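As a sketch, registering a hypothetical tool built with `LangChain.Function` (the tool name, arguments, and callback body here are illustrative assumptions, not part of this module's API):

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.Function

# A hypothetical weather tool. The callback receives the parsed
# arguments and the chain's custom_context.
weather_tool =
  Function.new!(%{
    name: "get_weather",
    description: "Return the current weather for a city.",
    function: fn %{"city" => city} = _args, _context ->
      "It is sunny in #{city}."
    end
  })

chain = LLMChain.add_tools(chain, weather_tool)
```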

apply_delta(chain, new_delta)
@spec apply_delta(t(), LangChain.MessageDelta.t()) :: t()

Apply a received MessageDelta struct to the chain. The LLMChain tracks the current merged MessageDelta state. When the final delta is received that completes the message, the LLMChain is updated to clear the delta and the last_message and list of messages are updated.
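A sketch of merging streamed deltas as they arrive (the `deltas` list stands in for MessageDelta structs received from a streaming callback):

```elixir
alias LangChain.Chains.LLMChain

# Merge each streamed delta into the chain. When the final delta
# completes the message, the chain clears `delta` and updates
# `last_message` and the message list.
chain =
  Enum.reduce(deltas, chain, fn delta, acc ->
    LLMChain.apply_delta(acc, delta)
  end)
```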

apply_deltas(chain, deltas)
@spec apply_deltas(t(), list()) :: t()

Apply a list of deltas to the chain.

apply_prompt_templates(chain, templates, inputs)
@spec apply_prompt_templates(
  t(),
  [LangChain.Message.t() | LangChain.PromptTemplate.t()],
  %{
    required(atom()) => any()
  }
) :: t() | no_return()

Apply a set of PromptTemplates to the chain. The list of templates can also include Messages with no templates. Provide the inputs to apply to the templates for rendering as a message. The prepared messages are applied to the chain.
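A sketch mixing a plain Message with a PromptTemplate, assuming `PromptTemplate.from_template!/1` and EEx-style `<%= @name %>` substitution:

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.Message
alias LangChain.PromptTemplate

chain =
  LLMChain.apply_prompt_templates(
    chain,
    [
      Message.new_system!("You are a helpful travel guide."),
      PromptTemplate.from_template!("Suggest three sights to see in <%= @city %>.")
    ],
    %{city: "Lisbon"}
  )
```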

cancel_delta(chain, message_status)

Remove an incomplete MessageDelta from the chain's delta and add a Message with the desired status to the chain.

common_validation(changeset)

delta_to_message_when_complete(chain)
@spec delta_to_message_when_complete(t()) :: t()

Convert any hanging delta of the chain to a message and append to the chain.

If the delta is nil, the chain is returned unmodified.

execute_tool_call(call, function, opts \\ [])

Execute the tool call with the tool. Returns the tool's message response.

execute_tool_calls(chain, context \\ nil)
@spec execute_tool_calls(t(), context :: nil | %{required(atom()) => any()}) :: t()

If the last_message from the Assistant includes one or more ToolCalls, then the linked tool is executed. If there is no last_message or the last_message is not a tool_call, the LLMChain is returned with no action performed. This makes it safe to call any time.

The context is additional data that will be passed to the executed tool. The value given here will override any custom_context set on the LLMChain. If not set, the global custom_context is used.
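A sketch of the post-run flow (the context map keys here are illustrative assumptions):

```elixir
alias LangChain.Chains.LLMChain

# If the assistant's last_message contains ToolCalls, run the linked
# tools, passing this context instead of the chain's custom_context.
# If there is nothing to execute, the chain is returned unchanged.
chain = LLMChain.execute_tool_calls(chain, %{user_id: 42})
```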

new(attrs)

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Start a new LLMChain configuration.

{:ok, chain} = LLMChain.new(%{
  llm: %ChatOpenAI{model: "gpt-3.5-turbo", stream: true},
  messages: [%Message.new_system!("You are a helpful assistant.")]
})
new!(attrs)

@spec new!(attrs :: map()) :: t() | no_return()

Start a new LLMChain configuration and return it or raise an error if invalid.

chain = LLMChain.new!(%{
  llm: %ChatOpenAI{model: "gpt-3.5-turbo", stream: true},
  messages: [%Message.new_system!("You are a helpful assistant.")]
})
quick_prompt(chain, text)
@spec quick_prompt(t(), String.t()) :: t()

Convenience function for setting the prompt text for the LLMChain using prepared text.
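A minimal sketch:

```elixir
alias LangChain.Chains.LLMChain

# Set the prompt for the chain from already-prepared text.
chain = LLMChain.quick_prompt(chain, "What is an Elixir GenServer?")
```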

run(chain, opts \\ [])

@spec run(t(), Keyword.t()) ::
  {:ok, t(), LangChain.Message.t() | [LangChain.Message.t()]}
  | {:error, String.t()}

Run the chain on the LLM using the messages and any registered functions. This formats the request for a chat-focused LLM where the messages are passed to the API.

When successful, it returns {:ok, updated_chain, message_or_messages}.

Options

  • :while_needs_response - repeatedly evaluates functions and submits to the LLM so long as we still expect to get a response.
  • :callback_fn - the callback function to execute as messages are received.

The callback_fn is a function that receives one argument. It is the LangChain structure for the received message or event. It may be a MessageDelta or a Message. Use pattern matching to respond as desired.
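A sketch of a callback_fn that pattern matches on the two event types (this assumes an llm configured with stream: true, so MessageDelta events arrive before the completed Message):

```elixir
alias LangChain.Chains.LLMChain

callback = fn
  %LangChain.MessageDelta{} = delta ->
    # A streamed fragment of the in-progress message.
    IO.write(delta.content || "")

  %LangChain.Message{} = message ->
    # The fully assembled message.
    IO.puts("\nCompleted message from #{message.role}")
end

{:ok, _updated_chain, _response} =
  LLMChain.run(chain, while_needs_response: true, callback_fn: callback)
```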

update_custom_context(chain, context_update, opts \\ [])
@spec update_custom_context(
  t(),
  context_update :: %{required(atom()) => any()},
  opts :: Keyword.t()
) ::
  t() | no_return()

Update the LLMChain's custom_context map. Passing in a context_update map will by default merge the map into the existing custom_context.

Use the :as option to:

  • :merge - Merge update changes in. Default.
  • :replace - Replace the context with the context_update.
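A sketch of both modes (the context keys are illustrative assumptions):

```elixir
alias LangChain.Chains.LLMChain

# Default behavior: merge the update into the existing custom_context.
chain = LLMChain.update_custom_context(chain, %{user_id: 42})

# Replace the custom_context entirely with the given map.
chain = LLMChain.update_custom_context(chain, %{session: "abc"}, as: :replace)
```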