LangChain.Chains.LLMChain (LangChain v0.2.0)
Summary
Functions
Define an LLMChain. This is the heart of the LangChain library.
Add a received Message struct to the chain. The LLMChain tracks the last_message received and the complete list of messages exchanged. Depending on the message role, the chain may be in a pending or incomplete state where a response from the LLM is anticipated.
Add a set of Message structs to the chain. This enables quickly building a chain for submitting to an LLM.
Add a tool to an LLMChain.
Apply a received MessageDelta struct to the chain. The LLMChain tracks the current merged MessageDelta state. When the final delta that completes the message is received, the LLMChain is updated to clear the delta, and the last_message and list of messages are updated.
Apply a list of deltas to the chain.
Apply a set of PromptTemplates to the chain. The list of templates can also include Messages with no templates. Provide the inputs to apply to the templates for rendering as a message. The prepared messages are applied to the chain.
Remove an incomplete MessageDelta from delta and add a Message with the desired status to the chain.
Convert any hanging delta of the chain to a message and append to the chain.
Execute the tool call with the tool. Returns the tool's message response.
If the last_message from the Assistant includes one or more ToolCalls, then the linked tool is executed. If there is no last_message or the last_message is not a tool_call, the LLMChain is returned with no action performed. This makes it safe to call any time.
Start a new LLMChain configuration.
Start a new LLMChain configuration and return it or raise an error if invalid.
Convenience function for setting the prompt text for the LLMChain using prepared text.
Run the chain on the LLM using messages and any registered functions. This formats the request for a ChatLLMChain where messages are passed to the API.
Update the LLMChain's custom_context map. Passing in a context_update map will by default merge the map into the existing custom_context.
Functions
Define an LLMChain. This is the heart of the LangChain library.
The chain deals with tools, a tool map, delta tracking, last_message tracking, conversation messages, and verbose logging. Separating these responsibilities from the LLM makes it easier to support additional LLMs, because the focus stays on communication and message formats instead of all the extra logic.
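As a rough sketch of how these pieces fit together (the model name and prompt text are only illustrative), a typical flow creates the chain, adds messages, and runs it:

alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message

# Build the chain, seed it with a conversation, and submit it to the LLM.
{:ok, _updated_chain, response} =
  %{llm: ChatOpenAI.new!(%{model: "gpt-3.5-turbo"})}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_system!("You are a helpful assistant."))
  |> LLMChain.add_message(Message.new_user!("Name the capital of France."))
  |> LLMChain.run()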
@spec add_message(t(), LangChain.Message.t()) :: t()
Add a received Message struct to the chain. The LLMChain tracks the last_message received and the complete list of messages exchanged. Depending on the message role, the chain may be in a pending or incomplete state where a response from the LLM is anticipated.
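For example, appending a user message received from your application (the text is illustrative):

chain = LLMChain.add_message(chain, Message.new_user!("What is the weather like today?"))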
@spec add_messages(t(), [LangChain.Message.t()]) :: t()
Add a set of Message structs to the chain. This enables quickly building a chain for submitting to an LLM.
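For instance, restoring a previously saved conversation could be sketched as:

chain =
  LLMChain.add_messages(chain, [
    Message.new_system!("You are a helpful assistant."),
    Message.new_user!("Summarize our last conversation.")
  ])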
@spec add_tools(t(), LangChain.Function.t() | [LangChain.Function.t()]) :: t() | no_return()
Add a tool to an LLMChain.
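A minimal sketch of defining and registering a tool; the tool name, description, and two-argument callback shown here are illustrative assumptions rather than a prescribed shape:

tool =
  LangChain.Function.new!(%{
    name: "get_current_time",
    description: "Returns the current UTC time as an ISO8601 string.",
    function: fn _args, _context ->
      # return the tool's result to be sent back to the LLM
      DateTime.utc_now() |> DateTime.to_iso8601()
    end
  })

chain = LLMChain.add_tools(chain, tool)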
@spec apply_delta(t(), LangChain.MessageDelta.t()) :: t()
Apply a received MessageDelta struct to the chain. The LLMChain tracks the current merged MessageDelta state. When the final delta that completes the message is received, the LLMChain is updated to clear the delta, and the last_message and list of messages are updated.
Apply a list of deltas to the chain.
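Assuming deltas is a list of received MessageDelta structs, handling a stream could be sketched by folding each delta into the chain:

# Merge each streamed delta into the chain; once the final delta arrives,
# the chain's delta is cleared and last_message/messages are updated.
chain =
  Enum.reduce(deltas, chain, fn delta, acc ->
    LLMChain.apply_delta(acc, delta)
  end)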
@spec apply_prompt_templates(t(), [LangChain.Message.t() | LangChain.PromptTemplate.t()], %{required(atom()) => any()}) :: t() | no_return()
Apply a set of PromptTemplates to the chain. The list of templates can also include Messages with no templates. Provide the inputs to apply to the templates for rendering as a message. The prepared messages are applied to the chain.
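A small sketch, assuming the EEx-style <%= @name %> substitution used by PromptTemplate (the template text and the :city input are illustrative):

templates = [
  Message.new_system!("You are a helpful travel assistant."),
  LangChain.PromptTemplate.new!(%{role: :user, text: "Suggest three things to do in <%= @city %>."})
]

chain = LLMChain.apply_prompt_templates(chain, templates, %{city: "Lisbon"})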
Remove an incomplete MessageDelta from delta and add a Message with the desired status to the chain.
Convert any hanging delta of the chain to a message and append it to the chain. If the delta is nil, the chain is returned unmodified.
@spec execute_tool_call(LangChain.Message.ToolCall.t(), LangChain.Function.t(), Keyword.t()) :: LangChain.Message.ToolResult.t()
Execute the tool call with the tool. Returns the tool's message response.
If the last_message from the Assistant includes one or more ToolCalls, then the linked tool is executed. If there is no last_message or the last_message is not a tool_call, the LLMChain is returned with no action performed. This makes it safe to call any time.
The context is additional data that will be passed to the executed tool. The value given here will override any custom_context set on the LLMChain. If not set, the global custom_context is used.
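Putting that together, a sketch of executing a single tool call with per-call context; the tool_call and tool variables and the :context option key are assumptions based on the spec and description above:

# result is a LangChain.Message.ToolResult struct per the spec above
result = LLMChain.execute_tool_call(tool_call, tool, context: %{user_id: 42})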
@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
Start a new LLMChain configuration.
{:ok, chain} = LLMChain.new(%{
  llm: %ChatOpenAI{model: "gpt-3.5-turbo", stream: true},
  messages: [Message.new_system!("You are a helpful assistant.")]
})
Start a new LLMChain configuration and return it or raise an error if invalid.
chain = LLMChain.new!(%{
  llm: %ChatOpenAI{model: "gpt-3.5-turbo", stream: true},
  messages: [Message.new_system!("You are a helpful assistant.")]
})
Convenience function for setting the prompt text for the LLMChain using prepared text.
@spec run(t(), Keyword.t()) :: {:ok, t(), LangChain.Message.t() | [LangChain.Message.t()]} | {:error, String.t()}
Run the chain on the LLM using messages and any registered functions. This formats the request for a ChatLLMChain where messages are passed to the API.
When successful, it returns {:ok, updated_chain, message_or_messages}
Options
:while_needs_response - repeatedly evaluates functions and submits to the LLM so long as we still expect to get a response.

:callback_fn - the callback function to execute as messages are received.
The callback_fn is a function that receives one argument. It is the LangChain structure for the received message or event. It may be a MessageDelta or a Message. Use pattern matching to respond as desired.
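A sketch of running the chain with streaming callbacks; the pattern-matched clauses assume the MessageDelta and Message structs described above, and the console output is only illustrative:

callback = fn
  %LangChain.MessageDelta{} = delta ->
    # stream partial content to the console as it arrives
    IO.write(delta.content || "")

  %LangChain.Message{} = message ->
    IO.puts("\ncompleted message with role #{inspect(message.role)}")
end

{:ok, updated_chain, _response} =
  LLMChain.run(chain, while_needs_response: true, callback_fn: callback)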
@spec update_custom_context(t(), context_update :: %{required(atom()) => any()}, opts :: Keyword.t()) :: t() | no_return()
Update the LLMChain's custom_context map. Passing in a context_update map will by default merge the map into the existing custom_context.
Use the :as option to:

:merge - Merge update changes in. Default.

:replace - Replace the context with the context_update.
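For example (the context keys are illustrative):

# Default behavior merges into the existing custom_context.
chain = LLMChain.update_custom_context(chain, %{user_id: 42})

# Replace the entire custom_context instead of merging.
chain = LLMChain.update_custom_context(chain, %{user_id: 42}, as: :replace)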