LangChain.Chains.LLMChain (LangChain v0.1.0)
Summary
Functions
Define an LLMChain. This is the heart of the LangChain library.
Add more functions to an LLMChain.
Add a received Message struct to the chain. The LLMChain tracks the last_message received and the complete list of messages exchanged. Depending on the message role, the chain may be in a pending or incomplete state where a response from the LLM is anticipated.
Add a set of Message structs to the chain. This enables quickly building a chain for submitting to an LLM.
Apply a received MessageDelta struct to the chain. The LLMChain tracks the current merged MessageDelta state. When the final delta is received that completes the message, the LLMChain is updated to clear the delta, and the last_message and list of messages are updated.
Apply a list of deltas to the chain.
Apply a set of PromptTemplates to the chain. The list of templates can also include Messages with no templates. Provide the inputs to apply to the templates for rendering as a message. The prepared messages are applied to the chain.
Remove an incomplete MessageDelta from delta and add a Message with the desired status to the chain.
If the last_message is a %Message{role: :function_call}, then the linked function is executed. If there is no last_message, or the last_message is not a :function_call, the LLMChain is returned with no action performed. This makes it safe to call at any time.
Start a new LLMChain configuration.
Start a new LLMChain configuration and return it or raise an error if invalid.
Convenience function for setting the prompt text for the LLMChain using prepared text.
Run the chain on the LLM using messages and any registered functions. This formats the request for a ChatLLMChain where messages are passed to the API.
Types
Functions
Define an LLMChain. This is the heart of the LangChain library.
The chain deals with functions, a function map, delta tracking, last_message tracking, conversation messages, and verbose logging. This helps by separating these responsibilities from the LLM making it easier to support additional LLMs because the focus is on communication and formats instead of all the extra logic.
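As a sketch of the typical flow (the ChatOpenAI configuration and model name below are assumptions for illustration; the chain, message, and run functions are the ones documented on this page):

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message

# Build a chain around a chat model, add a user message, and run it.
# The chain, not the LLM module, tracks messages and function state.
{:ok, _updated_chain, _response} =
  %{llm: ChatOpenAI.new!(%{model: "gpt-3.5-turbo"})}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("What is an Elixir GenServer?"))
  |> LLMChain.run()
```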
@spec add_functions(t(), LangChain.Function.t() | [LangChain.Function.t()]) :: t() | no_return()
Add more functions to an LLMChain.
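A hedged sketch of registering a function, assuming a chain was already built with LLMChain.new!/1 (the Function fields shown follow LangChain.Function conventions but are illustrative):

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.Function

# Define a custom function the LLM may request to call.
weather_fn =
  Function.new!(%{
    name: "get_weather",
    description: "Returns the current weather for a city.",
    function: fn args, _context ->
      # args is the parsed argument map from the LLM's function_call.
      "It is sunny in #{args["city"]}."
    end
  })

# add_functions/2 accepts a single Function or a list of Functions.
chain = LLMChain.add_functions(chain, weather_fn)
```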
@spec add_message(t(), LangChain.Message.t()) :: t()
Add a received Message struct to the chain. The LLMChain tracks the last_message received and the complete list of messages exchanged. Depending on the message role, the chain may be in a pending or incomplete state where a response from the LLM is anticipated.
@spec add_messages(t(), [LangChain.Message.t()]) :: t()
Add a set of Message structs to the chain. This enables quickly building a chain for submitting to an LLM.
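For example, a conversation can be seeded in one call, assuming chain is an existing LLMChain (a sketch using the Message constructors from this library):

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.Message

# Build the conversation up front from a list of Message structs.
chain =
  LLMChain.add_messages(chain, [
    Message.new_system!("You are a helpful assistant."),
    Message.new_user!("Summarize the plot of Hamlet in one sentence.")
  ])
```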
@spec apply_delta(t(), LangChain.MessageDelta.t()) :: t()
Apply a received MessageDelta struct to the chain. The LLMChain tracks the current merged MessageDelta state. When the final delta is received that completes the message, the LLMChain is updated to clear the delta, and the last_message and list of messages are updated.
Apply a list of deltas to the chain.
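A minimal sketch of streamed-response handling, assuming deltas is a list of MessageDelta structs collected from a streaming callback and chain is an existing LLMChain:

```elixir
alias LangChain.Chains.LLMChain

# Fold each streamed MessageDelta into the chain. When the delta that
# completes the message arrives, the chain clears its merged delta state
# and appends the finished Message to the message list.
chain = LLMChain.apply_deltas(chain, deltas)
```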
@spec apply_prompt_templates( t(), [LangChain.Message.t() | LangChain.PromptTemplate.t()], %{ required(atom()) => any() } ) :: t() | no_return()
Apply a set of PromptTemplates to the chain. The list of templates can also include Messages with no templates. Provide the inputs to apply to the templates for rendering as a message. The prepared messages are applied to the chain.
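A sketch mixing a plain Message with a PromptTemplate, assuming chain is an existing LLMChain (the EEx-style template syntax reflects LangChain.PromptTemplate; the field names are illustrative):

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.{Message, PromptTemplate}

# Plain messages pass through unchanged; templates are rendered with the
# inputs map, then the prepared messages are applied to the chain.
chain =
  LLMChain.apply_prompt_templates(
    chain,
    [
      Message.new_system!("You are a concise technical writer."),
      PromptTemplate.new!(%{role: :user, text: "Explain <%= @topic %> in two sentences."})
    ],
    %{topic: "pattern matching"}
  )
```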
Remove an incomplete MessageDelta from delta and add a Message with the desired status to the chain.
If the last_message is a %Message{role: :function_call}, then the linked function is executed. If there is no last_message, or the last_message is not a :function_call, the LLMChain is returned with no action performed. This makes it safe to call at any time.
The context is additional data that will be passed to the executed function. The value given here will override any custom_context set on the LLMChain. If not set, the global custom_context is used.
https://platform.openai.com/docs/guides/gpt/function-calling
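A hedged sketch of executing a pending function call, assuming chain is an existing LLMChain (the shape of the context argument here is an assumption based on the description above):

```elixir
alias LangChain.Chains.LLMChain

# If the last_message is a :function_call, run the linked function,
# passing request-specific data that overrides the chain's custom_context.
# Safe to call even when no function call is pending; the chain is
# returned unchanged in that case.
chain = LLMChain.execute_function(chain, %{user_id: 42})
```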
@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
Start a new LLMChain configuration.
Start a new LLMChain configuration and return it or raise an error if invalid.
Convenience function for setting the prompt text for the LLMChain using prepared text.
@spec run(t(), Keyword.t()) :: {:ok, t(), LangChain.Message.t() | [LangChain.Message.t()]} | {:error, String.t()}
Run the chain on the LLM using messages and any registered functions. This formats the request for a ChatLLMChain where messages are passed to the API.
When successful, it returns {:ok, updated_chain, message_or_messages}.
Options
:while_needs_response - repeatedly evaluates functions and submits to the LLM so long as we still expect to get a response.
:callback_fn - the callback function to execute as messages are received.
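A sketch of running with both options, assuming chain is an existing LLMChain (the shape of the data passed to the callback, Message or MessageDelta, is an assumption based on the streaming behavior described above):

```elixir
alias LangChain.Chains.LLMChain

# Keep resubmitting to the LLM while function results still need a reply,
# and inspect each message or delta as it is received.
{:ok, _updated_chain, _response} =
  LLMChain.run(chain,
    while_needs_response: true,
    callback_fn: fn data -> IO.inspect(data, label: "received") end
  )
```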