Magus.AgentChain (Magus v0.1.0)

Summary

Functions

Define a chain for agents.

Adds a JSON schema to the chain as the response format from the LLM.

Start a new AgentChain configuration.

Run the AgentChain.

Types

@type t() :: %Magus.AgentChain{
  json_response_schema: term(),
  stream_handler: term(),
  wrapped_chain: term()
}

Functions

%Magus.AgentChain{}

(struct)

Define a chain for agents.

This is an extension of LangChain.Chains.LLMChain that provides a few helpers for the agent use case.

A stream_handler can be passed in when the AgentChain is created and then used automatically when the chain is run. This is useful for the AgentExecutor to listen to tokens as they're returned by the LLM.

The AgentChain can also be configured with a JSON schema that is used when requesting content from the LLM and then used to validate that the LLM response conforms to the schema.
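The flow described above can be sketched as follows. This is an illustrative example, not output from the library's docs: the shape of the value passed to the stream handler depends on the underlying LangChain LLM, and the system/user prompt text is made up.

```elixir
# Sketch: build a chain with a stream handler, add messages, then run it.
# The handler's argument shape is an assumption (a LangChain delta/chunk).
chain =
  Magus.AgentChain.new!(stream_handler: fn chunk -> IO.write(chunk.content) end)
  |> Magus.AgentChain.add_message(LangChain.Message.new_system!("You are a helpful agent."))
  |> Magus.AgentChain.add_message(LangChain.Message.new_user!("Summarize the latest run."))

{:ok, content, _last_message} = Magus.AgentChain.run(chain)
```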

add_message(chain, message)

@spec add_message(t(), LangChain.Message.t()) :: t()
add_messages(chain, messages)

@spec add_messages(t(), [LangChain.Message.t()]) :: t()
add_tool(chain, tool)

@spec add_tool(t(), LangChain.Function.t()) :: t()
ask_for_json_response(chain, schema)

@spec ask_for_json_response(t(), map()) :: t()

Adds a JSON schema to the chain as the response format from the LLM.

A message is added to the chain that tells the LLM to return the response content in JSON format. When using OpenAI, the request is made in JSON mode.
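For example, requesting a response that conforms to a schema might look like the following. The schema map is illustrative (standard JSON Schema keys); the field names are made up:

```elixir
# Sketch: ask the LLM for JSON matching a schema. The response will be
# validated against this schema when the chain is run.
schema = %{
  "type" => "object",
  "properties" => %{
    "title" => %{"type" => "string"},
    "score" => %{"type" => "number"}
  },
  "required" => ["title"]
}

chain =
  Magus.AgentChain.new!()
  |> Magus.AgentChain.add_message(LangChain.Message.new_user!("Rate this article."))
  |> Magus.AgentChain.ask_for_json_response(schema)
```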

get_default_gemini_llm()

new!(opts \\ [])

@spec new!(opts :: keyword()) :: t()

Start a new AgentChain configuration.

Options

  • :verbose - Runs the LLM in verbose mode if set to true. Defaults to false.
  • :stream_handler - Handler that is called as the LLM returns messages.
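The options above can be combined when starting a configuration. A minimal sketch (the handler's argument shape is an assumption, as it depends on the underlying LangChain LLM):

```elixir
# Sketch: create a verbose chain with a stream handler that inspects
# each piece of streamed data as the LLM returns it.
chain =
  Magus.AgentChain.new!(
    verbose: true,
    stream_handler: fn data -> IO.inspect(data, label: "token") end
  )
```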
run(chain)

@spec run(t()) :: {:error, binary() | list()} | {:ok, any(), LangChain.Message.t()}

Run the AgentChain.

If a stream_handler was specified when the AgentChain was created, it will be called as the LLM returns tokens.

If a JSON response was requested with ask_for_json_response, the response will be validated against the schema and decoded to a struct.
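Handling both result shapes from the spec above might look like this sketch (the error-handling strategy is illustrative, not prescribed by the library):

```elixir
# Sketch: run the chain and match on the spec's return values.
# On success, content is the (possibly schema-validated) response.
case Magus.AgentChain.run(chain) do
  {:ok, content, %LangChain.Message{} = _last_message} ->
    content

  {:error, reason} ->
    # reason may be a binary or a list (e.g. schema validation errors)
    raise "AgentChain run failed: #{inspect(reason)}"
end
```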