LLMAgent Architecture
This guide provides an overview of the LLMAgent architecture, explaining how it builds on AgentForge's signal-driven design while providing LLM-specific abstractions.
Core Principles
LLMAgent is designed with the following principles in mind:
- LLM-Specific Abstractions: Create patterns optimized for LLM interactions
- Separation of Concerns: Clearly delineate LLM logic from infrastructure
- Elixir Ecosystem Integration: Leverage the strengths of Elixir/OTP
- Lightweight Implementation: Maintain a clean, minimal codebase
- Testability: Ensure components can be tested in isolation
System Architecture
LLMAgent extends AgentForge's signal-driven architecture with components specifically designed for LLM interactions:
Core Components
- Signals: Represent events in the agent lifecycle
- Handlers: Process signals and update state
- Store: Manages conversation state and history in GenServer processes, leveraging Elixir's OTP capabilities for centralized state management
- Flows: Combine handlers into coherent sequences
- Providers: Interface with LLM backends
- Tools: External capabilities the agent can use
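To make the Store component concrete, here is a minimal sketch of a conversation store built on a GenServer. The module and function names are illustrative assumptions for this guide, not LLMAgent's actual API:

```elixir
# A minimal sketch of a conversation Store as a GenServer.
# SketchStore and its functions are illustrative, not LLMAgent's actual API.
defmodule SketchStore do
  use GenServer

  def start_link(opts \\ []),
    do: GenServer.start_link(__MODULE__, %{history: []}, opts)

  # Append a message to the conversation history.
  def add_message(pid, role, content),
    do: GenServer.cast(pid, {:add_message, %{role: role, content: content}})

  # Read the history back in chronological order.
  def history(pid), do: GenServer.call(pid, :history)

  @impl true
  def init(state), do: {:ok, state}

  @impl true
  def handle_cast({:add_message, msg}, state),
    do: {:noreply, %{state | history: [msg | state.history]}}

  @impl true
  def handle_call(:history, _from, state),
    do: {:reply, Enum.reverse(state.history), state}
end
```

Because the store is a process, many conversations can run concurrently, each with its own isolated state, which is the OTP strength the Store component leans on.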
Signal Flow
The typical flow of a conversation follows these steps:
- User input is converted to a :user_message signal
- The message handler processes the user message and generates a :thinking signal
- The thinking handler calls the LLM and decides whether to use a tool or generate a response
- If using a tool, it generates a :tool_call signal
- The tool handler executes the tool and generates a :tool_result signal
- The tool result handler incorporates the result and generates a new :thinking signal
- Eventually, the LLM generates a response, creating a :response signal
- The response handler formats and returns the final response
Component Diagram
Here is a detailed component diagram showing the relationships between the main modules:
Dynamic Workflow Orchestration
One of LLMAgent's key strengths is its ability to support truly dynamic workflows that emerge from LLM decisions, rather than being hardcoded in advance:
Workflow Emergence
In a traditional application, the sequence of operations is predetermined by the developer. With LLMAgent, the workflow emerges dynamically:
- Context-Based Decisions: The LLM analyzes user inputs and current state to decide the next steps
- Tool Selection: Tools are chosen dynamically based on the specific needs of the current task
- Multi-Step Processing: Complex tasks are broken down into sequences of operations
- Adaptive Responses: The system adjusts based on intermediate results and user feedback
Example: Investment Portfolio Creation
The investment portfolio example demonstrates this dynamic capability:
This workflow was not predefined; it emerged from the LLM's analysis of user requests and available tools. The same architecture can support completely different workflows in other domains without changing the underlying code.
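A hypothetical version of such an emergent plan can be sketched by stubbing the LLM's decision step. The tool names and keyword matching here are invented for illustration; a real agent would obtain each decision from the LLM:

```elixir
# Sketch of workflow emergence: the "LLM decision" is stubbed with keyword
# matching, and the tool names are invented for illustration.
defmodule SketchPlanner do
  # Stand-in for the LLM deciding the next step from the current context.
  defp decide(context) do
    cond do
      context =~ "screen" -> {:tool_call, "screen_stocks"}
      context =~ "portfolio" -> {:tool_call, "build_portfolio"}
      true -> {:response, "Here is my answer."}
    end
  end

  # The sequence of tool calls emerges from per-request decisions,
  # not from a hardcoded plan.
  def plan(requests), do: Enum.map(requests, &decide/1)
end
```

Feeding in different requests yields a different sequence of :tool_call and :response signals, which is the sense in which the workflow emerges rather than being predetermined.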
Extension Points
LLMAgent is designed to be extended in several ways:
- Custom Handlers: Create specialized handlers for domain-specific signals
- Custom Flows: Combine handlers in new ways for different interaction patterns
- Custom Tools: Add new capabilities to your agent
- Provider Plugins: Integrate with different LLM backends
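As a sketch of the custom-tool extension point, a tool can be modeled as a plain map carrying a name, a description for the LLM, and an execute function. This shape is an assumption for illustration, not LLMAgent's registration API:

```elixir
# Sketch of the custom-tool extension point; the map shape is an
# assumption for illustration, not LLMAgent's registration API.
defmodule SketchTools do
  # A tool: a name, a description the LLM reads when deciding
  # whether to call it, and a function that does the work.
  def weather_tool do
    %{
      name: "get_weather",
      description: "Returns current weather for a city",
      execute: fn %{"city" => city} -> "Sunny in #{city}" end
    }
  end

  # Invoke a tool with the arguments the LLM supplied.
  def call(tool, args), do: tool.execute.(args)
end
```

Keeping tools as data makes them easy to list, describe to the LLM, and swap out per agent.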
Design Decisions
Why Signal-Driven Architecture?
The signal-driven architecture provides several benefits:
- Composability: Handlers can be combined in flexible ways
- Testability: Each component can be tested in isolation
- Extensibility: New signals and handlers can be added without changing existing code
- Visibility: The flow of information is explicit and traceable
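The testability benefit follows directly from handlers being plain functions: they can be exercised with ExUnit and no LLM, store, or flow running. The handler below is illustrative, not part of LLMAgent:

```elixir
# Handlers are plain functions, so they can be unit-tested in isolation.
# This handler and test are illustrative, not part of LLMAgent.
ExUnit.start()

defmodule HandlerTest do
  use ExUnit.Case

  # Handler under test: turns a user message into a :thinking signal.
  def message_handler({:user_message, text}, state),
    do: {{:thinking, text}, state}

  test "emits a :thinking signal without touching state" do
    assert {{:thinking, "hi"}, %{}} == message_handler({:user_message, "hi"}, %{})
  end
end
```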
Why Elixir?
Elixir's functional nature, pattern matching, and supervision trees make it ideal for building reliable, maintainable agent systems:
- Immutable State: Ensures predictable state transitions
- Pattern Matching: Makes signal handling elegant and explicit
- Concurrency: Allows handling multiple conversations efficiently
- Fault Tolerance: Supervisors can restart failed components
Next Steps
- Explore tool integration
- Learn how to create custom agents
- Understand LLM provider integration