Luagents



A ReAct (Reasoning and Acting) agent implementation in Elixir that thinks using Lua code. Inspired by smolagents.

Overview

Luagents implements a ReAct loop where:

  1. The agent receives a task from the user
  2. It uses an LLM (Large Language Model) to generate Lua code that reasons through the problem step by step
  3. The Lua code can call tools and reason about their results
  4. The agent continues iterating until it produces a final answer (see the sketch after this list)
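
Conceptually, the loop looks something like the sketch below. This is only an illustrative outline, not the library's internals; generate_lua and execute_lua are hypothetical functions standing in for the LLM request and the Lua interpreter step.

defmodule ReactSketch do
  # Illustrative sketch of the ReAct loop; not the library's real implementation.
  # `generate_lua` and `execute_lua` are hypothetical functions standing in for
  # the LLM request and the Lua interpreter step.
  def loop(agent, task, generate_lua, execute_lua, history \\ [], iteration \\ 1) do
    if iteration > agent.max_iterations do
      {:error, :max_iterations_reached}
    else
      # Ask the LLM for the next chunk of Lua "thinking" code
      lua_code = generate_lua.(agent.llm, [task | history])

      # Run it; the Lua code may call tools and record observations
      case execute_lua.(agent, lua_code) do
        # final_answer(...) in the Lua code ends the loop
        {:final_answer, answer} -> {:ok, answer}

        # Otherwise, feed the observation back and keep iterating
        {:observation, note} ->
          loop(agent, task, generate_lua, execute_lua, [note | history], iteration + 1)
      end
    end
  end
end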

Supported LLM Providers

Luagents supports multiple LLM providers through a pluggable architecture:

  • Anthropic Claude (via Anthropix) - Cloud-based, high-quality responses
  • Ollama - Local models, privacy-focused, customizable

Important Note: This agent's performance is directly tied to the quality of the underlying language model. Using a model with strong reasoning and coding capabilities is essential for reliable results.

Installation

Add luagents to your dependencies in mix.exs:

def deps do
  [
    {:luagents, "~> 0.1.0"}
  ]
end
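
Then fetch the new dependency:

mix deps.get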

Configuration

Anthropic Claude

Set your Anthropic API key using one of these methods:

  1. Environment variable (recommended):

    export ANTHROPIC_API_KEY=your-api-key
    
  2. Pass directly when creating an agent:

    agent = Luagents.Agent.new(
      llm: Luagents.LLM.new(provider: :anthropic, api_key: "your-api-key")
    )
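
You can also combine both approaches: read the key from the environment at runtime and pass it through the same api_key option. System.fetch_env!/1 is standard Elixir; the option is the one documented above.

# Read the key from the environment and pass it explicitly
api_key = System.fetch_env!("ANTHROPIC_API_KEY")

agent = Luagents.Agent.new(
  llm: Luagents.LLM.new(provider: :anthropic, api_key: api_key)
)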

Ollama

  1. Install and run Ollama
  2. Pull a model (e.g., ollama pull mistral)
  3. Create an agent with the Ollama provider:

    agent = Luagents.Agent.new(
      llm: Luagents.LLM.new(provider: :ollama, model: "mistral")
    )
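
Once the agent from step 3 is created, running a task works the same way as with the Anthropic provider; answer quality depends on the model you pulled.

{:ok, result} = Luagents.Agent.run(agent, "What is the capital of France?")

# Inspect the agent's final answer
IO.inspect(result)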

Usage

Basic Usage

# Create an agent with Anthropic Claude (default)
agent = Luagents.Agent.new()

# Or specify the provider explicitly  
agent = Luagents.Agent.new(
  llm: Luagents.LLM.new(provider: :anthropic)
)

# Use Ollama instead
agent = Luagents.Agent.new(
  llm: Luagents.LLM.new(provider: :ollama, model: "mistral")
)

# Run a task
{:ok, result} = Luagents.Agent.run(agent, "What is 2 + 2?")
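
For anything beyond a quick script, consider handling failures (for example, the LLM erroring out or the agent hitting max_iterations) instead of matching only on {:ok, result}. The error tuple below is an assumption for illustration; check the return values in your version's docs.

case Luagents.Agent.run(agent, "Summarize the ReAct pattern in one sentence.") do
  {:ok, answer} ->
    IO.puts("Agent answered: #{answer}")

  # Assumed error shape -- adjust to the library's actual return value
  {:error, reason} ->
    IO.puts("Agent failed: #{inspect(reason)}")
end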

Advanced Usage

# Anthropic Claude with an explicit model
anthropic_agent = Luagents.Agent.new(
  name: "Claude Agent",
  llm: Luagents.LLM.new(provider: :anthropic, model: "claude-3-sonnet-20240229")
)

# A local Ollama model with a custom host
ollama_agent = Luagents.Agent.new(
  name: "Ollama Agent",
  llm: Luagents.LLM.new(provider: :ollama, model: "mistral", host: "http://localhost:11434")
)

# Custom name, iteration limit, and LLM options
agent = Luagents.Agent.new(
  name: "MyAgent",
  max_iterations: 10,
  llm: Luagents.LLM.new(
    model: "claude-3-opus-20240229",
    temperature: 0.5
  )
)

{:ok, result} = Luagents.run_with_agent(agent, "Hello!")
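
Because agents are just values returned by Luagents.Agent.new/1, you can build several up front and pick one at runtime, for example falling back to the local Ollama agent when no Anthropic key is configured. The selection logic here is only illustrative:

# Prefer Claude when an API key is available, otherwise use the local model
agent =
  if System.get_env("ANTHROPIC_API_KEY") do
    anthropic_agent
  else
    ollama_agent
  end

{:ok, result} = Luagents.Agent.run(agent, "Compare lists and tuples in Elixir.")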

Example Lua Thinking

The agent thinks in Lua code like this:

thought("I need to perform a calculation")

local a = 10
local b = 20
local sum = add(a, b)

observation("The sum is " .. sum)

local result = multiply(sum, 2)
thought("Multiplying by 2 gives " .. result)

final_answer("The result is " .. result)

Contributing

Before opening a pull request, please open an issue first.

git clone https://github.com/doomspork/luagents.git
cd luagents 
mix deps.get
mix test

Once you've made your additions and mix test passes, go ahead and open a PR!

License

Luagents is Copyright © 2025 doomspork. It is free software, and may be redistributed under the terms specified in the LICENSE file.