Macula Neuroevolution

Population-based evolutionary training for neural networks.

Overview

macula_neuroevolution is an Erlang library that provides domain-agnostic population-based evolutionary training for neural networks. It works with macula_tweann to evolve network weights through selection, crossover, and mutation.

[Architecture overview diagram]

Features

  • Population Management - Maintain and evolve populations of neural networks
  • Parallel Evaluation - Concurrent fitness evaluation using Erlang processes
  • Sexual Reproduction - Uniform crossover of parent weights
  • Mutation Operators - Weight perturbation with configurable rate/strength
  • Selection Strategies - Top-N%, tournament, roulette wheel
  • LTC Meta-Controller - Adaptive hyperparameter optimization using Liquid Time-Constant networks
  • Lineage Tracking - Track parent1_id, parent2_id, generation_born
  • Event Callbacks - Pluggable event handling for UI updates
  • Target Fitness - Automatic stopping when fitness threshold is reached
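
For intuition, the two reproduction operators reduce to a few lines over weight lists. A minimal sketch, assuming flat lists of floats (the library applies these operators to macula_tweann networks, and its exact operator details may differ):

%% Uniform crossover: each child weight is taken from either parent with
%% equal probability.
crossover(WeightsA, WeightsB) ->
    [case rand:uniform(2) of 1 -> A; _ -> B end
     || {A, B} <- lists:zip(WeightsA, WeightsB)].

%% Weight perturbation: each weight mutates with probability Rate, shifted
%% by Gaussian noise scaled by Strength.
mutate(Weights, Rate, Strength) ->
    [case rand:uniform() < Rate of
         true  -> W + Strength * rand:normal();
         false -> W
     end || W <- Weights].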

The Liquid Conglomerate Vision

This library implements the first level of a hierarchical meta-learning system called the Liquid Conglomerate:

Liquid Conglomerate

The Liquid Conglomerate is a novel architecture that uses hierarchical Liquid Time-Constant (LTC) neural networks to create a self-optimizing training system. Instead of manually tuning hyperparameters, the system learns how to learn at multiple timescales:

  • Level 0 (fast tau): Task networks react to immediate domain state
  • Level 1 (medium tau): Meta-controller adapts hyperparameters per generation
  • Level 2+ (slow tau): Higher-order controllers learn meta-strategies
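
At its core, each level is a leaky integration whose speed is set by its time constant. A minimal Euler-step sketch (illustrative only; real LTC neurons, as in macula_tweann, make the time constant input-dependent):

%% Leaky-integrator step: dX/dt = (Input - X) / Tau, Euler-integrated.
%% A small Tau (Level 0) tracks its input within a few steps; a large Tau
%% (Level 2+) changes slowly, integrating over long horizons.
ltc_step(X, Input, Tau, Dt) ->
    X + Dt * (Input - X) / Tau.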

Key effects on training:

  1. Self-tuning hyperparameters - Mutation rate, selection ratio adapt automatically
  2. Automatic stagnation recovery - Detects and escapes local optima
  3. Phase-appropriate strategies - Different strategies for exploration vs exploitation
  4. Transfer of meta-knowledge - Training strategies can transfer across domains
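
As a hypothetical sketch of effects 1 and 2 (the library's LTC controller learns this behaviour rather than hard-coding it): raise the mutation rate when best fitness stagnates, decay it while progress continues.

%% Hypothetical stagnation-recovery rule, not the actual meta-controller.
%% BestHistory is a newest-first list of best fitness per generation.
adapt_mutation_rate(Rate, BestHistory) ->
    case stagnated(BestHistory) of
        true  -> min(Rate * 1.5, 0.5);    % stuck: mutate more, explore
        false -> max(Rate * 0.95, 0.05)   % improving: mutate less, exploit
    end.

stagnated([Latest | Rest]) when length(Rest) >= 5 ->
    Latest =< lists:nth(5, Rest);  % no gain over the last 5 generations
stagnated(_) ->
    false.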

See The Liquid Conglomerate Guide for the full explanation, or LTC Meta-Controller for implementation details.

[Evolution lifecycle diagram]

Installation

Add to your rebar.config:

{deps, [
    {macula_neuroevolution, "~> 0.12.0"}
]}.

Quick Start

%% Define your evaluator module (implements the neuroevolution_evaluator behaviour)
-module(my_evaluator).
-behaviour(neuroevolution_evaluator).
-export([evaluate/2]).

%% The #individual and #neuro_config records come from the library's include
%% files; pull them in as your project layout requires.
evaluate(Individual, _Options) ->
    Network = Individual#individual.network,
    %% Run your domain-specific evaluation
    Score = run_simulation(Network),
    UpdatedIndividual = Individual#individual{
        metrics = #{total_score => Score}
    },
    {ok, UpdatedIndividual}.

%% Start training
Config = #neuro_config{
    population_size = 50,
    selection_ratio = 0.20,
    mutation_rate = 0.10,
    mutation_strength = 0.3,
    network_topology = {42, [16, 8], 6},  % 42 inputs, hidden layers of 16 and 8, 6 outputs
    evaluator_module = my_evaluator
},

{ok, Pid} = neuroevolution_server:start_link(Config),
neuroevolution_server:start_training(Pid).
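
Training then runs generation after generation until max_generations is reached or the configured target fitness is achieved.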

Configuration

Parameter                  | Default   | Description
---------------------------|-----------|-------------------------------------------
population_size            | 50        | Number of individuals
evaluations_per_individual | 10        | Games/tests per individual per generation
selection_ratio            | 0.20      | Fraction of population that survives (top 20%)
mutation_rate              | 0.10      | Probability of mutating each weight
mutation_strength          | 0.3       | Magnitude of weight perturbation
max_generations            | infinity  | Maximum generations to run
network_topology           | -         | {InputSize, HiddenLayers, OutputSize}
evaluator_module           | -         | Module implementing neuroevolution_evaluator
evaluator_options          | #{}       | Options passed to evaluator
event_handler              | undefined | {Module, InitArg} for event notifications
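
For example, a bounded run that overrides several defaults (my_evaluator as in the Quick Start) might look like this:

Config = #neuro_config{
    population_size = 100,
    evaluations_per_individual = 5,
    selection_ratio = 0.25,
    mutation_rate = 0.05,
    mutation_strength = 0.2,
    max_generations = 200,
    network_topology = {42, [16, 8], 6},
    evaluator_module = my_evaluator,
    evaluator_options = #{games => 5}
}.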

Event Handling

Subscribe to training events by providing an event handler:

-module(my_event_handler).
-export([handle_event/2]).

handle_event({generation_started, Gen}, _State) ->
    io:format("Generation ~p started~n", [Gen]);
handle_event({generation_complete, Stats}, _State) ->
    io:format("Generation ~p: Best=~.2f, Avg=~.2f~n",
              [Stats#generation_stats.generation,
               Stats#generation_stats.best_fitness,
               Stats#generation_stats.avg_fitness]);
handle_event(_Event, _State) ->
    ok.

%% Configure with event handler
Config = #neuro_config{
    %% ... other options ...
    event_handler = {my_event_handler, undefined}
}.
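
Handlers are ordinary modules, so they can do more than print. For instance, a sketch that appends one CSV row per generation (the file name is arbitrary; only the events and record fields shown above are used):

%% Append per-generation stats to a CSV file.
handle_event({generation_complete, Stats}, _State) ->
    Row = io_lib:format("~p,~.2f,~.2f~n",
                        [Stats#generation_stats.generation,
                         Stats#generation_stats.best_fitness,
                         Stats#generation_stats.avg_fitness]),
    ok = file:write_file("training_log.csv", Row, [append]);
handle_event(_Event, _State) ->
    ok.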

Custom Evaluators

Implement the neuroevolution_evaluator behaviour:

-module(snake_game_evaluator).
-behaviour(neuroevolution_evaluator).
-export([evaluate/2, calculate_fitness/1]).

%% Per-game result record, local to this example's domain code
-record(result, {score, ticks, won}).

%% Required callback
evaluate(Individual, Options) ->
    Network = Individual#individual.network,
    NumGames = maps:get(games, Options, 10),

    %% Play multiple games (play_game/1 is your domain code) and aggregate results
    Results = [play_game(Network) || _ <- lists:seq(1, NumGames)],

    TotalScore = lists:sum([R#result.score || R <- Results]),
    TotalTicks = lists:sum([R#result.ticks || R <- Results]),
    Wins = length([R || R <- Results, R#result.won]),

    UpdatedIndividual = Individual#individual{
        metrics = #{
            total_score => TotalScore,
            total_ticks => TotalTicks,
            wins => Wins
        }
    },
    {ok, UpdatedIndividual}.

%% Optional callback for custom fitness calculation
calculate_fitness(Metrics) ->
    Score = maps:get(total_score, Metrics, 0),
    Ticks = maps:get(total_ticks, Metrics, 0),
    Wins = maps:get(wins, Metrics, 0),
    Score * 50.0 + Ticks / 50.0 + Wins * 2.0.
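
The games count above is read from evaluator_options, so the same evaluator can be tuned per run:

Config = #neuro_config{
    %% ... topology, rates, and other settings ...
    evaluator_module = snake_game_evaluator,
    evaluator_options = #{games => 20}   % each individual plays 20 games
}.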

Building

rebar3 compile
rebar3 eunit
rebar3 dialyzer

Academic References

Evolutionary Algorithms

  • Holland, J.H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press.
    • Foundational text on genetic algorithms and evolutionary computation.
  • Goldberg, D.E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
    • Comprehensive coverage of genetic algorithm theory and practice.

Macula Ecosystem

  • macula_tweann - Neural network library with topology evolution, LTC neurons, and ONNX export. Core dependency for network creation and evaluation.
  • macula - HTTP/3 mesh networking platform enabling distributed neuroevolution across edge devices with NAT traversal.

Related Projects

  • DXNN2 - Gene Sher's original TWEANN implementation in Erlang.
  • NEAT-Python - Popular Python NEAT implementation with extensive documentation.
  • OpenAI ES - OpenAI's evolution strategies implementation for RL.
  • EvoTorch - Modern PyTorch-based evolutionary algorithm library.
  • DEAP - Distributed Evolutionary Algorithms in Python.

License

Apache License 2.0