Macula Neuroevolution
Population-based evolutionary training for neural networks.
Overview
macula_neuroevolution is an Erlang library that provides domain-agnostic population-based evolutionary training for neural networks. It works with macula_tweann to evolve network weights through selection, crossover, and mutation.
Features
- Population Management - Maintain and evolve populations of neural networks
- Parallel Evaluation - Concurrent fitness evaluation using Erlang processes
- Sexual Reproduction - Uniform crossover of parent weights
- Mutation Operators - Weight perturbation with configurable rate/strength
- Selection Strategies - Top-N%, tournament, roulette wheel
- LTC Meta-Controller - Adaptive hyperparameter optimization using Liquid Time-Constant networks
- Lineage Tracking - Track parent1_id, parent2_id, generation_born
- Event Callbacks - Pluggable event handling for UI updates
- Target Fitness - Automatic stopping when fitness threshold is reached
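The mutation and crossover operators listed above can be sketched in plain Erlang. This is an illustrative example only: the module name `evolution_sketch` and these signatures are hypothetical, not the library's API, and the library operates on `macula_tweann` networks rather than flat weight lists.

```erlang
-module(evolution_sketch).
-export([mutate/3, crossover/2]).

%% Perturb each weight with probability Rate by a value drawn
%% uniformly from [-Strength, Strength].
mutate(Weights, Rate, Strength) ->
    [case rand:uniform() < Rate of
         true  -> W + (rand:uniform() * 2.0 - 1.0) * Strength;
         false -> W
     end || W <- Weights].

%% Uniform crossover: each child weight comes from either parent
%% with equal probability.
crossover(ParentA, ParentB) ->
    [case rand:uniform() < 0.5 of
         true  -> Wa;
         false -> Wb
     end || {Wa, Wb} <- lists:zip(ParentA, ParentB)].
```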
The Liquid Conglomerate Vision
This library implements the first level of a hierarchical meta-learning system called the Liquid Conglomerate:
The Liquid Conglomerate is a novel architecture that uses hierarchical Liquid Time-Constant (LTC) neural networks to create a self-optimizing training system. Instead of manually tuning hyperparameters, the system learns how to learn at multiple timescales:
- Level 0 (fast tau): Task networks react to immediate domain state
- Level 1 (medium tau): Meta-controller adapts hyperparameters per generation
- Level 2+ (slow tau): Higher-order controllers learn meta-strategies
Key effects on training:
- Self-tuning hyperparameters - Mutation rate, selection ratio adapt automatically
- Automatic stagnation recovery - Detects and escapes local optima
- Phase-appropriate strategies - Different strategies for exploration vs exploitation
- Transfer of meta-knowledge - Training strategies can transfer across domains
See The Liquid Conglomerate Guide for the full explanation, or LTC Meta-Controller for implementation details.
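To make the Level-1 idea concrete, here is a deliberately simplified sketch of stagnation-driven hyperparameter adaptation. The module name `lc_sketch`, the function, and the constants are all hypothetical; the actual meta-controller uses an LTC network rather than fixed rules.

```erlang
-module(lc_sketch).
-export([adapt_mutation_rate/3]).

%% If best fitness has not improved since the last generation,
%% raise the mutation rate (explore more); otherwise decay it
%% (exploit the current region). Bounds are illustrative.
adapt_mutation_rate(Rate, BestNow, BestPrev) when BestNow =< BestPrev ->
    min(0.5, Rate * 1.5);
adapt_mutation_rate(Rate, _BestNow, _BestPrev) ->
    max(0.01, Rate * 0.9).
```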
Installation
Add to your rebar.config:
```erlang
{deps, [
    {macula_neuroevolution, "~> 0.12.0"}
]}.
```

Quick Start
```erlang
%% Define your evaluator module (implements the neuroevolution_evaluator behaviour)
-module(my_evaluator).
-behaviour(neuroevolution_evaluator).
-export([evaluate/2]).

evaluate(Individual, _Options) ->
    Network = Individual#individual.network,
    %% Run your domain-specific evaluation
    Score = run_simulation(Network),
    UpdatedIndividual = Individual#individual{
        metrics = #{total_score => Score}
    },
    {ok, UpdatedIndividual}.
```

```erlang
%% Start training
Config = #neuro_config{
    population_size   = 50,
    selection_ratio   = 0.20,
    mutation_rate     = 0.10,
    mutation_strength = 0.3,
    network_topology  = {42, [16, 8], 6}, %% 42 inputs, two hidden layers (16 and 8 neurons), 6 outputs
    evaluator_module  = my_evaluator
},
{ok, Pid} = neuroevolution_server:start_link(Config),
neuroevolution_server:start_training(Pid).
```

Configuration
| Parameter | Default | Description |
|---|---|---|
| `population_size` | 50 | Number of individuals |
| `evaluations_per_individual` | 10 | Games/tests per individual per generation |
| `selection_ratio` | 0.20 | Fraction of the population that survives (top 20%) |
| `mutation_rate` | 0.10 | Probability of mutating each weight |
| `mutation_strength` | 0.3 | Magnitude of weight perturbation |
| `max_generations` | infinity | Maximum generations to run |
| `network_topology` | - | `{InputSize, HiddenLayers, OutputSize}` |
| `evaluator_module` | - | Module implementing `neuroevolution_evaluator` |
| `evaluator_options` | `#{}` | Options passed to the evaluator |
| `event_handler` | `undefined` | `{Module, InitArg}` for event notifications |
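For reference, a configuration that spells out every documented parameter with its default looks like this (the record fields match the table above; values other than the defaults shown are placeholders you would replace):

```erlang
Config = #neuro_config{
    population_size            = 50,
    evaluations_per_individual = 10,
    selection_ratio            = 0.20,
    mutation_rate              = 0.10,
    mutation_strength          = 0.3,
    max_generations            = infinity,
    network_topology           = {42, [16, 8], 6},  %% required, no default
    evaluator_module           = my_evaluator,      %% required, no default
    evaluator_options          = #{},
    event_handler              = undefined
}.
```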
Event Handling
Subscribe to training events by providing an event handler:
```erlang
-module(my_event_handler).
-export([handle_event/2]).

handle_event({generation_started, Gen}, _State) ->
    io:format("Generation ~p started~n", [Gen]);
handle_event({generation_complete, Stats}, _State) ->
    io:format("Generation ~p: Best=~.2f, Avg=~.2f~n",
              [Stats#generation_stats.generation,
               Stats#generation_stats.best_fitness,
               Stats#generation_stats.avg_fitness]);
handle_event(_Event, _State) ->
    ok.
```

```erlang
%% Configure with event handler
Config = #neuro_config{
    %% ... other options ...
    event_handler = {my_event_handler, undefined}
}.
```

Custom Evaluators
Implement the neuroevolution_evaluator behaviour:
```erlang
-module(snake_game_evaluator).
-behaviour(neuroevolution_evaluator).
-export([evaluate/2, calculate_fitness/1]).

%% Required callback
evaluate(Individual, Options) ->
    Network = Individual#individual.network,
    NumGames = maps:get(games, Options, 10),
    %% Play multiple games and aggregate results
    Results = [play_game(Network) || _ <- lists:seq(1, NumGames)],
    TotalScore = lists:sum([R#result.score || R <- Results]),
    TotalTicks = lists:sum([R#result.ticks || R <- Results]),
    Wins = length([R || R <- Results, R#result.won]),
    UpdatedIndividual = Individual#individual{
        metrics = #{
            total_score => TotalScore,
            total_ticks => TotalTicks,
            wins => Wins
        }
    },
    {ok, UpdatedIndividual}.

%% Optional callback for custom fitness calculation
calculate_fitness(Metrics) ->
    Score = maps:get(total_score, Metrics, 0),
    Ticks = maps:get(total_ticks, Metrics, 0),
    Wins = maps:get(wins, Metrics, 0),
    Score * 50.0 + Ticks / 50.0 + Wins * 2.0.
```

Building
```shell
rebar3 compile
rebar3 eunit
rebar3 dialyzer
```
Academic References
Evolutionary Algorithms
Holland, J.H. (1975). Adaptation in Natural and Artificial Systems. MIT Press.
- Foundational text on genetic algorithms and evolutionary computation.
Goldberg, D.E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
- Comprehensive coverage of genetic algorithm theory and practice.
Neuroevolution
Yao, X. (1999). Evolving Artificial Neural Networks. Proceedings of the IEEE, 87(9), 1423-1447.
- Comprehensive survey of neuroevolution approaches and taxonomy.
Stanley, K.O. & Miikkulainen, R. (2002). Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation, 10(2), 99-127.
- NEAT paper introducing speciation for topology evolution (applicable to weight evolution too).
Sher, G.I. (2013). Handbook of Neuroevolution Through Erlang. Springer.
- Erlang-specific neuroevolution patterns and DXNN2 architecture.
Selection & Breeding
Miller, B.L. & Goldberg, D.E. (1995). Genetic Algorithms, Tournament Selection, and the Effects of Noise. Complex Systems, 9(3), 193-212.
- Tournament selection analysis and noise resistance.
Goldberg, D.E. & Richardson, J. (1987). Genetic Algorithms with Sharing for Multimodal Function Optimization. Proceedings of the Second International Conference on Genetic Algorithms.
- Fitness sharing and speciation for maintaining diversity.
Fitness Evaluation
Jin, Y. (2005). A Comprehensive Survey of Fitness Approximation in Evolutionary Computation. Soft Computing, 9(1), 3-12.
- Survey of fitness approximation techniques for expensive evaluations.
Related Projects
Macula Ecosystem
macula_tweann - Neural network library with topology evolution, LTC neurons, and ONNX export. Core dependency for network creation and evaluation.
macula - HTTP/3 mesh networking platform enabling distributed neuroevolution across edge devices with NAT traversal.
Inspiration & Related Work
DXNN2 - Gene Sher's original TWEANN implementation in Erlang.
NEAT-Python - Popular Python NEAT implementation with extensive documentation.
OpenAI ES - OpenAI's evolution strategies implementation for RL.
EvoTorch - Modern PyTorch-based evolutionary algorithm library.
DEAP - Distributed Evolutionary Algorithms in Python.
Guides
Getting Started
- Overview - Architecture and core concepts
- Getting Started - Quick setup guide
- Custom Evaluators - Implement your own evaluator
Advanced Topics
- The Liquid Conglomerate - Hierarchical meta-learning explained
- LTC Meta-Controller - Adaptive hyperparameter optimization
- Cooperative Silos - Cross-silo communication in LC v2
- Topology Evolution Roadmap - TWEANN integration plans
- Interoperability - Export formats and cross-language deployment
- Inference Scenarios - Production deployment patterns
- Swarm Robotics - Coordinated autonomous systems
- Evolution Strategies - Pluggable evolution algorithms
License
Apache License 2.0
Links
- Documentation
- GitHub
- macula_tweann - Neural network library