Topology Evolution Roadmap
This document outlines the planned integration of full TWEANN (Topology and Weight Evolving Artificial Neural Networks) capabilities into macula_neuroevolution, and how the Liquid Conglomerate meta-learning architecture will control topology evolution.
Current State
Currently, macula_neuroevolution uses fixed-topology networks where:
- Network structure is defined at initialization as `{InputSize, HiddenLayers, OutputSize}` (see the example after this list)
- Only weights are evolved through crossover and mutation
- All individuals in the population have identical topology
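As a concrete illustration, a fixed-topology specification might look like the snippet below. The tuple shape comes from the description above; the assumption that `HiddenLayers` is a list of layer sizes is ours:

```erlang
%% Fixed topology: 4 inputs, two hidden layers of 8 neurons, 2 outputs.
%% Only the weights of this structure are evolved; the shape never changes.
Topology = {4, [8, 8], 2}.
```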
The underlying macula_tweann library already supports full topology evolution operators:
| Operator | Description |
|---|---|
| `add_neuron/1` | Insert neuron into existing connection |
| `add_outlink/1` | Add output connection from neuron |
| `add_inlink/1` | Add input connection to neuron |
| `outsplice/1` | Split output connection with new neuron |
| `add_sensorlink/1` | Connect sensor to neuron |
| `add_actuatorlink/1` | Connect neuron to actuator |
| `add_bias/1` | Add bias connection to neuron |
| `add_sensor/1` | Add new sensor (planned) |
| `add_actuator/1` | Add new actuator (planned) |
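As an illustration, a topology-mutation step could select one of these operators at random and apply it to an agent. Only the operator names and arities come from the table; the module name and the argument shape (an agent identifier) are assumptions here:

```erlang
%% Illustrative dispatch only: the module name and argument are assumed.
random_topology_mutation(AgentId) ->
    Operators = [add_neuron, add_outlink, add_inlink, outsplice,
                 add_sensorlink, add_actuatorlink, add_bias],
    Op = lists:nth(rand:uniform(length(Operators)), Operators),
    apply(macula_tweann, Op, [AgentId]).
```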
NEAT-Inspired Topology Evolution
The planned topology evolution follows the NEAT (NeuroEvolution of Augmenting Topologies) approach by Stanley and Miikkulainen (2002):
Core Principles
- Start Minimal - Networks begin with minimal structure (direct input-output connections)
- Complexify Gradually - Structure grows through mutations over generations
- Protect Innovation - New structures need time to optimize their weights
- Speciation - Similar topologies compete primarily with each other
Innovation Numbers
Each structural change (new connection, new node) receives a unique innovation number:
Connection Gene:

```
{from: 1, to: 4, weight: 0.5, enabled: true, innovation: 42}
```

Innovation numbers enable:
- Alignment during crossover (match genes by innovation)
- Distance calculation for speciation
- Historical tracking of structural changes
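A minimal sketch of innovation tracking, assuming an ETS table and that a structural change is keyed by its `{From, To}` endpoints; reusing the same number for an identical change keeps crossover alignment consistent:

```erlang
%% Hypothetical innovation registry backed by ETS.
init_innovations() ->
    ets:new(innovations, [named_table, set, public]),
    ets:insert(innovations, {counter, 0}).

%% Return the existing innovation number for this structural change,
%% or mint a fresh one if it has never been seen before.
innovation_for(From, To) ->
    case ets:lookup(innovations, {From, To}) of
        [{_, Innov}] ->
            Innov;
        [] ->
            Innov = ets:update_counter(innovations, counter, 1),
            ets:insert(innovations, {{From, To}, Innov}),
            Innov
    end.
```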
Liquid Conglomerate Integration
The Liquid Conglomerate meta-controller will extend to control topology evolution parameters dynamically.
Extended Meta-Controller Outputs
Current outputs (weight evolution):
- `mutation_rate` - Probability of weight mutation
- `mutation_strength` - Magnitude of weight perturbation
- `selection_ratio` - Survival ratio per generation
New outputs (topology evolution):
- `topology_mutation_rate` - Probability of structural mutation
- `add_neuron_rate` - Relative probability of adding neurons
- `add_connection_rate` - Relative probability of adding connections
- `remove_connection_rate` - Relative probability of removing connections
- `complexity_penalty` - Fitness penalty per structural element
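Put together, the extended output vector might decode into a parameter map like the following. The field names are from the lists above; the map layout and the sample values are illustrative:

```erlang
%% Illustrative decoded meta-controller outputs (values are examples).
Params = #{
    mutation_rate          => 0.10,   % weight evolution
    mutation_strength      => 0.30,
    selection_ratio        => 0.50,
    topology_mutation_rate => 0.05,   % topology evolution
    add_neuron_rate        => 0.30,
    add_connection_rate    => 0.50,
    remove_connection_rate => 0.20,
    complexity_penalty     => 0.001
}.
```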
Extended Meta-Controller Inputs
Current inputs (population metrics):
- Best/average/worst fitness
- Fitness improvement rate
- Population diversity
- Stagnation count
New inputs (topology metrics):
- Average network complexity (neuron count, connection count)
- Complexity variance across population
- Topology diversity (unique structures)
- Innovation frequency
- Species count and health
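A sketch of how the first of these metrics could be computed, assuming hypothetical `neuron_count/1` and `connection_count/1` accessors on genomes:

```erlang
%% Average structural complexity across the population.
avg_complexity(Genomes) ->
    N = length(Genomes),
    {Neurons, Conns} =
        lists:foldl(
            fun(G, {AccN, AccC}) ->
                {AccN + neuron_count(G), AccC + connection_count(G)}
            end,
            {0, 0}, Genomes),
    {Neurons / N, Conns / N}.
```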
Adaptive Topology Control
The meta-controller learns when to:
Encourage Complexification
- Fitness plateaus but hasn't peaked
- High diversity in weights but not structure
- Problem appears to need more representational capacity
Discourage Complexification
- Networks growing without fitness improvement
- Over-fitting indicators (train/test divergence)
- Computational budget concerns
Encourage Simplification
- Prune unused connections
- Remove redundant neurons
- Regularize toward minimal effective complexity
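The regularization pressure can be realized through the `complexity_penalty` output; a minimal sketch, again assuming hypothetical count accessors:

```erlang
%% Fitness adjusted by a meta-learned complexity penalty.
adjusted_fitness(RawFitness, Genome, Penalty) ->
    Complexity = neuron_count(Genome) + connection_count(Genome),
    RawFitness - Penalty * Complexity.
```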
Speciation with Liquid Conglomerate
Species as Sub-Populations
Each species maintains its own population dynamics: selection and reproduction happen within the species, so novel topologies get time to optimize their weights before competing with the whole population.
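An illustrative shape for per-species state; the record and its fields are assumptions based on the dynamics described here:

```erlang
%% Hypothetical per-species state.
-record(species, {
    id,
    representative,     % genome used for compatibility comparisons
    members = [],       % genomes currently assigned to this species
    best_fitness = 0.0,
    stagnation = 0      % generations without fitness improvement
}).
```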
Meta-Controller Per Species
The Liquid Conglomerate hierarchy could extend to species-level control, with the global meta-controller allocating resources across species while per-species meta-controllers tune evolution parameters within their own niche.
This enables different evolutionary strategies for different topological niches.
Crossover with Variable Topology
Crossing individuals with different topologies requires aligning their genes and measuring how structurally compatible the parents are.
Gene Alignment
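Genes are lined up by innovation number: genes that share a number are matching and can be inherited from either parent; genes present in only one parent are disjoint if they fall within the other parent's innovation range, and excess if they fall beyond it (Stanley & Miikkulainen, 2002). A one-sided classification sketch, assuming genes are maps carrying an `innovation` key:

```erlang
%% Classify parent A's genes against parent B by innovation number.
classify_genes(GenesA, GenesB) ->
    InnovB = [maps:get(innovation, G) || G <- GenesB],
    MaxB = lists:max([0 | InnovB]),
    lists:foldl(
        fun(G, {Match, Disj, Exc}) ->
            I = maps:get(innovation, G),
            case {lists:member(I, InnovB), I > MaxB} of
                {true, _}      -> {[G | Match], Disj, Exc};  % matching
                {false, false} -> {Match, [G | Disj], Exc};  % disjoint
                {false, true}  -> {Match, Disj, [G | Exc]}   % excess
            end
        end,
        {[], [], []}, GenesA).
```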
Compatibility Distance
```erlang
%% NEAT compatibility distance: c1, c2, c3 weight the contribution of
%% excess genes, disjoint genes, and average weight difference. The
%% defaults below follow Stanley & Miikkulainen (2002).
-record(compat_config, {c1 = 1.0, c2 = 1.0, c3 = 0.4}).

compatibility_distance(Genome1, Genome2, Config) ->
    {Excess, Disjoint, WeightDiff} = compare_genomes(Genome1, Genome2),
    N = max(genome_size(Genome1), genome_size(Genome2)),
    Config#compat_config.c1 * Excess / N +
        Config#compat_config.c2 * Disjoint / N +
        Config#compat_config.c3 * WeightDiff.
```
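As a usage sketch, speciation can assign a genome to the first species whose representative lies within a compatibility threshold; the threshold value is illustrative, and the `#species{}` record is the hypothetical one sketched earlier:

```erlang
%% Illustrative species assignment using compatibility_distance/3.
find_species(Genome, SpeciesList, Config) ->
    Threshold = 3.0,
    Fits = [S || S <- SpeciesList,
                 compatibility_distance(Genome, S#species.representative,
                                        Config) =< Threshold],
    case Fits of
        [S | _] -> {existing, S#species.id};
        []      -> new_species
    end.
```

Implementation Phases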
Phase 1: Foundation
- Add innovation number tracking to networks
- Implement gene-based genome representation
- Add compatibility distance calculation
- Basic speciation without meta-control
Phase 2: Topology Operators
- Integrate `add_neuron`, `add_outlink`, `add_inlink` operators
- Implement connection enable/disable
- Add crossover for variable topologies
- Implement speciation dynamics
Phase 3: Meta-Controller Extension
- Extend meta-controller inputs (complexity metrics)
- Add topology evolution outputs
- Train meta-controller on topology-aware reward signal
- Implement adaptive complexity penalties
Phase 4: Species-Level Meta-Control
- Per-species meta-controllers
- Inter-species resource allocation
- Hierarchical Liquid Conglomerate for species
Expected Benefits
From Topology Evolution
- Automatic Architecture Discovery - No manual network design needed
- Minimal Complexity Bias - Solutions grow only as complex as needed
- Diverse Strategies - Different topologies for different sub-problems
- Incremental Building - Complex solutions built on simpler precursors
From Liquid Conglomerate Control
- Adaptive Complexification - Grow structure when needed, not randomly
- Phase-Appropriate Strategies - Explore structure early, refine late
- Automatic Regularization - Meta-learned complexity penalties
- Transfer of Meta-Knowledge - Learn "how to evolve topology" across domains
Relation to DXNN2
This roadmap draws heavily from Gene Sher's DXNN2 architecture (described in "Handbook of Neuroevolution Through Erlang"). Key concepts from DXNN2:
- Morphology-based sensor/actuator specification
- Constraint records for controlling mutation operators
- Substrate encoding for hypercube geometry
- Exoself architecture for agent lifecycle
The Liquid Conglomerate extends DXNN2's concepts by adding:
- Hierarchical meta-learning at multiple timescales
- LTC neurons for continuous temporal dynamics
- Adaptive tau for timescale self-organization
- Species-level meta-control
References
Stanley, K.O. & Miikkulainen, R. (2002). Evolving Neural Networks through Augmenting Topologies. Evolutionary Computation, 10(2), 99-127.
Sher, G.I. (2013). Handbook of Neuroevolution Through Erlang. Springer.
Hasani, R. et al. (2021). Liquid Time-constant Networks. AAAI.
Next Steps
See The Liquid Conglomerate for the full meta-learning theory, or LTC Meta-Controller for current implementation details.