Diagram Index
Visual guides to understanding TWEANN concepts and architecture.
Core Concepts
TWEANN Structure
The fundamental architecture of a Topology and Weight Evolving Artificial Neural Network.
Key elements:
- Sensors (green): Input layer receiving environmental signals
- Hidden neurons (blue): Processing nodes with evolving topology
- Actuators (orange): Output layer producing actions
- Connections: Weighted links that can be added/removed through evolution
- Cortex: Coordinator process managing sync cycles
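The cortex's coordinating role can be sketched as a simple fan-out/fan-in loop. This is a minimal sketch only: the module name, message shapes, and plain-process style are assumptions for illustration, not the library's actual API.

```erlang
-module(cortex_sketch).
-export([sync_cycle/2]).

%% One sync cycle: the cortex signals every sensor to read the environment,
%% then waits for every actuator to report it has acted, and repeats.
%% The {self(), sync} and {Pid, acted} message formats are assumptions.
sync_cycle(SensorPids, ActuatorPids) ->
    [SPid ! {self(), sync} || SPid <- SensorPids],
    [receive {APid, acted} -> ok end || APid <- ActuatorPids],
    sync_cycle(SensorPids, ActuatorPids).
```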
Neuroevolution Cycle
The iterative process of population-based optimization.
The four phases:
- Population: Collection of neural network genotypes
- Evaluation: Run each phenotype, calculate fitness scores
- Selection: Tournament or truncation selection of survivors
- Reproduction: Mutation, crossover, and elitism to create next generation
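A minimal generational loop tying the four phases together might look like the sketch below. The truncation-selection policy, the fitness callback, and the `mutate/1` placeholder are illustrative assumptions; the real pipeline also includes crossover and elitism.

```erlang
-module(evolution_sketch).
-export([evolve/3]).

%% One generation: evaluate every genotype, keep the top half
%% (truncation selection), refill the population with mutated survivors.
evolve(Population, _FitnessFun, 0) ->
    Population;
evolve(Population, FitnessFun, Generations) ->
    Scored    = [{FitnessFun(G), G} || G <- Population],          % Evaluation
    Ranked    = lists:reverse(lists:keysort(1, Scored)),          % best first
    Survivors = [G || {_F, G} <- lists:sublist(Ranked, length(Ranked) div 2)], % Selection
    Offspring = [mutate(G) || G <- Survivors],                    % Reproduction
    evolve(Survivors ++ Offspring, FitnessFun, Generations - 1).

%% Placeholder: a real mutate/1 perturbs weights and topology.
mutate(Genotype) ->
    Genotype.
```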
NEAT Evolution
NeuroEvolution of Augmenting Topologies - how structure evolves.
Key innovations:
- add_node: Split a connection to insert a new neuron
- add_link: Create a new connection between existing nodes
- Innovation numbers: Historical markings for meaningful crossover
- Speciation: Group similar topologies to protect innovation
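The add_node operator is easy to express on a link record, as in the sketch below; the record shape and innovation-number bookkeeping are simplified assumptions rather than the library's data model.

```erlang
-module(neat_sketch).
-export([add_node/2]).

-record(link, {from, to, weight, innovation, enabled = true}).

%% Split an existing link: disable it and route through a new neuron.
%% Per NEAT, the incoming link gets weight 1.0 and the outgoing link keeps
%% the old weight, so behaviour is initially unchanged. NextInnov is the
%% next free innovation number (a global counter in a real implementation).
add_node(#link{from = From, to = To, weight = W} = Link, {NewNodeId, NextInnov}) ->
    Disabled = Link#link{enabled = false},
    InLink   = #link{from = From, to = NewNodeId, weight = 1.0, innovation = NextInnov},
    OutLink  = #link{from = NewNodeId, to = To, weight = W, innovation = NextInnov + 1},
    {Disabled, InLink, OutLink, NextInnov + 2}.
```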
Architecture
Genotype to Phenotype
Transformation from genetic encoding to living neural network.
The Constructor pattern:
- Genotype (left): Records stored in ETS (agent, cortex, sensor, actuator, neuron)
- Phenotype (right): Concurrent Erlang processes communicating via messages
- Each neuron becomes a gen_server with its own state and plasticity
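A stripped-down version of the constructor step is sketched below. The table layout, record fields, and plain-process neuron loop are assumptions; the library's actual constructor builds gen_server neurons and wires up plasticity state.

```erlang
-module(constructor_sketch).
-export([build_phenotype/1]).

-record(neuron, {id, af, bias, inputs = [], outputs = []}).

%% Read every stored neuron record and spawn a process for it,
%% returning an Id -> Pid map that later wiring steps would use.
build_phenotype(GenotypeTab) ->
    Neurons = ets:tab2list(GenotypeTab),
    maps:from_list(
      [{N#neuron.id, spawn(fun() -> neuron_loop(N) end)} || N <- Neurons]).

neuron_loop(State) ->
    receive
        {input, _FromId, _Signal} ->
            %% a real neuron accumulates inputs, applies its activation
            %% function and plasticity rule, then forwards to its outputs
            neuron_loop(State);
        stop ->
            ok
    end.
```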
Supervision Tree
OTP supervision hierarchy for fault tolerance.
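For orientation, a minimal one_for_one supervisor in the style the diagram depicts is sketched below; the child spec and module names are placeholders, not the library's real tree.

```erlang
-module(tweann_sup_sketch).
-behaviour(supervisor).
-export([start_link/0, init/1]).

start_link() ->
    supervisor:start_link({local, ?MODULE}, ?MODULE, []).

%% one_for_one: if a child crashes, only that child is restarted.
init([]) ->
    SupFlags = #{strategy => one_for_one, intensity => 5, period => 10},
    Children = [#{id      => population_server,                  % placeholder child
                  start   => {population_server, start_link, []},
                  restart => permanent}],
    {ok, {SupFlags, Children}}.
```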
Module Dependencies
How the library modules relate to each other.
C4 Architecture Model
Software architecture using the C4 model.
Context Diagram
Container Diagram
Component Diagram
LTC Neurons
LTC Neuron Architecture
Liquid Time-Constant neurons with temporal dynamics.
Components:
- Time constant (tau) controls adaptation speed
- Closed-form continuous-time (CfC) approximation for efficient computation
- Temporal memory through leaky integration
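In its simplest form, the leaky integration reduces to a single Euler step, sketched below; this omits the state-dependent gating of the full LTC/CfC formulation, and the function signature is an assumption.

```erlang
-module(ltc_sketch).
-export([step/4]).

%% One Euler step of leaky integration:
%%   dState/dt = (Input - State) / Tau
%% A larger Tau means slower adaptation (longer temporal memory).
step(State, Input, Tau, Dt) ->
    State + (Dt / Tau) * (Input - State).
```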
LTC vs Standard Neurons
Comparison between LTC and traditional neurons.
Learning Mechanisms
Neural Plasticity
Online weight learning during the network's lifetime.
Plasticity rules:
- Hebbian: "Cells that fire together, wire together"
- Oja's Rule: Self-normalizing Hebbian (performs online PCA)
- Self-Modulation: Neuron controls its own learning parameters
- Neuromodulation: External reward signal gates learning
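The first two rules reduce to one-line weight updates, sketched below with an assumed learning rate Eta and pre-/post-synaptic activities; the self-modulated and neuromodulated variants layer extra parameters on top of these.

```erlang
-module(plasticity_sketch).
-export([hebbian/4, oja/4]).

%% Plain Hebbian: strengthen the weight when pre- and post-synaptic
%% activity coincide (unbounded without an external constraint).
hebbian(W, Pre, Post, Eta) ->
    W + Eta * Pre * Post.

%% Oja's rule: the -Post^2 * W decay term keeps the weight vector
%% normalised, which is what makes it perform online PCA.
oja(W, Pre, Post, Eta) ->
    W + Eta * Post * (Pre - Post * W).
```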
Activation Functions
The transfer functions available for neurons.
Function families:
- Bounded: sigmoid, tanh (classification, gates)
- Unbounded: relu, softplus (deep networks, sparse activation)
- Periodic: sin, cos (rhythmic patterns, CPGs)
- Localized: gaussian (RBF networks, pattern recognition)
- Binary: step, sgn (threshold logic)
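Representative members of each family are one-liners; the sketch below uses illustrative function names rather than the library's actual activation-function atoms.

```erlang
-module(af_sketch).
-export([sigmoid/1, tanh/1, relu/1, gaussian/1, sgn/1]).

sigmoid(X)  -> 1.0 / (1.0 + math:exp(-X)).   % bounded (0, 1)
tanh(X)     -> math:tanh(X).                 % bounded (-1, 1)
relu(X)     -> max(0.0, X).                  % unbounded, sparse
gaussian(X) -> math:exp(-X * X).             % localized bump at 0
sgn(X) when X > 0 -> 1;                      % binary threshold logic
sgn(X) when X < 0 -> -1;
sgn(_)            -> 0.
```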
Mutation Sequence
Step-by-step mutation process.
Evaluation Cycle Sequence
Detailed evaluation workflow.
Distributed Evolution
Distributed Evolution Model
Multi-node evolution architecture.
Federated Populations Model
Island-based distributed populations.
Swarm Evolution Model
Swarm intelligence with evolved controllers.
Mega-Brain Architecture
Large-scale distributed neural architectures.
Application Domains
Military Swarm Coordination
Autonomous swarm systems.
Civil Infrastructure Resilience
Infrastructure protection applications.
Counter-Drone Detection Fusion
Multi-sensor fusion for detection.
Layered Defense Zones
Multi-layer defense architecture.
See Also
- TWEANN Basics - Introduction to neuroevolution concepts
- Architecture Details - In-depth system architecture
- LTC Neurons - Liquid Time-Constant neuron guide
- Quick Start - Get started with examples