signal_aggregator (macula_tweann v0.18.1)
Signal aggregation functions for neural computation.
This module provides functions that aggregate weighted inputs from multiple sources into a single scalar value for activation function processing.
Aggregation Methods
- dot_product - Standard weighted sum (most common)
- mult_product - Multiplicative aggregation
- diff_product - Differentiation-based aggregation (uses process dictionary)
Weight Tuple Format
Weights are provided as tuples: {Weight, DeltaWeight, LearningRate, ParamList}
- Weight: The actual weight value used for computation
- DeltaWeight: Momentum term (ignored here, used by plasticity)
- LearningRate: Learning parameter (ignored here)
- ParamList: Additional parameters for plasticity rules (ignored here)
Only the Weight value is used for aggregation. The other fields support the plasticity system for weight updates during learning.
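As a minimal illustration of this point (a Python stand-in; the real structures are Erlang tuples, and the values here are made up), only the first element of a weight spec participates in aggregation:

```python
# Python stand-in for the Erlang weight spec {Weight, DeltaWeight, LearningRate, ParamList}.
# The concrete values are illustrative, not taken from the module.
weight_spec = (0.5, 0.0, 0.1, [])

weight, _delta_weight, _learning_rate, _param_list = weight_spec

# Aggregation uses only `weight`; the remaining fields are carried along for
# the plasticity system's weight updates during learning.
contribution = 0.8 * weight  # an input signal of 0.8 scaled by its weight
```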
Summary
Functions
Compute differentiation-based product of inputs.
Compute dot product of inputs and weights.
NIF-accelerated dot product when available.
Flatten nested signal/weight structure for NIF consumption.
Compute multiplicative product of inputs and weights.
Types
-type actuator_id() :: {unique_id(), actuator}.
-type cortex_id() :: {unique_id(), cortex}.
-type delta_weight() :: float().
-type element_id() :: neuron_id() | sensor_id() | actuator_id() | cortex_id().
-type input_signal() :: {element_id(), [float()]}.
-type input_signals() :: [input_signal()].
-type learning_rate() :: float().
-type neuron_id() :: {unique_id(), neuron}.
-type parameter_list() :: [float()].
-type sensor_id() :: {unique_id(), sensor}.
-type weight() :: float().
-type weight_list() :: [weight_spec()].
-type weight_spec() :: {weight(), delta_weight(), learning_rate(), parameter_list()}.
-type weighted_input() :: {element_id(), weight_list()}.
-type weighted_inputs() :: [weighted_input()].
Functions
-spec diff_product(input_signals(), weighted_inputs()) -> float().
Compute differentiation-based product of inputs.
Uses the difference between current and previous inputs, then applies dot product aggregation. This implements temporal differentiation for detecting changes in input signals.
Warning: This function uses the process dictionary to store previous input state. On the first call, when no previous input exists, it behaves like a regular dot_product.
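A hedged Python sketch of this behavior (the real module keeps previous inputs in the Erlang process dictionary; here a module-level variable stands in for it, bias handling is omitted for brevity, and all names are illustrative):

```python
# Stand-in for the process-dictionary entry holding the previous inputs.
_previous_signals = None

def _dot(signals, weights):
    # signals: [(source_id, [s1, ...])]; weights: [(source_id, [(w, dw, lr, params), ...])]
    wmap = dict(weights)
    return sum(v * spec[0]                       # spec[0] is the Weight field
               for src, vals in signals
               for v, spec in zip(vals, wmap[src]))

def diff_product(signals, weights):
    global _previous_signals
    prev, _previous_signals = _previous_signals, signals
    if prev is None:
        return _dot(signals, weights)            # first call: plain dot product
    prev_map = dict(prev)
    diffs = [(src, [c - p for c, p in zip(cur, prev_map[src])])
             for src, cur in signals]
    return _dot(diffs, weights)                  # aggregate the input deltas
```

Repeated calls with the same inputs therefore yield zero, since all deltas vanish.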
-spec dot_product(input_signals(), weighted_inputs()) -> float().
Compute dot product of inputs and weights.
For each input source, multiplies each input signal component by its corresponding weight and sums all results. This is the standard weighted sum aggregation used in most neural networks.
The bias term is handled specially - if present as the last weight entry with source ID 'bias', its weight is added directly to the result.
Weight tuple format: {W, DW, LP, LPs}
- W: Weight value (used for computation)
- DW: Delta weight (ignored here, used by plasticity)
- LP: Learning parameter (ignored here)
- LPs: Parameter list (ignored here)
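A hedged Python sketch of the described dot product, including the special bias entry (a weight keyed by the 'bias' source ID with no matching input signal); names and the "bias" key spelling are illustrative stand-ins for the Erlang terms:

```python
def dot_product(signals, weights):
    # signals: [(source_id, [s1, ...])]; weights: [(source_id, [(w, dw, lr, params), ...])]
    wmap = dict(weights)
    bias_specs = wmap.pop("bias", None)          # bias has no input signal
    total = sum(v * spec[0]                      # spec[0] is the Weight field
                for src, vals in signals
                for v, spec in zip(vals, wmap[src]))
    if bias_specs:
        total += bias_specs[0][0]                # bias weight added directly
    return total
```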
-spec dot_product_nif(input_signals(), weighted_inputs()) -> float().
NIF-accelerated dot product when available.
This function attempts to use the Rust NIF for dot product computation. Falls back to pure Erlang if NIF is not loaded.
Performance: 40-100x faster than pure Erlang for N > 10 inputs.
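A hedged sketch of the try-native / fall-back dispatch described above (a Python stand-in; the real module calls into a Rust NIF when one is loaded, and `_nif_dot` here is a placeholder for that native call, not a real function in the library):

```python
def _nif_dot(xs, ws):
    # Placeholder for the native call; raising here models the NIF-not-loaded case.
    raise NotImplementedError("nif_not_loaded")

def _pure_dot(xs, ws):
    return sum(x * w for x, w in zip(xs, ws))    # pure fallback path

def dot_product_nif(xs, ws):
    try:
        return _nif_dot(xs, ws)                  # fast path when the NIF is loaded
    except NotImplementedError:
        return _pure_dot(xs, ws)                 # identical result, slower
```

Both paths must return the same value, so callers can use dot_product_nif unconditionally.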
-spec flatten_for_nif(input_signals(), weighted_inputs()) -> {[float()], [float()], float()}.
Flatten nested signal/weight structure for NIF consumption.
Converts from:
- Signals: [{SourceId, [S1, S2, ...]}, ...]
- Weights: [{SourceId, [{W1, DW1, LP1, []}, {W2, DW2, LP2, []}, ...]}, ...]
To:
- FlatSignals: [S1, S2, S3, ...]
- FlatWeights: [W1, W2, W3, ...]
- Bias: float()
The flattened format is cache-friendly and suitable for SIMD vectorization in the Rust NIF.
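A hedged Python sketch of the described flattening (names illustrative; the assumption that a 'bias' entry, if present, is pulled out into the third element follows the dot_product description above):

```python
def flatten_for_nif(signals, weights):
    # signals: [(source_id, [s1, ...])]; weights: [(source_id, [(w, dw, lr, params), ...])]
    wmap = dict(weights)
    bias_specs = wmap.pop("bias", None)
    flat_signals, flat_weights = [], []
    for src, vals in signals:
        flat_signals.extend(vals)                            # [S1, S2, S3, ...]
        flat_weights.extend(spec[0] for spec in wmap[src])   # [W1, W2, W3, ...]
    bias = bias_specs[0][0] if bias_specs else 0.0
    return flat_signals, flat_weights, bias
```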
-spec mult_product(input_signals(), weighted_inputs()) -> float().
Compute multiplicative product of inputs and weights.
For each input source, multiplies each input signal component by its corresponding weight, then multiplies all these products together. Useful for AND-like logic in neural networks.
Note: Any zero input will result in zero output due to multiplication.
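A hedged Python sketch of the described multiplicative aggregation (names illustrative), which also makes the zero-input caveat concrete:

```python
def mult_product(signals, weights):
    # signals: [(source_id, [s1, ...])]; weights: [(source_id, [(w, dw, lr, params), ...])]
    wmap = dict(weights)
    product = 1.0
    for src, vals in signals:
        for v, spec in zip(vals, wmap[src]):
            product *= v * spec[0]               # multiply all weighted terms
    return product
```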