functions (macula_tweann v0.18.1)
Activation and utility functions for neural computation.
This module provides activation functions used by neurons to transform aggregated input signals into output signals, plus utility functions for saturation and scaling.
Based on DXNN2 by Gene Sher ("Handbook of Neuroevolution through Erlang").
Activation Functions
Monotonic:
- tanh - Hyperbolic tangent, smooth, range [-1, 1]
- sigmoid - Logistic function, range [0, 1]
- sigmoid1 - Alternative sigmoid, range [-1, 1]
- linear - Identity function (no transformation)
Periodic:
- sin - Sine function
- cos - Cosine function
Radial Basis:
- gaussian - Bell curve, peaks at 0
- multiquadric - sqrt(x^2 + c)
Threshold:
- sgn - Sign function {-1, 0, 1}
- bin - Binary threshold {0, 1}
- trinary - Three-level output {-1, 0, 1}
Other:
- absolute - Absolute value
- quadratic - Signed square
- sqrt - Signed square root
- log - Signed logarithm
Extensions (not in original DXNN2):
- cubic - Cube function x^3
- relu - Rectified Linear Unit max(0, x)
Utility Functions
- saturation/1,2 - Clamp values to prevent overflow
- sat/3 - Clamp to [min, max] range
- sat_dzone/5 - Saturation with dead zone
- scale/3,5 - Scale values between ranges
Summary
Functions
Absolute value activation function
Calculate average of a list
Binary threshold function
Cosine activation function
Cubic activation function
Gaussian (radial basis) activation function
Gaussian with custom base
Linear (identity) activation function
Logarithm activation function
Multiquadric activation function
Quadratic activation function
Rectified Linear Unit (ReLU) activation function
Clamp value to range [Min, Max]
Clamp value with dead zone
Clamp value to default range [-1000, 1000]
Clamp value to symmetric range [-Spread, Spread]
Scale list or value from one range to [-1, 1]
Scale value from one range to another
Sign function
Alternative sigmoid activation function
Sigmoid (logistic) activation function
Sine activation function
Square root activation function
Calculate standard deviation of a list
Hyperbolic tangent activation function
Trinary threshold function
Functions
Absolute value activation function
Calculate average of a list
-spec bin(number()) -> 0 | 1.
Binary threshold function
Returns 1 for positive input, 0 otherwise.
Cosine activation function
Periodic function with range [-1, 1]. cos(0) = 1; phase-shifted from sine by π/2.
Cubic activation function
Cube: x^3. Preserves sign; amplifies magnitudes above 1 and compresses those below 1.
Gaussian (radial basis) activation function
Bell curve centered at 0, peaks at 1.0, decays towards 0. Mathematical definition: e^(-x^2)
Input is clamped to [-10, 10] to prevent underflow.
Properties:
- Output range: (0, 1]
- gaussian(0) = 1
- Radially symmetric
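A minimal sketch of the documented behavior, written in Python for illustration (the module itself is Erlang); the [-10, 10] clamp mirrors the note above:

```python
import math

def gaussian(x):
    # Clamp input to [-10, 10] first, as the docs note, to keep
    # e^(-x^2) from underflowing for large |x|.
    x = max(-10.0, min(10.0, x))
    return math.exp(-x * x)
```

The clamp makes all inputs beyond ±10 produce the same (tiny but positive) output, so the function never returns exactly 0.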
Gaussian with custom base
Linear (identity) activation function
No transformation - output equals input. Used for output neurons when raw values are needed.
Logarithm activation function
Signed logarithm: sgn(x) * ln(|x|). Handles zero input specially.
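Sketched in Python for illustration; returning 0.0 for zero input is an assumption here, since the docs only say zero is "handled specially":

```python
import math

def sgn(x):
    # Sign helper: -1, 0, or 1.
    return (x > 0) - (x < 0)

def log_act(x):
    # Signed logarithm: sgn(x) * ln(|x|). Mapping zero input to 0.0
    # is an assumption; the module may choose a different value.
    if x == 0:
        return 0.0
    return sgn(x) * math.log(abs(x))
```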
Multiquadric activation function
Mathematical definition: sqrt(x^2 + 0.01). Always positive, smooth at origin.
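The formula is direct to sketch (Python for illustration); the 0.01 offset guarantees a strictly positive output with minimum 0.1 at the origin:

```python
import math

def multiquadric(x):
    # sqrt(x^2 + 0.01): strictly positive, minimum value 0.1 at x = 0,
    # approaching |x| for large inputs.
    return math.sqrt(x * x + 0.01)
```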
Quadratic activation function
Signed square: sgn(x) * x^2. Preserves sign while amplifying magnitude.
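A one-line sketch of the signed square (Python for illustration):

```python
def quadratic(x):
    # Signed square: sgn(x) * x^2 squares the magnitude while
    # keeping the input's sign.
    sign = (x > 0) - (x < 0)
    return sign * x * x
```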
Rectified Linear Unit (ReLU) activation function
Returns max(0, x). Popular in deep learning for its simplicity and effectiveness.
Clamp value to range [Min, Max]
Clamp value with dead zone
Values within the dead zone [DZMin, DZMax] are set to 0. Values outside are clamped to [Min, Max].
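Sketched in Python for illustration; treating the dead-zone bounds as inclusive is an assumption, since the docs do not specify boundary behavior:

```python
def sat_dzone(val, lo, hi, dz_lo, dz_hi):
    # Values inside the dead zone [dz_lo, dz_hi] collapse to 0;
    # everything else is clamped to [lo, hi]. Inclusive bounds
    # are an assumption here.
    if dz_lo <= val <= dz_hi:
        return 0
    return max(lo, min(hi, val))
```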
Clamp value to default range [-1000, 1000]
Prevents numerical overflow in subsequent calculations.
Clamp value to symmetric range [-Spread, Spread]
Scale list or value from one range to [-1, 1]
Normalizes values using: (Val*2 - (Max + Min)) / (Max - Min)
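The normalization formula above, sketched for a single value (Python for illustration); it maps Min to -1, Max to 1, and the midpoint to 0:

```python
def scale(val, lo, hi):
    # (Val*2 - (Max + Min)) / (Max - Min): lo -> -1, hi -> 1,
    # midpoint -> 0.
    return (val * 2 - (hi + lo)) / (hi - lo)
```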
Scale value from one range to another
Linear interpolation from [FromMin, FromMax] to [ToMin, ToMax].
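The interpolation can be sketched as follows (Python for illustration; the name `scale5` is just a stand-in for the 5-argument `scale`):

```python
def scale5(val, from_lo, from_hi, to_lo, to_hi):
    # Linear interpolation: map [from_lo, from_hi] onto [to_lo, to_hi].
    ratio = (val - from_lo) / (from_hi - from_lo)
    return to_lo + ratio * (to_hi - to_lo)
```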
-spec sgn(number()) -> -1 | 0 | 1.
Sign function
Returns the sign of the input value.
Alternative sigmoid activation function
Maps to range (-1, 1) using: x / (1 + |x|)
Properties:
- Output range: (-1, 1)
- sigmoid1(0) = 0
- Faster to compute than standard sigmoid
- Derivative: 1 / (1 + |x|)^2
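The formula needs no exponentials, which is why it is cheaper than the standard sigmoid (Python sketch for illustration):

```python
def sigmoid1(x):
    # x / (1 + |x|): squashes to (-1, 1) without calling exp().
    return x / (1 + abs(x))
```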
Sigmoid (logistic) activation function
S-shaped curve mapping to range (0, 1). Mathematical definition: 1 / (1 + e^-x)
Input is clamped to [-10, 10] to prevent overflow.
Properties:
- Output range: (0, 1)
- sigmoid(0) = 0.5
- Derivative: y * (1 - y)
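A minimal sketch of the documented behavior (Python for illustration), including the [-10, 10] input clamp noted above:

```python
import math

def sigmoid(x):
    # 1 / (1 + e^-x), with input clamped to [-10, 10] as the docs
    # describe, so exp() never overflows.
    x = max(-10.0, min(10.0, x))
    return 1.0 / (1.0 + math.exp(-x))
```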
Sine activation function
Periodic function with range [-1, 1]. Useful for oscillatory patterns and Fourier-like representations.
Square root activation function
Signed square root: sgn(x) * sqrt(|x|). Compresses magnitude while preserving sign.
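Sketched in Python for illustration (the name `sqrt_act` is a stand-in, since `sqrt` would shadow the math library's function there):

```python
import math

def sqrt_act(x):
    # Signed square root: sgn(x) * sqrt(|x|) compresses magnitude
    # while keeping the sign.
    sign = (x > 0) - (x < 0)
    return sign * math.sqrt(abs(x))
```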
Calculate standard deviation of a list
Hyperbolic tangent activation function
Maps input to range [-1, 1] with smooth gradient. Mathematical definition: tanh(x) = (e^x - e^-x) / (e^x + e^-x)
Properties:
- Output range: [-1, 1]
- tanh(0) = 0
- Smooth derivative (good for learning)
- Most commonly used activation in neuroevolution
-spec trinary(float()) -> -1 | 0 | 1.
Trinary threshold function
Returns -1, 0, or 1 based on threshold of 0.33.
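Sketched in Python for illustration; the 0.33 threshold comes from the docs above, but whether the boundary value itself maps to 0 or to ±1 is an assumption here:

```python
def trinary(x, threshold=0.33):
    # Three-level threshold output: 1 above the threshold, -1 below
    # its negation, 0 in between. Boundary handling (strict vs.
    # inclusive comparison) is an assumption.
    if x > threshold:
        return 1
    if x < -threshold:
        return -1
    return 0
```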