functions (macula_tweann v0.18.1)


Activation and utility functions for neural computation.

This module provides activation functions used by neurons to transform aggregated input signals into output signals, plus utility functions for saturation and scaling.

Based on DXNN2 by Gene Sher ("Handbook of Neuroevolution Through Erlang").

Activation Functions

Monotonic:

- tanh - Hyperbolic tangent, smooth, range [-1, 1]

- sigmoid - Logistic function, range [0, 1]

- sigmoid1 - Alternative sigmoid, range [-1, 1]

- linear - Identity function (no transformation)

Periodic:

- sin - Sine function

- cos - Cosine function

Radial Basis:

- gaussian - Bell curve, peaks at 0

- multiquadric - sqrt(x^2 + c)

Threshold:

- sgn - Sign function {-1, 0, 1}

- bin - Binary threshold {0, 1}

- trinary - Three-level output {-1, 0, 1}

Other:

- absolute - Absolute value

- quadratic - Signed square

- sqrt - Signed square root

- log - Signed logarithm

Extensions (not in original DXNN2):

- cubic - Cube function x^3

- relu - Rectified Linear Unit max(0, x)

Utility Functions

- saturation/1,2 - Clamp values to prevent overflow

- sat/3 - Clamp to [min, max] range

- sat_dzone/5 - Saturation with dead zone

- scale/3,5 - Scale values between ranges

Summary

Functions

Absolute value activation function

Calculate average of a list

Binary threshold function

Cosine activation function

Cubic activation function

Gaussian (radial basis) activation function

Gaussian with custom base

Linear (identity) activation function

Logarithm activation function

Multiquadric activation function

Quadratic activation function

Rectified Linear Unit (ReLU) activation function

Clamp value to range [Min, Max]

Clamp value with dead zone

Clamp value to default range [-1000, 1000]

Clamp value to symmetric range [-Spread, Spread]

Scale list or value from one range to [-1, 1]

Scale value from one range to another

Sign function

Alternative sigmoid activation function

Sigmoid (logistic) activation function

Sine activation function

Square root activation function

Calculate standard deviation of a list

Hyperbolic tangent activation function

Trinary threshold function

Functions

absolute(Val)

-spec absolute(number()) -> number().

Absolute value activation function

avg(List)

-spec avg([number()]) -> float().

Calculate average of a list

bin(Val)

-spec bin(number()) -> 0 | 1.

Binary threshold function

Returns 1 for positive input, 0 otherwise.

cos(Val)

-spec cos(float()) -> float().

Cosine activation function

Periodic function with range [-1, 1]. cos(0) = 1, phase-shifted from sine.

cubic(Val)

-spec cubic(float()) -> float().

Cubic activation function

Cube: x^3. Preserves sign while strongly amplifying magnitude.

gaussian(Val)

-spec gaussian(float()) -> float().

Gaussian (radial basis) activation function

Bell curve centered at 0, peaks at 1.0, decays towards 0. Mathematical definition: e^(-x^2)

Input is clamped to [-10, 10] to prevent underflow.

Properties:

- Output range: (0, 1]

- gaussian(0) = 1

- Radially symmetric
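The clamp-then-exponentiate behaviour described above can be sketched in Python for illustration (a reference sketch of the documented formula, not the Erlang source):

```python
import math

def gaussian(val: float) -> float:
    # Clamp input to [-10, 10] so exp(-x^2) cannot underflow
    v = max(-10.0, min(10.0, val))
    return math.exp(-v * v)
```

Note that because of the clamp, all inputs beyond magnitude 10 produce the same tiny output, e^(-100).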

gaussian(Base, Val)

-spec gaussian(float(), float()) -> float().

Gaussian with custom base

linear(Val)

-spec linear(float()) -> float().

Linear (identity) activation function

No transformation - output equals input. Used for output neurons when raw values are needed.

log(Val)

-spec log(number()) -> float().

Logarithm activation function

Signed logarithm: sgn(x) * ln(|x|). Zero input is handled specially.
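The signed-logarithm formula can be sketched in Python for illustration. Returning 0.0 for zero input is an assumption here; the Erlang source may use a different sentinel for its special-case handling:

```python
import math

def signed_log(val: float) -> float:
    # Zero must be special-cased since ln(0) is undefined.
    # Returning 0.0 is an assumption, not confirmed from the source.
    if val == 0:
        return 0.0
    sign = 1.0 if val > 0 else -1.0
    return sign * math.log(abs(val))
```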

multiquadric(Val)

-spec multiquadric(float()) -> float().

Multiquadric activation function

Mathematical definition: sqrt(x^2 + 0.01). Always positive and smooth at the origin.

quadratic(Val)

-spec quadratic(float()) -> float().

Quadratic activation function

Signed square: sgn(x) * x^2. Preserves sign while amplifying magnitude.
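The signed square is a one-liner; a Python sketch of the documented formula:

```python
def quadratic(val: float) -> float:
    # sgn(x) * x^2: squares the magnitude but keeps the sign,
    # unlike plain x^2 which discards it
    return val * val if val >= 0 else -(val * val)
```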

relu(Val)

-spec relu(float()) -> float().

Rectified Linear Unit (ReLU) activation function

Returns max(0, x). Popular in deep learning for its simplicity and effectiveness.

sat(Val, Max, Min)

-spec sat(number(), number(), number()) -> number().

Clamp value to range [Min, Max]

sat_dzone(Val, Max, Min, DZMax, DZMin)

-spec sat_dzone(number(), number(), number(), number(), number()) -> number().

Clamp value with dead zone

Values within the dead zone [DZMin, DZMax] are set to 0. Values outside are clamped to [Min, Max].
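The dead-zone clamp can be sketched in Python for illustration (argument order follows the documented `sat_dzone(Val, Max, Min, DZMax, DZMin)` signature; treating the dead-zone bounds as inclusive is an assumption based on the bracket notation above):

```python
def sat_dzone(val, mx, mn, dz_max, dz_min):
    # Inside the dead zone [dz_min, dz_max] -> 0
    if dz_min <= val <= dz_max:
        return 0
    # Outside the dead zone -> clamp to [mn, mx]
    return max(mn, min(mx, val))
```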

saturation(Val)

-spec saturation(number()) -> number().

Clamp value to default range [-1000, 1000]

Prevents numerical overflow in subsequent calculations.

saturation(Val, Spread)

-spec saturation(number(), number()) -> number().

Clamp value to symmetric range [-Spread, Spread]

scale(T, Max, Min)

-spec scale([number()] | number(), number(), number()) -> [float()] | float().

Scale list or value from one range to [-1, 1]

Normalizes values using: (Val*2 - (Max + Min)) / (Max - Min)
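The normalization formula above can be sketched in Python for illustration, including the list-or-scalar dispatch implied by the spec:

```python
def scale_one(val, mx, mn):
    # (Val*2 - (Max + Min)) / (Max - Min): maps [Min, Max] onto [-1, 1]
    return (val * 2 - (mx + mn)) / (mx - mn)

def scale(t, mx, mn):
    # Accept either a list or a single number, as the spec allows
    if isinstance(t, list):
        return [scale_one(v, mx, mn) for v in t]
    return scale_one(t, mx, mn)
```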

scale(Val, FromMin, FromMax, ToMin, ToMax)

-spec scale(number(), number(), number(), number(), number()) -> float().

Scale value from one range to another

Linear interpolation from [FromMin, FromMax] to [ToMin, ToMax].
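The general range-to-range interpolation can be sketched in Python for illustration:

```python
def scale_range(val, from_min, from_max, to_min, to_max):
    # Position of val within the source range, as a fraction in [0, 1]
    ratio = (val - from_min) / (from_max - from_min)
    # Project that fraction onto the target range
    return to_min + ratio * (to_max - to_min)
```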

sgn(Val)

-spec sgn(number()) -> -1 | 0 | 1.

Sign function

Returns the sign of the input value.

sigmoid1(Val)

-spec sigmoid1(float()) -> float().

Alternative sigmoid activation function

Maps to range [-1, 1] using: x / (1 + |x|)

Properties:

- Output range: (-1, 1)

- sigmoid1(0) = 0

- Faster to compute than standard sigmoid

- Derivative: 1 / (1 + |x|)^2
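The x / (1 + |x|) formula needs no exponential, which is why it is cheaper than the logistic sigmoid; a Python sketch for illustration:

```python
def sigmoid1(val: float) -> float:
    # Fast sigmoid: x / (1 + |x|), output strictly inside (-1, 1)
    return val / (1 + abs(val))
```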

sigmoid(Val)

-spec sigmoid(float()) -> float().

Sigmoid (logistic) activation function

S-shaped curve mapping to range [0, 1]. Mathematical definition: 1 / (1 + e^-x)

Input is clamped to [-10, 10] to prevent overflow.

Properties:

- Output range: (0, 1)

- sigmoid(0) = 0.5

- Derivative: y * (1 - y)
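The clamped logistic function can be sketched in Python for illustration (a reference sketch of the documented formula, not the Erlang source):

```python
import math

def sigmoid(val: float) -> float:
    # Clamp to [-10, 10] before exponentiation to avoid overflow
    v = max(-10.0, min(10.0, val))
    return 1.0 / (1.0 + math.exp(-v))
```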

sin(Val)

-spec sin(float()) -> float().

Sine activation function

Periodic function with range [-1, 1]. Useful for oscillatory patterns and Fourier-like representations.

sqrt(Val)

-spec sqrt(float()) -> float().

Square root activation function

Signed square root: sgn(x) * sqrt(|x|). Compresses magnitude while preserving sign.
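Like the signed square, this extends sqrt to negative inputs by factoring out the sign; a Python sketch of the documented formula:

```python
import math

def signed_sqrt(val: float) -> float:
    # sgn(x) * sqrt(|x|): sqrt of the magnitude, original sign restored
    sign = 1.0 if val >= 0 else -1.0
    return sign * math.sqrt(abs(val))
```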

std(List)

-spec std([number()]) -> float().

Calculate standard deviation of a list
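For illustration, the two statistics helpers can be sketched in Python. Using the population (divide-by-N) standard deviation is an assumption; the Erlang source may use the sample (divide-by-N-1) form instead:

```python
import math

def avg(xs):
    # Arithmetic mean of a non-empty list
    return sum(xs) / len(xs)

def std(xs):
    # Population standard deviation (divide by N) -- an assumption,
    # check the Erlang source for the exact variant
    m = avg(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
```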

tanh(Val)

-spec tanh(float()) -> float().

Hyperbolic tangent activation function

Maps input to range [-1, 1] with smooth gradient. Mathematical definition: tanh(x) = (e^x - e^-x) / (e^x + e^-x)

Properties:

- Output range: [-1, 1]

- tanh(0) = 0

- Smooth derivative (good for learning)

- Most commonly used activation in neuroevolution

trinary(Val)

-spec trinary(float()) -> -1 | 0 | 1.

Trinary threshold function

Returns -1, 0, or 1 based on threshold of 0.33.
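The three-level threshold can be sketched in Python for illustration (whether the comparison at exactly 0.33 is strict is an assumption; check the Erlang source for boundary behaviour):

```python
def trinary(val: float) -> int:
    # Three-level output with threshold at +/-0.33
    if val > 0.33:
        return 1
    if val < -0.33:
        return -1
    return 0
```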