network_evaluator (macula_tweann v0.18.1)

Synchronous neural network evaluator for inference.

This module provides synchronous (blocking) forward propagation for neural networks. Unlike the process-based cortex/neuron approach used during training, it is designed for fast inference in real-time applications such as games.

Usage

Create a network from a genotype:

    {ok, Network} = network_evaluator:from_genotype(AgentId)

Or create a simple feedforward network:

    Network = network_evaluator:create_feedforward(42, [16, 8], 6)

Evaluate:

    Outputs = network_evaluator:evaluate(Network, Inputs)

Summary

Functions

compile_for_nif(Network)
Compile network for NIF acceleration.

create_feedforward(InputSize, HiddenSizes, OutputSize)
Create a feedforward network with random weights.

create_feedforward(InputSize, HiddenSizes, OutputSize, Activation)
Create a feedforward network with specified activation.

evaluate(Network, Inputs)
Evaluate the network with given inputs.

evaluate_with_activations(Network, Inputs)
Evaluate network and return all layer activations.

from_binary(Binary)
Deserialize a network from binary.

from_genotype(AgentId)
Load a network from a genotype stored in Mnesia.

from_json(Map)
Deserialize a network from a JSON-compatible map.

get_topology(Network)
Get network topology information for visualization.

get_viz_data(Network, Inputs, InputLabels)
Get visualization data for rendering the network.

get_weights(Network)
Get all weights from the network as a flat list.

set_weights(Network, FlatWeights)
Set weights from a flat list.

strip_compiled_ref(Network)
Strip the compiled_ref from a network to release NIF memory.

to_binary(Network)
Serialize a network to binary using Erlang term format.

to_json(Network)
Serialize a network to a JSON-compatible map.

Types

layer/0

-type layer() :: {Weights :: [[float()]], Biases :: [float()]}.

network/0

-type network() ::
          #network{layers :: [layer()], activation :: atom(), compiled_ref :: reference() | undefined}.

Functions

compile_for_nif(Network)

-spec compile_for_nif(network()) -> network().

Compile network for NIF acceleration.

If the NIF is loaded, compiles the network to a flat representation that can be evaluated much faster. Falls back to Erlang evaluation if NIF is not available.

WARNING: Use sparingly! Each compiled network holds a Rust ResourceArc reference that keeps native memory alive. During neuroevolution, do NOT compile networks automatically (especially in create_feedforward or set_weights) as this causes massive memory leaks - one compiled_ref per offspring per generation accumulates unboundedly.

Only call this when you need maximum performance for a specific network that will be evaluated many times (e.g., the final champion network).
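Taken together with strip_compiled_ref/1, the intended lifecycle for such a champion might look like the following sketch (ChampionId and Observations are placeholder variables):

```erlang
%% Load the champion genotype and compile it once for NIF-accelerated inference.
{ok, Network0} = network_evaluator:from_genotype(ChampionId),
Network = network_evaluator:compile_for_nif(Network0),

%% Hot loop: the one compiled network is evaluated many times.
Results = [network_evaluator:evaluate(Network, Obs) || Obs <- Observations],

%% Drop the Rust ResourceArc before archiving the network long-term.
Archived = network_evaluator:strip_compiled_ref(Network)
```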

create_feedforward(InputSize, HiddenSizes, OutputSize)

-spec create_feedforward(pos_integer(), [pos_integer()], pos_integer()) -> network().

Create a feedforward network with random weights.

create_feedforward(InputSize, HiddenSizes, OutputSize, Activation)

-spec create_feedforward(pos_integer(), [pos_integer()], pos_integer(), atom()) -> network().

Create a feedforward network with specified activation.

evaluate(Network, Inputs)

-spec evaluate(network(), [float()]) -> [float()].

Evaluate the network with given inputs.

Performs synchronous forward propagation through all layers. Uses NIF acceleration if available and network was compiled.
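For example, a freshly created feedforward network can be evaluated directly (a sketch; it assumes the input list has one float per input neuron and the output list one float per output neuron):

```erlang
%% 42 inputs, hidden layers of 16 and 8 neurons, 6 outputs,
%% matching the Usage section above.
Network = network_evaluator:create_feedforward(42, [16, 8], 6),
Inputs = lists:duplicate(42, 0.0),
Outputs = network_evaluator:evaluate(Network, Inputs),
6 = length(Outputs)
```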

evaluate_with_activations(Network, Inputs)

-spec evaluate_with_activations(network(), [float()]) ->
                                   {Outputs :: [float()], Activations :: [[float()]]}.

Evaluate network and return all layer activations.

Returns {Outputs, AllActivations} where AllActivations is a list of activation vectors for each layer (including input and output).
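As a sketch, a network with two hidden layers should therefore yield four activation vectors, assuming the input vector is the first entry and the output vector the last:

```erlang
Network = network_evaluator:create_feedforward(42, [16, 8], 6),
Inputs = lists:duplicate(42, 0.5),
{Outputs, Activations} = network_evaluator:evaluate_with_activations(Network, Inputs),
4 = length(Activations),        % input, two hidden layers, output
Outputs = lists:last(Activations)
```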

from_binary(Binary)

-spec from_binary(binary()) -> {ok, network()} | {error, term()}.

Deserialize a network from binary.

from_genotype(AgentId)

-spec from_genotype(term()) -> {ok, network()} | {error, term()}.

Load a network from a genotype stored in Mnesia.

Reads the agent's neural network structure and weights from Mnesia and creates an evaluator network.

from_json(Map)

-spec from_json(map()) -> {ok, network()} | {error, term()}.

Deserialize a network from a JSON-compatible map.

Accepts the format produced by to_json/1.
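A round trip through the JSON-compatible representation is a cheap way to confirm the two functions agree (a sketch; Network is any evaluator network):

```erlang
Json = network_evaluator:to_json(Network),
{ok, Restored} = network_evaluator:from_json(Json)
```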

get_topology(Network)

-spec get_topology(network()) -> map().

Get network topology information for visualization.

Returns a map with layer sizes for rendering the network structure.

get_viz_data(Network, Inputs, InputLabels)

-spec get_viz_data(network(), [float()], [binary()]) -> map().

Get visualization data for rendering the network.

Combines topology, weights, and activations into a format suitable for frontend visualization.

get_weights(Network)

-spec get_weights(network()) -> [float()].

Get all weights from the network as a flat list.

Useful for evolution - can be mutated and set back.

set_weights(Network, FlatWeights)

-spec set_weights(network(), [float()]) -> network().

Set weights from a flat list.

The list must have the same number of elements as returned by get_weights/1. NOTE: Does NOT compile for NIF - this prevents memory leaks during evolution.
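Together, get_weights/1 and set_weights/2 support simple weight-perturbation mutation. The sketch below Gaussian-perturbs every weight; mutate/2 and its Sigma mutation-strength parameter are hypothetical names, and rand:normal/0 is OTP's standard-normal generator:

```erlang
%% Perturb every weight by N(0, Sigma^2) noise and write the result back.
mutate(Network, Sigma) ->
    Weights = network_evaluator:get_weights(Network),
    Mutated = [W + Sigma * rand:normal() || W <- Weights],
    network_evaluator:set_weights(Network, Mutated).
```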

strip_compiled_ref(Network)

-spec strip_compiled_ref(Network :: network() | map() | term()) -> network() | map() | term().

Strip the compiled_ref from a network to release NIF memory.

IMPORTANT: Call this before storing networks long-term (archives, events) to prevent NIF ResourceArc references from accumulating and causing memory leaks. The compiled_ref is a Rust ResourceArc that holds native memory - keeping references alive prevents the memory from being freed.

The network can be recompiled on-demand when needed for evaluation.

to_binary(Network)

-spec to_binary(network()) -> binary().

Serialize a network to binary using Erlang term format.

This is more compact than JSON and preserves exact floating point values. Use this for Erlang-to-Erlang transfer or storage.
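A sketch of persisting a network to disk and restoring it (the "champion.net" filename is illustrative):

```erlang
Bin = network_evaluator:to_binary(Network),
ok = file:write_file("champion.net", Bin),
{ok, Saved} = file:read_file("champion.net"),
{ok, Restored} = network_evaluator:from_binary(Saved)
```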

to_json(Network)

-spec to_json(network()) -> map().

Serialize a network to a JSON-compatible map.

The output format is suitable for JSON encoding and can be loaded in other runtimes (Python, JavaScript, etc.) for inference.

Format: A map with keys "version", "activation", and "layers". The layers list contains maps with "weights" and "biases" keys.
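Based on the described format, matching on the serialized map might look like this sketch (whether the keys are binaries or strings depends on the JSON encoder in use; binaries are an assumption here):

```erlang
#{<<"version">> := _Version,
  <<"activation">> := _Activation,      % e.g. <<"tanh">>, per the network's activation atom
  <<"layers">> := Layers} = network_evaluator:to_json(Network),
%% Each layer entry carries its weight matrix and bias vector.
[#{<<"weights">> := _, <<"biases">> := _} | _] = Layers
```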