gleam_synapses/fun

Types

The activation function a neuron can have. It can be passed as an argument when a neural network is created.

pub type Fun =
  activation.Activation
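
As a minimal sketch of how a Fun value is used, the example below supplies an activation to a customised network constructor. The module gleam_synapses/net, the name net.new_custom, and its argument order are assumptions for illustration; only the fact that an activation can be supplied at network creation comes from this page.

  import gleam_synapses/fun
  import gleam_synapses/net

  // Hypothetical constructor: `net.new_custom` and its signature are assumed,
  // not confirmed by this page. It is sketched as taking the layer sizes,
  // a per-layer activation picker, and a per-layer weight initialiser.
  pub fn customised_net() {
    net.new_custom(
      [4, 6, 2],
      fn(_layer) { fun.sigmoid() },
      fn(_layer) { 0.1 },
    )
  }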

Functions

pub fn identity() -> Activation

Identity is a linear function where the output is equal to the input.

pub fn leaky_re_lu() -> Activation

LeakyReLU returns x when x is positive and a small proportion of x when x is negative.
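
As a formula, using the common convention; the slope 0.01 is the usual default and is an assumption here, since this page does not state the value the library uses:

  f(x) = x,        if x >= 0
  f(x) = 0.01 * x, if x < 0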

pub fn sigmoid() -> Activation

Sigmoid takes any value as input and outputs values in the range of 0.0 to 1.0.
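
This matches the standard logistic sigmoid:

  f(x) = 1 / (1 + e^{-x})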

pub fn tanh() -> Activation

Tanh is similar to sigmoid, but outputs values in the range of -1.0 to 1.0.
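
This matches the standard hyperbolic tangent:

  tanh(x) = (e^{x} - e^{-x}) / (e^{x} + e^{-x})

Its outputs are centred around 0.0, whereas sigmoid's are centred around 0.5.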