Axon.Initializers (Axon v0.3.1)
Parameter initializers.
Parameter initializers are used to initialize the weights and biases of a neural network. Because most deep learning optimization algorithms are iterative, they require an initial point to iterate from.
Sometimes the initialization of a model can determine whether or not a model converges. In some cases, the initial point is unstable, and therefore the model has no chance of converging using common first-order optimization methods. In cases where the model will converge, initialization can have a significant impact on how quickly the model converges.
Most initialization strategies are built from intuition and heuristics rather than theory. It's commonly accepted that the parameters of different layers should be different - motivating the use of random initialization for each layer's parameters. Usually, only the weights of a layer are initialized using a random distribution - while the biases are initialized to a uniform constant (like 0).
Most initializers use Gaussian (normal) or uniform distributions with variations on scale. The output scale of an initializer should generally be large enough to avoid information loss but small enough to avoid exploding values. The initializers in this module have a default scale known to work well with the initialization strategy.
The functions in this module return initialization functions which take shapes and types and return tensors:
init_fn = Axon.Initializers.zeros()
init_fn.({1, 2}, {:f, 32})
You may use these functions from within `defn` or outside.
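For instance, an initializer can be passed to a layer when building a model. A minimal sketch, assuming the standard `:kernel_initializer` layer option (the input name and shapes here are illustrative):

```elixir
# Sketch: using an initializer when constructing a model
model =
  Axon.input("data", shape: {nil, 784})
  |> Axon.dense(128, kernel_initializer: Axon.Initializers.he_normal())
  |> Axon.relu()
```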
Summary
Functions
Initializes parameters to value.
Initializes parameters with the Glorot normal initializer.
Initializes parameters with the Glorot uniform initializer.
Initializes parameters with the He normal initializer.
Initializes parameters with the He uniform initializer.
Initializes parameters to an identity matrix.
Initializes parameters with the Lecun normal initializer.
Initializes parameters with the Lecun uniform initializer.
Initializes parameters with a random normal distribution.
Initializes parameters to 1.
Initializes a tensor with an orthogonal distribution.
Initializes parameters with a random uniform distribution.
Initializes parameters with variance scaling according to the given distribution and mode.
Initializes parameters to 0.
Functions
Initializes parameters to value.
Examples
iex> init_fn = Axon.Initializers.full(1.00)
iex> out = init_fn.({2, 2}, {:f, 32})
iex> out
#Nx.Tensor<
f32[2][2]
[
[1.0, 1.0],
[1.0, 1.0]
]
>
Initializes parameters with the Glorot normal initializer.
The Glorot normal initializer is equivalent to calling `Axon.Initializers.variance_scaling` with `mode: :fan_avg` and `distribution: :truncated_normal`.
The Glorot normal initializer is also called the Xavier normal initializer.
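Given the equivalence above, the following two initializers should produce the same values for the same key. A sketch; note that `variance_scaling` defaults its `:scale` to `1.0e-2`, so it is set explicitly here to match `glorot_normal`'s default of `1.0`:

```elixir
key = Nx.Random.key(1)

init_a = Axon.Initializers.glorot_normal()

init_b =
  Axon.Initializers.variance_scaling(
    scale: 1.0,
    mode: :fan_avg,
    distribution: :truncated_normal
  )

# Both calls should yield the same tensor for the same key
t_a = init_a.({2, 2}, {:f, 32}, key)
t_b = init_b.({2, 2}, {:f, 32}, key)
```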
Options

  * `:scale` - the scale of the output distribution. Defaults to `1.0`
Examples
iex> init_fn = Axon.Initializers.glorot_normal()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.glorot_normal(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters with the Glorot uniform initializer.
The Glorot uniform initializer is equivalent to calling `Axon.Initializers.variance_scaling` with `mode: :fan_avg` and `distribution: :uniform`.
The Glorot uniform initializer is also called the Xavier uniform initializer.
Options

  * `:scale` - the scale of the output distribution. Defaults to `1.0`
Examples
iex> init_fn = Axon.Initializers.glorot_uniform()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.glorot_uniform(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters with the He normal initializer.
The He normal initializer is equivalent to calling `Axon.Initializers.variance_scaling` with `mode: :fan_in` and `distribution: :truncated_normal`.
Options

  * `:scale` - the scale of the output distribution. Defaults to `2.0`
Examples
iex> init_fn = Axon.Initializers.he_normal()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.he_normal(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters with the He uniform initializer.
The He uniform initializer is equivalent to calling `Axon.Initializers.variance_scaling` with `mode: :fan_in` and `distribution: :uniform`.
Options

  * `:scale` - the scale of the output distribution. Defaults to `2.0`
Examples
iex> init_fn = Axon.Initializers.he_uniform()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.he_uniform(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters to an identity matrix.
Examples
iex> init_fn = Axon.Initializers.identity()
iex> out = init_fn.({2, 2}, {:f, 32})
iex> out
#Nx.Tensor<
f32[2][2]
[
[1.0, 0.0],
[0.0, 1.0]
]
>
Initializes parameters with the Lecun normal initializer.
The Lecun normal initializer is equivalent to calling `Axon.Initializers.variance_scaling` with `mode: :fan_in` and `distribution: :truncated_normal`.
Options

  * `:scale` - the scale of the output distribution. Defaults to `1.0`
Examples
iex> init_fn = Axon.Initializers.lecun_normal()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.lecun_normal(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters with the Lecun uniform initializer.
The Lecun uniform initializer is equivalent to calling `Axon.Initializers.variance_scaling` with `mode: :fan_in` and `distribution: :uniform`.
Options

  * `:scale` - the scale of the output distribution. Defaults to `1.0`
Examples
iex> init_fn = Axon.Initializers.lecun_uniform()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.lecun_uniform(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters with a random normal distribution.
Options

  * `:mean` - the mean of the output distribution. Defaults to `0.0`

  * `:scale` - the scale of the output distribution. Defaults to `1.0e-2`
Examples
iex> init_fn = Axon.Initializers.normal()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.normal(mean: 1.0, scale: 1.0)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
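As a sanity check, the empirical statistics of a large initialization should approach the requested `:mean` and `:scale`. A sketch (assuming `Nx.mean/1` and `Nx.standard_deviation/1` are available in your Nx version):

```elixir
init_fn = Axon.Initializers.normal(mean: 1.0, scale: 0.1)
t = init_fn.({1000, 1000}, {:f, 32}, Nx.Random.key(1))

# Nx.mean(t) should be close to 1.0 and
# Nx.standard_deviation(t) close to 0.1
```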
Initializes parameters to 1.
Examples
iex> init_fn = Axon.Initializers.ones()
iex> out = init_fn.({2, 2}, {:f, 32})
iex> out
#Nx.Tensor<
f32[2][2]
[
[1.0, 1.0],
[1.0, 1.0]
]
>
Initializes a tensor with an orthogonal distribution.
For 2-D tensors, the initialization is generated through the QR decomposition of a matrix drawn from a random distribution. For tensors with more than 2 dimensions, a 2-D tensor with shape `{shape[0] * shape[1] * ... * shape[n-2], shape[n-1]}` is initialized and then reshaped accordingly.
Options

  * `:distribution` - the output distribution. One of `:normal` or `:uniform`. Defaults to `:normal`
Examples
iex> init_fn = Axon.Initializers.orthogonal()
iex> t = init_fn.({3, 3}, {:f, 32}, Nx.Random.key(1))
iex> Nx.type(t)
{:f, 32}
iex> Nx.shape(t)
{3, 3}
iex> init_fn = Axon.Initializers.orthogonal()
iex> t = init_fn.({1, 2, 3, 4}, {:f, 64}, Nx.Random.key(1))
iex> Nx.type(t)
{:f, 64}
iex> Nx.shape(t)
{1, 2, 3, 4}
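For the 2-D case, you can sketch a check that the result is numerically orthogonal, i.e. that `transpose(t) . t` is close to the identity:

```elixir
init_fn = Axon.Initializers.orthogonal()
t = init_fn.({3, 3}, {:f, 32}, Nx.Random.key(1))

product = Nx.dot(Nx.transpose(t), t)
# Should hold up to floating-point tolerance
Nx.all_close(product, Nx.eye(3), atol: 1.0e-5)
```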
Initializes parameters with a random uniform distribution.
Options

  * `:scale` - the scale of the output distribution. Defaults to `1.0e-2`
Examples
iex> init_fn = Axon.Initializers.uniform()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.uniform(scale: 1.0e-3)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
Initializes parameters with variance scaling according to the given distribution and mode.
Variance scaling adapts the scale of the output distribution to the shape of the output tensor, via the tensor's fan-in and/or fan-out.
Options

  * `:scale` - the scale of the output distribution. Defaults to `1.0e-2`

  * `:mode` - the mode used to compute the fan. One of `:fan_in`, `:fan_out`, or `:fan_avg`. Defaults to `:fan_in`

  * `:distribution` - the output distribution. One of `:normal`, `:truncated_normal`, or `:uniform`. Defaults to `:normal`
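The usual variance-scaling rule can be sketched as follows. This is an assumed formulation based on the options above (the actual implementation may differ in exact constants, e.g. the truncated-normal correction factor):

```elixir
# For a dense kernel of shape {fan_in, fan_out}; numbers are illustrative
{fan_in, fan_out} = {128, 64}
scale = 1.0

# Choose the fan according to :mode (here :fan_avg)
fan =
  case :fan_avg do
    :fan_in -> fan_in
    :fan_out -> fan_out
    :fan_avg -> (fan_in + fan_out) / 2
  end

variance = scale / max(fan, 1)

# :normal draws from N(0, stddev^2);
# :uniform draws from [-limit, limit]
stddev = :math.sqrt(variance)
limit = :math.sqrt(3 * variance)
```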
Examples
iex> init_fn = Axon.Initializers.variance_scaling()
iex> t = init_fn.({2, 2}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:f, 32}
iex> init_fn = Axon.Initializers.variance_scaling(mode: :fan_out, distribution: :truncated_normal)
iex> t = init_fn.({2, 2}, {:bf, 16}, Nx.Random.key(1))
iex> Nx.shape(t)
{2, 2}
iex> Nx.type(t)
{:bf, 16}
iex> init_fn = Axon.Initializers.variance_scaling(mode: :fan_out, distribution: :normal)
iex> t = init_fn.({64, 3, 32, 32}, {:f, 32}, Nx.Random.key(1))
iex> Nx.shape(t)
{64, 3, 32, 32}
iex> Nx.type(t)
{:f, 32}
Initializes parameters to 0.
Examples
iex> init_fn = Axon.Initializers.zeros()
iex> out = init_fn.({2, 2}, {:f, 32})
iex> out
#Nx.Tensor<
f32[2][2]
[
[0.0, 0.0],
[0.0, 0.0]
]
>