Annex v0.2.1
Annex
Annex is a library for composing and running deep artificial neural networks.
Summary
Functions
Given an Activation's name, returns the appropriate Activation layer.
Given a number of rows and columns, returns a Dense layer.
Given a number of rows, columns, some weights,
and some biases, returns a built Dense layer.
Given a frequency (between 0.0 and 1.0), returns a LayerConfig for a Dropout.
The Dropout layer randomly, at a given frequency, returns 0.0 for an input
regardless of that input's value.
Given an initialized Learner and some data, returns a prediction.
Given a list of layers, returns a LayerConfig for a Sequence.
Trains an Annex.Learner given a learner, a dataset, and options.
Functions
activation(name)
activation(Annex.Layer.Activation.func_name()) :: Annex.LayerConfig.t(Annex.Layer.Activation)
Given an Activation's name, returns the appropriate Activation layer.
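A minimal usage sketch; the :sigmoid atom is an assumed example of a func_name and is not confirmed on this page:

    # Builds a LayerConfig for an Activation layer from an activation name.
    # :sigmoid is an assumed func_name used only for illustration.
    config = Annex.activation(:sigmoid)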
dense(rows, columns)
dense(pos_integer(), pos_integer()) :: Annex.LayerConfig.t(Annex.Layer.Dense)
Given a number of rows and columns, returns a Dense layer.
Without the weights and biases of dense/4, this Dense layer will
have no neurons. Upon Layer.init_layer/2 the Dense layer will be
initialized with random neurons; that is, neurons with random weights and biases.
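As a sketch, the dimensions below (3 rows, 2 columns) are arbitrary example values:

    # A 3x2 Dense LayerConfig with no neurons yet; weights and biases are
    # randomized when Layer.init_layer/2 runs.
    config = Annex.dense(3, 2)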
dense(rows, columns, weights, biases)
dense(pos_integer(), pos_integer(), Annex.Data.data(), Annex.Data.data()) :: Annex.LayerConfig.t(Annex.Layer.Dense)
Given a number of rows, columns, some weights,
and some biases, returns a built Dense layer.
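A sketch under assumptions: the flat lists below are assumed to be an acceptable Annex.Data.data() representation, and rows are assumed to correspond to neurons (so a 2x3 layer takes 6 weights and 2 biases):

    # A 2x3 Dense LayerConfig built from explicit weights and biases.
    # The flat-list shapes here are an assumption, not a confirmed format.
    weights = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
    biases = [0.0, 0.0]
    config = Annex.dense(2, 3, weights, biases)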
dropout(frequency)
dropout(float()) :: Annex.LayerConfig.t(Annex.Layer.Dropout)
Given a frequency (between 0.0 and 1.0), returns a LayerConfig for a Dropout.
The Dropout layer randomly, at a given frequency, returns 0.0 for an input
regardless of that input's value.
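A minimal sketch; the 0.5 frequency is an arbitrary example value:

    # Returns 0.0 for an input with frequency 0.5, regardless of the input's value.
    config = Annex.dropout(0.5)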
predict(learner, data)
predict(Annex.Learner.t(), Annex.Learner.data()) :: Annex.Learner.data()
Given an initialized Learner and some data, returns a prediction.
The learner should be initialized with Learner.init_learner before being
used with the predict/2 function.
Also, it's a good idea to train the learner (using train/3 or train/4)
before using it to make predictions. Chances are slim that an untrained
Learner is capable of making accurate predictions.
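A sketch only: trained_learner is a hypothetical placeholder for a learner that has already been initialized and trained (see the fuller sketch under train/3 below), and the two-element input is an arbitrary example:

    # trained_learner stands in for the result of initializing and training a learner.
    prediction = Annex.predict(trained_learner, [0.0, 1.0])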
sequence(layers)
sequence([Annex.LayerConfig.t(module())]) :: Annex.LayerConfig.t(Annex.Layer.Sequence)
Given a list of layers, returns a LayerConfig for a Sequence.
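A sketch of composing layer configs into a Sequence; the layer sizes and the :tanh / :sigmoid activation names are assumed example values:

    # Composes several LayerConfigs into a single Sequence config.
    sequence =
      Annex.sequence([
        Annex.dense(8, 2),
        Annex.activation(:tanh),
        Annex.dense(1, 8),
        Annex.activation(:sigmoid)
      ])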
train(learner, dataset, options \\ [])
Trains an Annex.Learner given a learner, a dataset, and options.
The learner should be initialized with Learner.init_learner/2 before being
trained.
Returns the trained learner along with some measure of loss or performance.
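A hedged end-to-end sketch: the dataset format, layer sizes, and activation names below are assumptions rather than confirmed details of the Annex.Learner API, and no options are passed so the defaults apply:

    # Dataset shape is an assumption: {inputs, labels} pairs for an XOR-style task.
    dataset = [
      {[0.0, 0.0], [0.0]},
      {[0.0, 1.0], [1.0]},
      {[1.0, 0.0], [1.0]},
      {[1.0, 1.0], [0.0]}
    ]

    learner =
      Annex.sequence([
        Annex.dense(8, 2),
        Annex.activation(:tanh),
        Annex.dense(1, 8),
        Annex.activation(:sigmoid)
      ])

    # Per the description above, the result includes the trained learner and a
    # measure of loss or performance; the exact return shape is not shown on this page.
    result = Annex.train(learner, dataset)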