Annex.Layer.Activation (Annex v0.1.0)

Summary

Types

func_type()
func_type() :: :float | :list

t()
t() :: %Annex.Layer.Activation{
  activator: (number() -> number()),
  derivative: (number() -> number()),
  func_type: func_type(),
  name: atom(),
  output: term()
}
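
For orientation, a sketch of what a built struct might look like (field
values are illustrative assumptions, not taken from the library):

    %Annex.Layer.Activation{
      activator: fn x -> 1.0 / (1.0 + :math.exp(-x)) end,   # e.g. a sigmoid
      derivative: &Annex.Layer.Activation.sigmoid_deriv/1,
      func_type: :float,
      name: :sigmoid,
      output: nil
    }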

Functions

backprop(layer, backprops)

Callback implementation for Annex.Layer.backprop/2.

build(name)
build(:relu | :sigmoid | :tanh | {:relu, number()}) :: t()
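
A minimal usage sketch, assuming only the spec above:

    # Build activation layers from the accepted atoms.
    relu = Annex.Layer.Activation.build(:relu)
    tanh = Annex.Layer.Activation.build(:tanh)
    # {:relu, number()} presumably configures a thresholded ReLU
    # (see relu_with_threshold/2 below).
    leaky = Annex.Layer.Activation.build({:relu, 0.01})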

encoder()
encoder() :: Annex.Data

Callback implementation for Annex.Layer.encoder/0.
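
Per the spec, the encoder is the Annex.Data module:

    Annex.Layer.Activation.encoder()
    #=> Annex.Data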

feedforward(layer, inputs)
feedforward(t(), [float()]) :: {[float()], t()}

Callback implementation for Annex.Layer.feedforward/2.
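
A sketch of a forward pass following the {outputs, layer} shape in the spec
(input numbers are illustrative):

    layer = Annex.Layer.Activation.build(:sigmoid)
    {outputs, layer} = Annex.Layer.Activation.feedforward(layer, [0.5, -1.0, 2.0])
    # outputs is a list of floats; the returned layer presumably carries the
    # activation state in its `output` field.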

generate_outputs(act, inputs)
generate_outputs(t(), [float()]) :: [any()]
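
Judging by the spec, this applies the activation across the inputs; a sketch
under that assumption:

    act = Annex.Layer.Activation.build(:tanh)
    outs = Annex.Layer.Activation.generate_outputs(act, [0.0, 1.0])
    # expected to be the element-wise activation of the inputs,
    # e.g. [:math.tanh(0.0), :math.tanh(1.0)] for :tanh (assumption).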

get_activator(activation)
get_activator(Annex.Layer.Activation.t()) :: (number() -> number())

get_derivative(activation)
get_derivative(Annex.Layer.Activation.t()) :: (number() -> number())
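
Both getters return one-arity functions per their specs; a sketch:

    act = Annex.Layer.Activation.build(:sigmoid)
    f = Annex.Layer.Activation.get_activator(act)
    df = Annex.Layer.Activation.get_derivative(act)
    f.(0.0)   # a sigmoid activator should return 0.5 here
    df.(0.0)  # its derivative at 0.0, assuming it takes the raw input
              # (0.25 for the textbook form; see sigmoid_deriv/1 below)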

init_layer(layer, opts)
init_layer(t(), Keyword.t()) :: {:ok, t()}

Callback implementation for Annex.Layer.init_layer/2.
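
A sketch of the init callback, which per the spec always returns {:ok, t()}:

    act = Annex.Layer.Activation.build(:relu)
    {:ok, layer} = Annex.Layer.Activation.init_layer(act, [])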

relu_deriv(x)
relu_deriv(float()) :: float()

relu_deriv(x, threshold)
relu_deriv(float(), float()) :: float()

relu_with_threshold(n, threshold)
relu_with_threshold(float(), float()) :: float()
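
For reference, the textbook definitions these names suggest (how the library
treats the threshold is an assumption; only the specs above are given):

    # ReLU derivative: 1.0 above the cutoff, 0.0 below it.
    Annex.Layer.Activation.relu_deriv(2.0)       # expected 1.0
    Annex.Layer.Activation.relu_deriv(-2.0)      # expected 0.0
    Annex.Layer.Activation.relu_deriv(0.5, 1.0)  # expected 0.0, below threshold 1.0
    # A plausible reading of relu_with_threshold/2 is max(n, threshold):
    Annex.Layer.Activation.relu_with_threshold(-3.0, 0.0)  # expected 0.0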

sigmoid_deriv(x)
sigmoid_deriv(float()) :: float()
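
The textbook sigmoid derivative is sigmoid(x) * (1 - sigmoid(x)); an
equivalent sketch, assuming x is the raw input (not the library's code):

    sig = fn x -> 1.0 / (1.0 + :math.exp(-x)) end
    deriv = fn x -> sig.(x) * (1.0 - sig.(x)) end
    deriv.(0.0)  # 0.25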

softmax(values)
softmax([float()]) :: [float()]
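
Softmax maps a list onto a probability distribution:
softmax(x)_i = exp(x_i) / sum_j exp(x_j). An equivalent sketch in plain
Elixir (not the library's implementation, which may be numerically stabler):

    softmax = fn values ->
      exps = Enum.map(values, &:math.exp/1)
      total = Enum.sum(exps)
      Enum.map(exps, &(&1 / total))
    end
    softmax.([1.0, 2.0, 3.0]) |> Enum.sum()  # ~1.0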

tanh_deriv(x)
tanh_deriv(float()) :: float()
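
The textbook tanh derivative is 1 - tanh(x)^2, assuming x is the raw input:

    tanh_deriv = fn x -> 1.0 - :math.pow(:math.tanh(x), 2) end
    tanh_deriv.(0.0)  # 1.0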