# `Edifice.Neuromorphic.ANN2SNN`
[🔗](https://github.com/blasphemetheus/edifice/blob/main/lib/edifice/neuromorphic/ann2snn.ex#L1)

ANN-to-SNN Conversion via Rate Coding.

Provides a bridge between conventional artificial neural networks (ANNs)
and spiking neural networks (SNNs). The ANN is trained with standard
backpropagation, then converted to an SNN by replacing ReLU activations
with integrate-and-fire neurons that encode activation magnitudes as
spike rates.

## Conversion Principle

A ReLU neuron with activation `a` can be approximated by a spiking neuron
that fires at an average rate of `a / threshold` over `num_timesteps`
simulation steps. The key insight is that the time-averaged spike rate of
an IF neuron converges to the ReLU activation as the number of timesteps
increases.

## Architecture

```
ANN Mode:                          SNN Mode:
Input [batch, input_size]          Input [batch, input_size]
      |                                  |
      v                                  v
Dense + ReLU                       Dense (same weights)
      |                                  |
      v                                  v
Dense + ReLU                       IF Neuron (rate-coded)
      |                                  |  (simulate num_timesteps)
      v                                  v
Output [batch, output_size]        Rate Decode -> Output
```

## Usage

    # Build the ANN (for training)
    ann = ANN2SNN.build(
      input_size: 256,
      hidden_sizes: [128, 64],
      num_timesteps: 10,
      threshold: 1.0
    )

    # Build the SNN (for inference on neuromorphic hardware)
    snn = ANN2SNN.build_snn(
      input_size: 256,
      hidden_sizes: [128, 64],
      num_timesteps: 10,
      threshold: 1.0
    )
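To see why the two builds are interchangeable, here is a self-contained sketch of the conversion on a single tiny layer (plain Elixir lists instead of Nx tensors; `ConversionSketch` and its helpers are illustrative names, not part of this module). The same dense weights are run through a ReLU path and a rate-coded IF path, and the outputs agree up to the quantization error:

```elixir
defmodule ConversionSketch do
  # One dense layer: y_j = sum_i(x_i * w_ji) + b_j, with `weights` as a
  # list of rows (one row per output unit).
  def dense(x, weights, biases) do
    Enum.zip_with(weights, biases, fn row, b ->
      dot = Enum.zip_with(row, x, &*/2) |> Enum.sum()
      dot + b
    end)
  end

  def relu(y), do: Enum.map(y, &max(&1, 0.0))

  # Rate-coded IF simulation, one neuron per dense output, with a
  # subtract-threshold soft reset.
  def if_rate(y, num_timesteps, threshold) do
    Enum.map(y, fn a ->
      {_mem, spikes} =
        Enum.reduce(1..num_timesteps, {0.0, 0}, fn _t, {mem, spikes} ->
          mem = mem + a
          if mem >= threshold, do: {mem - threshold, spikes + 1}, else: {mem, spikes}
        end)

      spikes / num_timesteps * threshold
    end)
  end
end

x = [0.5, -0.2, 0.8]
w = [[0.3, 0.1, -0.2], [0.4, -0.5, 0.6]]
b = [0.05, -0.1]

y = ConversionSketch.dense(x, w, b)
ann_out = ConversionSketch.relu(y)                # exact ReLU activations
snn_out = ConversionSketch.if_rate(y, 1000, 1.0)  # rate-coded approximation
# snn_out tracks ann_out to within threshold / num_timesteps per unit
```

This is the sense in which weights transfer directly: the SNN path changes only the nonlinearity, not the dense computation.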

## References

- Diehl et al., "Fast-Classifying, High-Accuracy Spiking Deep Networks
  Through Weight and Threshold Balancing" (IJCNN 2015)
- Rueckauer et al., "Conversion of Continuous-Valued Deep Networks to
  Efficient Event-Driven Networks for Image Classification" (2017)

# `build_opt`

```elixir
@type build_opt() ::
  {:hidden_sizes, [pos_integer()]}
  | {:input_size, pos_integer()}
  | {:output_size, pos_integer()}
```

Options for `build/1`.

# `build`

```elixir
@spec build([build_opt()]) :: Axon.t()
```

Build the ANN version (for training with backpropagation).

This is a standard feedforward network with ReLU activations. After
training, the same weights can be used with `build_snn/1` for
spiking inference.

## Options

- `:input_size` - Input feature dimension (required)
- `:hidden_sizes` - List of hidden layer sizes (default: [256, 128])
- `:output_size` - Output dimension (default: last hidden size)
- `:num_timesteps` - Stored for SNN conversion reference (default: 10)
- `:threshold` - Spiking threshold for conversion (default: 1.0)

## Returns

An Axon model (standard ANN): `[batch, input_size]` -> `[batch, output_size]`

# `build_snn`

```elixir
@spec build_snn(keyword()) :: Axon.t()
```

Build the SNN version (for spiking inference).

Uses the same dense layer structure but replaces ReLU with
integrate-and-fire neuron simulation. Weights from a trained ANN
can be directly transferred.

## Options

Same as `build/1`.

## Returns

An Axon model (SNN via rate coding): `[batch, input_size]` -> `[batch, output_size]`

# `if_neuron`

```elixir
@spec if_neuron(Nx.Tensor.t(), Nx.Tensor.t(), pos_integer(), float()) :: Nx.Tensor.t()
```

Integrate-and-Fire neuron simulation for rate-coded SNN inference.

Simulates an IF neuron over multiple timesteps. The input current is
presented at each timestep, the membrane potential integrates it, and a
spike is emitted whenever the threshold is exceeded. The output is the
average spike rate, which approximates the ReLU activation.

## Parameters

- `membrane` - Initial membrane potential (typically a zero tensor)
- `input_current` - Weighted input current
- `num_timesteps` - Number of simulation steps
- `threshold` - Firing threshold

## Returns

Average spike rate (approximates ReLU output).
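The dynamics can be traced by hand. Assuming a subtract-threshold soft reset (which carries the residual charge forward; the exact reset scheme is not stated in this doc), a constant input current of `0.4` against `threshold = 1.0` over `num_timesteps = 10` evolves as:

```
t            1     2     3     4     5     6     7     8     9    10
mem (pre)   0.4   0.8   1.2   0.6   1.0   0.4   0.8   1.2   0.6   1.0
spike        -     -     x     -     x     -     -     x     -     x
```

Four spikes over ten timesteps give a rate of `4 / 10 * 1.0 = 0.4`, matching `relu(0.4)` exactly. A hard reset to zero would discard the residual `0.2` at each spike and systematically undercount the activation.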

# `output_size`

```elixir
@spec output_size(keyword()) :: pos_integer()
```

Get the output size of an ANN2SNN model.

---

*Consult [api-reference.md](api-reference.md) for the complete listing*
