Sinusoidal Positional Encoding.
The original positional encoding from "Attention Is All You Need", using sine and cosine functions at different frequencies to encode absolute position information. Deterministic (no learned parameters).
Formula
PE(pos, 2i) = sin(pos / 10000^(2i/d))
PE(pos, 2i+1) = cos(pos / 10000^(2i/d))

Usage
# As an Axon layer that adds PE to input
encoded = SinusoidalPE.layer(input, dim: 256)
# Precompute PE table
pe_table = SinusoidalPE.build_table(max_len: 512, dim: 256)

References
- "Attention Is All You Need" (Vaswani et al., 2017)
- https://arxiv.org/abs/1706.03762
Summary

Functions

build_table(opts) - Build a sinusoidal positional encoding table.
layer(input, opts) - Build an Axon layer that adds sinusoidal positional encoding to the input.
Functions
build_table(opts)

@spec build_table(keyword()) :: Nx.Tensor.t()
Build a sinusoidal positional encoding table.
Returns a tensor of shape [max_len, dim] with precomputed PE values.
Options
:max_len - Maximum sequence length (default: 512)
:dim - Embedding dimension (required)
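The table semantics can be illustrated independently of Nx. A minimal pure-Python sketch (the `build_table` helper here is an illustrative stand-in for the Elixir function, not its implementation) computes the same [max_len, dim] layout from the formula above: even columns hold sine, odd columns hold cosine, and the i-th sin/cos pair uses frequency 1 / 10000^(2i/dim).

```python
import math

def build_table(max_len=512, dim=256):
    """Precompute a [max_len, dim] sinusoidal PE table as nested lists.

    Mirrors the formula above: even columns get sin, odd columns cos,
    with angle pos / 10000^(2i/dim) for the i-th sin/cos pair.
    """
    table = []
    for pos in range(max_len):
        row = []
        for j in range(dim):
            i = j // 2  # index of the sin/cos pair this column belongs to
            angle = pos / (10000 ** (2 * i / dim))
            row.append(math.sin(angle) if j % 2 == 0 else math.cos(angle))
        table.append(row)
    return table

pe = build_table(max_len=4, dim=6)
# Position 0 is always [0, 1, 0, 1, ...]: sin(0) in even columns, cos(0) in odd.
```

Because the table depends only on position and dimension, it can be built once and sliced to any runtime sequence length.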
layer(input, opts)

Build an Axon layer that adds sinusoidal positional encoding to the input.
Options
:dim - Feature dimension (required)
:name - Layer name prefix (default: "sinusoidal_pe")
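The layer's effect is an elementwise add of the PE table onto the input embeddings. A pure-Python sketch of that behavior for a single [seq_len, dim] sequence (function names here are hypothetical illustrations, not the module's API; the real Axon layer would slice a precomputed table to the runtime sequence length and broadcast over the batch):

```python
import math

def pe_row(pos, dim):
    # One row of the sinusoidal table: sin in even slots, cos in odd slots.
    return [math.sin(pos / 10000 ** (2 * (j // 2) / dim)) if j % 2 == 0
            else math.cos(pos / 10000 ** (2 * (j // 2) / dim))
            for j in range(dim)]

def add_positional_encoding(x):
    """Add PE(pos, :) to each row of a [seq_len, dim] input."""
    dim = len(x[0])
    return [[v + p for v, p in zip(row, pe_row(pos, dim))]
            for pos, row in enumerate(x)]

x = [[0.0] * 4 for _ in range(3)]  # zero embeddings: seq_len=3, dim=4
encoded = add_positional_encoding(x)
# With zero inputs, each output row equals the PE row for that position.
```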