Tinkex.Types.AdamParams (Tinkex v0.3.4)


Adam optimizer parameters.

Mirrors the Python `tinker.types.AdamParams` type.

IMPORTANT: Defaults match the Python SDK exactly:

  • learning_rate: 0.0001
  • beta1: 0.9
  • beta2: 0.95 (NOT the common Adam default of 0.999!)
  • eps: 1.0e-12 (NOT the common 1.0e-8!)
  • weight_decay: 0.0 (decoupled weight decay)
  • grad_clip_norm: 0.0 (0 = no clipping)
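
Since the defaults intentionally diverge from the values most Adam implementations use, a quick sketch of how they surface in practice may help. This assumes only the `new/1` constructor and struct fields documented on this page; the override of `learning_rate` is illustrative:

```elixir
# Build AdamParams, overriding only the learning rate; every other
# field falls back to the Python-SDK-matching defaults listed above.
{:ok, params} = Tinkex.Types.AdamParams.new(learning_rate: 3.0e-4)

params.beta2  # => 0.95 (not 0.999, as in most Adam implementations)
params.eps    # => 1.0e-12 (not 1.0e-8)
```

Relying on implicit defaults from another Adam implementation is the main pitfall this module guards against, which is why the deviations are called out above.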

Summary

Functions

Create AdamParams with validation.

Types

t()

@type t() :: %Tinkex.Types.AdamParams{
  beta1: float(),
  beta2: float(),
  eps: float(),
  grad_clip_norm: float(),
  learning_rate: float(),
  weight_decay: float()
}

Functions

new(opts \\ [])

@spec new(keyword()) :: {:ok, t()} | {:error, String.t()}

Creates an `AdamParams` struct, validating the given options. Returns `{:ok, t()}` on success or `{:error, reason}` when validation fails.
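
The `@spec` above shows both return shapes, so callers should match on the tuple. A minimal sketch, assuming the documented `{:ok, t()} | {:error, String.t()}` contract; the specific validation rule triggered (a negative learning rate being rejected) is an assumption for illustration, not confirmed by this page:

```elixir
# Happy path: options merge over the defaults listed above.
{:ok, params} = Tinkex.Types.AdamParams.new(weight_decay: 0.01)

# Error path: per the @spec, invalid options yield {:error, reason}.
# Rejecting a negative learning rate here is a hypothetical example
# of a validation failure, not a documented rule.
case Tinkex.Types.AdamParams.new(learning_rate: -1.0) do
  {:ok, _params} -> :accepted
  {:error, reason} -> IO.puts("invalid params: " <> reason)
end
```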