Geminix.V1beta.Hyperparameters (geminix v0.2.0)

Hyperparameters controlling the tuning process. Read more at https://ai.google.dev/docs/model_tuning_guidance

Fields:

  • :batch_size (integer/0) - Immutable. The batch size hyperparameter for tuning. If not set, a default of 4 or 16 will be used based on the number of training examples.
  • :epoch_count (integer/0) - Immutable. The number of training epochs. An epoch is one pass through the training data. If not set, a default of 5 will be used.
  • :learning_rate (number/0) - Optional. Immutable. The learning rate hyperparameter for tuning. If not set, a default of 0.001 or 0.0002 will be calculated based on the number of training examples.
  • :learning_rate_multiplier (number/0) - Optional. Immutable. The learning rate multiplier is used to calculate the final learning_rate from the default (recommended) value: actual learning rate := learning_rate_multiplier * default learning rate. The default learning rate depends on the base model and the dataset size. If not set, a default of 1.0 will be used.
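The relationship between the multiplier and the effective learning rate can be sketched in plain Elixir. Note the default rate used below (1.0e-3) is a hypothetical stand-in: the real default is chosen server-side from the base model and dataset size.

```elixir
# Illustrative only: how the service derives the effective learning rate
# when :learning_rate_multiplier is set. The default here is a hypothetical
# value; the actual default depends on base model and dataset size.
default_learning_rate = 1.0e-3
learning_rate_multiplier = 0.5

actual_learning_rate = learning_rate_multiplier * default_learning_rate
# => 5.0e-4
```

Because the field is immutable, the multiplier must be chosen before the tuning job is created; it cannot be adjusted afterwards.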

Summary

Types

t()

@type t() :: %Geminix.V1beta.Hyperparameters{
  __meta__: term(),
  batch_size: integer(),
  epoch_count: integer(),
  learning_rate: number(),
  learning_rate_multiplier: number()
}

Functions

from_map(schema \\ %__MODULE__{}, map)

@spec from_map(t(), map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Creates a Geminix.V1beta.Hyperparameters.t/0 struct from a map returned by the Gemini API.

Depending on the concrete API call, this function should sometimes be applied not to the full response body but to the relevant sub-map within it.
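As a sketch of that pattern: the map and the "tuningTask"/"hyperparameters" nesting below are assumptions modelled on the Gemini REST response shape for tuned models, not a guaranteed path; check the response of your specific call.

```elixir
# Hypothetical decoded response body from a tuned-model API call.
# The nesting under "tuningTask" is an assumption for illustration.
body = %{
  "tuningTask" => %{
    "hyperparameters" => %{
      "batchSize" => 4,
      "epochCount" => 5,
      "learningRate" => 0.001
    }
  }
}

# Apply from_map/1 to the relevant sub-map, not to the whole body.
{:ok, hyperparameters} =
  body
  |> get_in(["tuningTask", "hyperparameters"])
  |> Geminix.V1beta.Hyperparameters.from_map()
```

On a shape mismatch, from_map/1 returns {:error, Ecto.Changeset.t()} instead, so pattern-matching on {:ok, _} surfaces decoding problems early.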