Ash.Type behaviour (ash v3.4.47)
The Ash.Type behaviour is used to define a value type in Ash.
Built-in types
:map - Ash.Type.Map
:keyword - Ash.Type.Keyword
:term - Ash.Type.Term
:atom - Ash.Type.Atom
:string - Ash.Type.String
:integer - Ash.Type.Integer
:file - Ash.Type.File
:float - Ash.Type.Float
:duration_name - Ash.Type.DurationName
:function - Ash.Type.Function
:boolean - Ash.Type.Boolean
:struct - Ash.Type.Struct
:uuid - Ash.Type.UUID
:uuid_v7 - Ash.Type.UUIDv7
:binary - Ash.Type.Binary
:date - Ash.Type.Date
:time - Ash.Type.Time
:decimal - Ash.Type.Decimal
:ci_string - Ash.Type.CiString
:naive_datetime - Ash.Type.NaiveDatetime
:utc_datetime - Ash.Type.UtcDatetime
:utc_datetime_usec - Ash.Type.UtcDatetimeUsec
:datetime - Ash.Type.DateTime
:url_encoded_binary - Ash.Type.UrlEncodedBinary
:union - Ash.Type.Union
:module - Ash.Type.Module
:vector - Ash.Type.Vector
Lists/Arrays
To specify a list of values, use {:array, Type}. Arrays are special, and have special constraints:
:items (term/0) - Constraints for the elements of the list. See the contained type's docs for more.
:min_length (non_neg_integer/0) - A minimum length for the items.
:max_length (non_neg_integer/0) - A maximum length for the items.
:nil_items? (boolean/0) - Whether or not the list can contain nil items. The default value is false.
:remove_nil_items? (boolean/0) - Whether or not to remove the nil items from the list instead of adding errors. The default value is false.
:empty_values (list of term/0) - A set of values that, if encountered, will be considered an empty list. The default value is [""].
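As an illustration, the documented semantics of these list constraints can be sketched in plain Elixir. The ArrayConstraintsSketch module below is hypothetical and is not Ash's actual implementation:

```elixir
defmodule ArrayConstraintsSketch do
  # Illustrative only: mirrors the documented semantics of the list
  # constraints above, not Ash's internal implementation.
  def validate(value, constraints) do
    # Values in :empty_values are treated as an empty list.
    empty_values = Keyword.get(constraints, :empty_values, [""])
    list = if value in empty_values, do: [], else: value

    # With :remove_nil_items?, nils are dropped instead of erroring.
    list =
      if Keyword.get(constraints, :remove_nil_items?, false) do
        Enum.reject(list, &is_nil/1)
      else
        list
      end

    cond do
      nil in list and not Keyword.get(constraints, :nil_items?, false) ->
        {:error, "no nil values allowed"}

      length(list) < Keyword.get(constraints, :min_length, 0) ->
        {:error, "length must be at least #{constraints[:min_length]}"}

      length(list) > Keyword.get(constraints, :max_length, length(list)) ->
        {:error, "length must be at most #{constraints[:max_length]}"}

      true ->
        {:ok, list}
    end
  end
end
```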
Defining Custom Types
Generally you add use Ash.Type to your module (it is possible to add @behaviour Ash.Type and define everything yourself, but this is more work and error-prone).
Overriding the {:array, type} behaviour: by defining the *_array versions of cast_input, cast_stored, dump_to_native and apply_constraints, you can override how your type behaves as a collection. This is how the features of embedded resources are implemented. There is no need to implement them unless you wish to override the default behaviour. Your type is responsible for handling nil values in each callback as well.
Simple example of a float custom type
defmodule GenTracker.AshFloat do
  use Ash.Type

  @impl Ash.Type
  def storage_type(_), do: :float

  @impl Ash.Type
  def cast_input(nil, _), do: {:ok, nil}

  def cast_input(value, _) do
    Ecto.Type.cast(:float, value)
  end

  @impl Ash.Type
  def cast_stored(nil, _), do: {:ok, nil}

  def cast_stored(value, _) do
    Ecto.Type.load(:float, value)
  end

  @impl Ash.Type
  def dump_to_native(nil, _), do: {:ok, nil}

  def dump_to_native(value, _) do
    Ecto.Type.dump(:float, value)
  end
end
All the Ash built-in types are implemented with use Ash.Type, so they are good examples to look at when creating your own Ash.Type.
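To see the casting behaviour of the float example without pulling in Ecto, here is a hypothetical, self-contained variant of cast_input, with Float.parse/1 standing in for Ecto.Type.cast(:float, value):

```elixir
defmodule FloatCastSketch do
  # Hypothetical stand-in for GenTracker.AshFloat's cast_input/2;
  # Float.parse/1 approximates Ecto.Type.cast(:float, value).
  def cast_input(nil), do: {:ok, nil}
  def cast_input(value) when is_float(value), do: {:ok, value}
  def cast_input(value) when is_integer(value), do: {:ok, value * 1.0}

  def cast_input(value) when is_binary(value) do
    case Float.parse(value) do
      {float, ""} -> {:ok, float}
      _ -> :error
    end
  end

  def cast_input(_), do: :error
end
```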
Short names
You can define short atom names for your custom types by adding them to your Ash configuration:
config :ash, :custom_types, [ash_float: GenTracker.AshFloat]
Doing this will require a recompilation of the :ash dependency, which can be triggered by calling:
$ mix deps.compile ash --force
Composite Types
Composite types are composite in the data layer. Many data layers do not support this, but some (like AshPostgres) do. To define a composite type, the following things should be true:
- A casted value should be a map or struct, for example for a point: %{x: 1, y: 2}
- The data layer must support composite types, and the data layer representation will be a tuple, i.e. {1, 2}
- Define def composite?(_), do: true in your composite type
- Define the type & constraints of each item in the tuple, and its name in the map representation: def composite_types(_), do: [{:x, :integer, []}, {:y, :integer, []}]. You can also define a storage key for each item in the tuple, if the underlying type implementation has a different reference for an item, i.e. def composite_types(_), do: [{:x, :x_coord, :integer, []}, {:y, :y_coord, :integer, []}]
With the above implemented, your composite type can be used in expressions, for example:
Ash.Query.filter(expr(coordinates[:x] == 1))
And you can also construct composite types in expressions, for example:
calculate :coordinates, :composite_point, expr(
composite_type(%{x: some_value, y: some_other_value}, Point)
)
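As a sketch, the composite callbacks described above for the Point type could look like the following. The dump/load helpers are illustrative only; a real type would use Ash.Type and implement the full set of callbacks, with the tuple conversion handled by the data layer:

```elixir
defmodule Point do
  # Sketch of the composite callbacks described above.
  def composite?(_constraints), do: true

  # Each entry: {name in the map representation, type, constraints}.
  def composite_types(_constraints) do
    [{:x, :integer, []}, {:y, :integer, []}]
  end

  # Illustrative helpers showing the two representations:
  # map/struct on the Elixir side, tuple in the data layer.
  def dump_sketch(%{x: x, y: y}), do: {x, y}
  def load_sketch({x, y}), do: %{x: x, y: y}
end
```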
Constraints
Constraints are a way of validating an input type. This validation can be used in both attributes and arguments. The kinds of constraints you can apply depend on the type of data. You can find all types in Ash.Type. Each type has its own page on which the available constraints are listed. For example, in Ash.Type.String you can find 5 constraints:
:max_length
:min_length
:match
:trim?
:allow_empty?
You can also discover these constraints from iex:
$ iex -S mix
iex(1)> Ash.Type.String.constraints
[
max_length: [
type: :non_neg_integer,
doc: "Enforces a maximum length on the value"
],
min_length: [
type: :non_neg_integer,
doc: "Enforces a minimum length on the value"
],
match: [
type: {:custom, Ash.Type.String, :match, []},
doc: "Enforces that the string matches a passed in regex"
],
trim?: [type: :boolean, doc: "Trims the value.", default: true],
allow_empty?: [
type: :boolean,
doc: "If false, the value is set to `nil` if it's empty.",
default: false
]
]
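The semantics of these five constraints can be sketched in plain Elixir. The StringConstraintsSketch module below is hypothetical and mirrors the documented behaviour, not Ash.Type.String's actual implementation:

```elixir
defmodule StringConstraintsSketch do
  # Illustrative only: mirrors the documented semantics of the five
  # string constraints; the real implementation is Ash.Type.String.
  def check(value, constraints) do
    # trim? defaults to true; allow_empty? defaults to false.
    value = if Keyword.get(constraints, :trim?, true), do: String.trim(value), else: value
    match = Keyword.get(constraints, :match)

    cond do
      value == "" and not Keyword.get(constraints, :allow_empty?, false) ->
        # An empty value is set to nil rather than rejected.
        {:ok, nil}

      String.length(value) < Keyword.get(constraints, :min_length, 0) ->
        {:error, "length must be greater than or equal to #{constraints[:min_length]}"}

      String.length(value) > Keyword.get(constraints, :max_length, String.length(value)) ->
        {:error, "length must be less than or equal to #{constraints[:max_length]}"}

      match && not Regex.match?(match, value) ->
        {:error, "must match the pattern #{inspect(match)}"}

      true ->
        {:ok, value}
    end
  end
end
```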
Attribute example
To show how constraints can be used in an attribute, here is an example attribute describing a username:
defmodule MyProject.MyDomain.Account do
  # ...

  code_interface do
    define :create, action: :create
  end

  actions do
    defaults [:create, :read, :update, :destroy]
  end

  attributes do
    uuid_primary_key :id

    attribute :username, :string do
      constraints [
        max_length: 20,
        min_length: 3,
        match: ~r/^[a-z_-]*$/,
        trim?: true,
        allow_empty?: false
      ]
    end
  end

  # ...
end
If, when creating or updating this attribute, one of the constraints is not met, an error is returned telling you which constraint was broken. See below:
iex(1)> MyProject.MyDomain.Account.create!(%{username: "hi"})
** (Ash.Error.Invalid) Invalid Error
* Invalid value provided for username: length must be greater than or equal to 3.
"hi"
iex(2)> MyProject.MyDomain.Account.create!(%{username: "Hello there this is a long string"})
** (Ash.Error.Invalid) Invalid Error
* Invalid value provided for username: length must be less than or equal to 20.
"Hello there this is a long string"
iex(3)> MyProject.MyDomain.Account.create!(%{username: "hello there"})
** (Ash.Error.Invalid) Invalid Error
* Invalid value provided for username: must match the pattern ~r/^[a-z_-]*$/.
"hello there"
iex(4)> MyProject.MyDomain.Account.create!(%{username: ""})
** (Ash.Error.Invalid) Invalid Error
* attribute username is required
It will give you the resource as usual on successful requests:
iex(5)> MyProject.MyDomain.Account.create!(%{username: "hello"})
#MyProject.MyDomain.Account<
__meta__: #Ecto.Schema.Metadata<:loaded, "account">,
id: "7ba467dd-277c-4916-88ae-f62c93fee7a3",
username: "hello",
...
>
Summary
Types
A keyword list of constraints for a type
An error value that can be returned from various callbacks
The context that is provided to the load/4
callback.
The context that is provided to the merge_load/4
callback.
A valid Ash.Type
Callbacks
Applies type constraints within an expression.
Applies type constraints to a list of values within an expression. See apply_atomic_constraints/2
for more.
Called after casting, to apply additional constraints to the value.
Called after casting a list of values, to apply additional constraints to the value.
Returns a Spark.Options
spec for the additional constraints supported when used in a list.
Whether or not load/4
can be used. Defined automatically
Casts a value within an expression.
Casts a list of values within an expression. See cast_atomic/2
for more.
Whether or not data layers that build queries should attempt to type cast values of this type while doing so.
Attempt to cast unknown, potentially user-provided input, into a valid instance of the type.
Attempt to cast a list of unknown, potentially user-provided inputs, into a list of valid instances of type.
Attempt to load a stored value from the data layer into a valid instance of the type.
Attempt to load a list of stored values from the data layer into a list of valid instances of the type.
Return true if the type is a composite type, meaning it is made up of one or more values. How this works is up to the data layer.
Information about each member of the composite type, if it is a composite type
Returns a Spark.Options
spec for the constraints supported by the type.
Whether or not an apply_constraints_array/2
callback has been defined. This is defined automatically.
Describes a type given its constraints. Can be used to generate docs, for example.
Transform a valid instance of the type into a format that can be JSON encoded.
Transform a list of valid instances of the type into a format that can be JSON encoded.
Transform a valid instance of the type into a format that the data layer can store.
Transform a list of valid instances of the type into a format that the data layer can store.
The underlying Ecto.Type.
Whether or not the type is an embedded resource. This is defined by embedded resources, you should not define this.
Determine if two valid instances of the type are equal.
The implementation for any overloaded implementations.
An Enumerable that produces valid instances of the type.
Gets any "rewrites" necessary to apply a given load statement.
React to a changing value. This could be used, for example, to have a type like :strictly_increasing_integer
.
React to a changing list of values. This could be used, for example, to have a type like :unique_integer, which, when used in a list, requires that all items be unique.
Whether or not a custom handle_change_array/3
has been defined by the type. Defined automatically.
Add the source changeset to the constraints, in cases where it is needed for type casting logic
Useful for typed data layers (like ash_postgres) to instruct them not to attempt to cast input values.
Applies a load statement through a list of values.
Checks if the given path has been loaded on the type.
Whether or not the value is a valid instance of the type.
Merges a load statement with an existing load statement for the type.
A map of operators with overloaded implementations.
Prepare a change, given the old value and the new uncasted value.
Prepare a changing list of values, given the old value and the new uncasted value.
Whether or not a custom prepare_change_array/3
has been defined by the type. Defined automatically.
Apply any "rewrites" necessary to provide the results of a load statement to calculations that depended on a given load.
Whether or not ==
can be used to compare instances of the type.
The storage type, which should be known by a data layer supporting this type.
The storage type, which should be known by a data layer supporting this type.
Functions
Confirms if a casted value matches the provided constraints.
Gets the array constraints for a type
Returns true if the value is a builtin type or adopts the Ash.Type
behaviour
Returns true if the type is an ash builtin type
Returns true if the type supports nested loads
Returns true
if the type should be cast in underlying queries
Casts input (e.g. unknown) data to an instance of the type, or errors
Casts a value from the data store to an instance of the type, or errors
Returns true if the type is a composite type
Returns the wrapped composite types
Returns the constraint schema for a type
Calls the type's describe
function with the given constraints
Determine types for a given function or operator.
Casts a value from the Elixir type to a value that can be embedded in another data structure.
Casts a value from the Elixir type to a value that the data store can persist
Returns the ecto compatible type for an Ash.Type.
Returns true if the type is an embedded resource
Determines if two values of a given type are equal.
Returns the StreamData generator for a given type
Gets the load rewrites for a given type, load, calculation and path.
Gets the type module for a given short name or module
Gets the type module for a given short name or module,
ensures that it is a valid type
Process the old casted values alongside the new casted values.
Handles the change of a given array of values for an attribute change. Runs after casting.
Initializes the constraints according to the underlying type
Detects as a best effort if an arbitrary value matches the given type
Process the old casted values alongside the new uncasted values.
Prepares a given array of values for an attribute change. Runs before casting.
Applies rewrites to a given value.
Returns the list of available type short names
Determines if a type can be compared using the ==
operator.
Returns the underlying storage type (the underlying type of the ecto type of the ash type)
Types
@type constraints() :: Keyword.t()
A keyword list of constraints for a type
@type error() :: :error | {:error, String.t() | [field: atom(), fields: [atom()], message: String.t(), value: any()] | Ash.Error.t()}
An error value that can be returned from various callbacks
@type load_context() :: %{ domain: Ash.Domain.t(), actor: term() | nil, tenant: term(), tracer: [Ash.Tracer.t()] | Ash.Tracer.t() | nil, authorize?: boolean() | nil }
The context that is provided to the load/4
callback.
@type merge_load_context() :: %{ domain: Ash.Domain.t(), calc_name: term(), calc_load: term(), calc_path: [atom()], reuse_values?: boolean(), strict_loads?: boolean(), initial_data: term(), relationship_path: [atom()], authorize?: boolean() }
The context that is provided to the merge_load/4
callback.
A valid Ash.Type
Callbacks
@callback apply_atomic_constraints(new_value :: Ash.Expr.t(), constraints()) :: :ok | {:ok, Ash.Expr.t()} | {:error, Ash.Error.t()}
Applies type constraints within an expression.
@callback apply_atomic_constraints_array(new_value :: Ash.Expr.t(), constraints()) :: :ok | {:ok, Ash.Expr.t()} | {:error, Ash.Error.t()}
Applies type constraints to a list of values within an expression. See apply_atomic_constraints/2
for more.
@callback apply_constraints(term(), constraints()) :: {:ok, new_value :: term()} | :ok | error()
Called after casting, to apply additional constraints to the value.
@callback apply_constraints_array([term()], constraints()) :: {:ok, new_values :: [term()]} | :ok | error()
Called after casting a list of values, to apply additional constraints to the value.
If not defined, apply_constraints/2
is called for each item.
@callback array_constraints() :: constraints()
Returns a Spark.Options
spec for the additional constraints supported when used in a list.
@callback can_load?(constraints()) :: boolean()
Whether or not load/4
can be used. Defined automatically
@callback cast_atomic(new_value :: Ash.Expr.t(), constraints()) :: {:atomic, Ash.Expr.t()} | {:error, Ash.Error.t()} | {:not_atomic, String.t()}
Casts a value within an expression.
For instance, if you had a type like :non_neg_integer
, you might do:
def cast_atomic(value, _constraints) do
expr(
if ^value < 0 do
error(Ash.Error.Changes.InvalidChanges, %{message: "must be positive", value: ^value})
else
value
end
)
end
@callback cast_atomic_array(new_value :: Ash.Expr.t(), constraints()) :: {:atomic, Ash.Expr.t()} | {:error, Ash.Error.t()} | {:not_atomic, String.t()}
Casts a list of values within an expression. See cast_atomic/2
for more.
@callback cast_in_query?(constraints()) :: boolean()
Whether or not data layers that build queries should attempt to type cast values of this type while doing so.
@callback cast_input(term(), constraints()) :: {:ok, term()} | Ash.Error.t()
Attempt to cast unknown, potentially user-provided input, into a valid instance of the type.
@callback cast_input_array([term()], constraints()) :: {:ok, [term()]} | error()
Attempt to cast a list of unknown, potentially user-provided inputs, into a list of valid instances of type.
This callback allows defining types that are "collection-aware", e.g. an integer that must be unique whenever it appears in a list.
If not defined, cast_input/2
is called for each item.
@callback cast_stored(term(), constraints()) :: {:ok, term()} | error()
Attempt to load a stored value from the data layer into a valid instance of the type.
@callback cast_stored_array([term()], constraints()) :: {:ok, [term()]} | error()
Attempt to load a list of stored values from the data layer into a list of valid instances of the type.
If not defined, cast_stored/2
is called for each item.
@callback composite?(constraints()) :: boolean()
Return true if the type is a composite type, meaning it is made up of one or more values. How this works is up to the data layer.
For example, AshMoney
provides a type that is composite with a "currency" and an "amount".
@callback composite_types(constraints()) :: [ {name, type, constraints()} | {name, storage_key, type, constraints()} ] when name: atom(), type: t(), storage_key: atom()
Information about each member of the composite type, if it is a composite type
An example given the AshMoney
example listed above:
[{:currency, :string, []}, {:amount, :decimal, []}]
@callback constraints() :: constraints()
Returns a Spark.Options
spec for the constraints supported by the type.
@callback custom_apply_constraints_array?() :: boolean()
Whether or not an apply_constraints_array/2
callback has been defined. This is defined automatically.
@callback describe(constraints()) :: String.t() | nil
Describes a type given its constraints. Can be used to generate docs, for example.
@callback dump_to_embedded(term(), constraints()) :: {:ok, term()} | :error
Transform a valid instance of the type into a format that can be JSON encoded.
@callback dump_to_embedded_array([term()], constraints()) :: {:ok, term()} | error()
Transform a list of valid instances of the type into a format that can be JSON encoded.
If not defined, dump_to_embedded/2
is called for each item.
@callback dump_to_native(term(), constraints()) :: {:ok, term()} | error()
Transform a valid instance of the type into a format that the data layer can store.
@callback dump_to_native_array([term()], constraints()) :: {:ok, term()} | error()
Transform a list of valid instances of the type into a format that the data layer can store.
If not defined, dump_to_native/2
is called for each item.
@callback ecto_type() :: Ecto.Type.t()
The underlying Ecto.Type.
@callback embedded?() :: boolean()
Whether or not the type is an embedded resource. This is defined by embedded resources, you should not define this.
@callback equal?(term(), term()) :: boolean()
Determine if two valid instances of the type are equal.
Do not define this if == is sufficient for your type. See simple_equality?/0 for more.
The implementation for any overloaded implementations.
@callback generator(constraints()) :: Enumerable.t()
An Enumerable that produces valid instances of the type.
This can be used for property testing, or generating valid inputs for seeding.
Typically you would use StreamData
for this.
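For example, a generator for a float type could be sketched with a plain Stream. A real type would more typically return a StreamData generator, which is assumed unavailable in this standalone sketch:

```elixir
defmodule FloatGeneratorSketch do
  # Sketch: an infinite Enumerable of valid floats in [0.0, 100.0).
  # A real type would typically return a StreamData generator instead.
  def generator(_constraints) do
    Stream.repeatedly(fn -> :rand.uniform() * 100.0 end)
  end
end
```

Because the result is a lazy Stream, callers can take as many valid instances as they need, e.g. Enum.take(FloatGeneratorSketch.generator([]), 5).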
@callback get_rewrites( merged_load :: term(), calculation :: Ash.Query.Calculation.t(), path :: [atom()], constraints :: Keyword.t() ) :: [rewrite()]
Gets any "rewrites" necessary to apply a given load statement.
This is a low level tool used when types can contain instances of resources. You generally
should not need to know how this works. See Ash.Type.Union
and Ash.Type.Struct
for examples
if you are trying to write a similar type.
@callback handle_change(old_term :: term(), new_term :: term(), constraints()) :: {:ok, term()} | error()
React to a changing value. This could be used, for example, to have a type like :strictly_increasing_integer
.
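A sketch of what handle_change/3 might look like for such a hypothetical :strictly_increasing_integer type (illustrative only; the module and type name are assumptions, not part of Ash):

```elixir
defmodule StrictlyIncreasingSketch do
  # Sketch of handle_change/3 for a hypothetical
  # :strictly_increasing_integer type: the new value must
  # exceed the old one, unless there was no old value.
  def handle_change(nil, new, _constraints), do: {:ok, new}

  def handle_change(old, new, _constraints) when new > old, do: {:ok, new}

  def handle_change(old, new, _constraints) do
    # Error shape follows the keyword-list form of error/0 above.
    {:error, message: "must be greater than #{old}", value: new}
  end
end
```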
@callback handle_change_array(old_term :: [term()], new_term :: [term()], constraints()) :: {:ok, term()} | error()
React to a changing list of values. This could be used, for example, to have a type like :unique_integer, which, when used in a list, requires that all items be unique.
If not defined, handle_change/3
is called for each item with a nil
old value.
@callback handle_change_array?() :: boolean()
Whether or not a custom handle_change_array/3
has been defined by the type. Defined automatically.
@callback include_source(constraints(), Ash.Changeset.t()) :: constraints()
Add the source changeset to the constraints, in cases where it is needed for type casting logic
@callback init(constraints()) :: {:ok, constraints()} | {:error, Ash.Error.t()}
Useful for typed data layers (like ash_postgres) to instruct them not to attempt to cast input values.
You generally won't need this, but it can be an escape hatch for certain cases.
@callback load( values :: [term()], load :: Keyword.t(), constraints :: Keyword.t(), context :: load_context() ) :: {:ok, [term()]} | {:error, Ash.Error.t()}
Applies a load statement through a list of values.
This allows types to support load statements, like Ash.Type.Union
, embedded resources,
or the Ash.Type.Struct
when it is an instance_of
a resource.
@callback loaded?( value :: term(), path_to_load :: [atom()], constraints :: Keyword.t(), opts :: Keyword.t() ) :: boolean()
Checks if the given path has been loaded on the type.
@callback matches_type?(term(), constraints()) :: boolean()
Whether or not the value is a valid instance of the type.
@callback merge_load( left :: term(), right :: term(), constraints :: Keyword.t(), context :: merge_load_context() | nil ) :: {:ok, term()} | {:error, error()} | :error
Merges a load statement with an existing load statement for the type.
A map of operators with overloaded implementations.
These will only be honored if the type is placed in config :ash, :known_types, [...Type]
A corresponding evaluate_operator/1
clause should match.
@callback prepare_change(old_term :: term(), new_uncasted_term :: term(), constraints()) :: {:ok, term()} | error()
Prepare a change, given the old value and the new uncasted value.
@callback prepare_change_array( old_term :: [term()], new_uncasted_term :: [term()], constraints() ) :: {:ok, term()} | error()
Prepare a changing list of values, given the old value and the new uncasted value.
If not defined, prepare_change/3
is called for each item with a nil
old value.
@callback prepare_change_array?() :: boolean()
Whether or not a custom prepare_change_array/3
has been defined by the type. Defined automatically.
Apply any "rewrites" necessary to provide the results of a load statement to calculations that depended on a given load.
This is a low level tool used when types can contain instances of resources. You generally
should not need to know how this works. See Ash.Type.Union
and Ash.Type.Struct
for examples
if you are trying to write a similar type.
@callback simple_equality?() :: boolean()
Whether or not ==
can be used to compare instances of the type.
This is defined automatically to return false
if equal?/2
is defined.
Types that cannot be compared using ==
incur significant runtime costs when used in certain ways.
For example, if a resource's primary key cannot be compared with ==
, we cannot do things like key
a list of records by their primary key. Implementing equal?/2
will cause various code paths to be considerably
slower, so only do it when necessary.
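For example, a case-insensitive string type cannot rely on ==, so it might define equal?/2 along these lines (a sketch, not Ash.Type.CiString's actual implementation):

```elixir
defmodule CiEqualSketch do
  # Sketch of equal?/2 for a case-insensitive string type,
  # where == alone is not sufficient (see simple_equality?/0).
  def equal?(a, b) when is_binary(a) and is_binary(b) do
    String.downcase(a) == String.downcase(b)
  end

  # Fall back to == for anything else.
  def equal?(a, b), do: a == b
end
```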
@callback storage_type() :: Ecto.Type.t()
The storage type, which should be known by a data layer supporting this type.
Use storage_type/1
, as this will be deprecated in the future.
@callback storage_type(constraints()) :: Ecto.Type.t()
The storage type, which should be known by a data layer supporting this type.
Functions
@spec apply_atomic_constraints(t(), term(), constraints()) :: {:ok, Ash.Expr.t()} | {:error, Ash.Error.t()}
@spec apply_constraints(t(), term(), constraints()) :: {:ok, term()} | {:error, String.t()}
Confirms if a casted value matches the provided constraints.
Gets the array constraints for a type
Returns true if the value is a builtin type or adopts the Ash.Type
behaviour
Returns true if the type is an ash builtin type
Returns true if the type supports nested loads
@spec cast_atomic(t(), term(), constraints()) :: {:atomic, Ash.Expr.t()} | {:ok, term()} | {:error, Ash.Error.t()} | {:not_atomic, String.t()}
Returns true
if the type should be cast in underlying queries
@spec cast_input(t(), term(), constraints() | nil) :: {:ok, term()} | {:error, Keyword.t()} | :error
Casts input (e.g. unknown) data to an instance of the type, or errors
Maps to Ecto.Type.cast/2
@spec cast_stored(t(), term(), constraints() | nil) :: {:ok, term()} | {:error, keyword()} | :error
Casts a value from the data store to an instance of the type, or errors
Maps to Ecto.Type.load/2
@spec composite?( t(), constraints() ) :: Enumerable.t()
Returns true if the type is a composite type
@spec composite_types( t(), constraints() ) :: Enumerable.t()
Returns the wrapped composite types
@spec constraints(t()) :: constraints()
Returns the constraint schema for a type
Calls the type's describe
function with the given constraints
Determine types for a given function or operator.
@spec dump_to_embedded(t(), term(), constraints() | nil) :: {:ok, term()} | {:error, keyword()} | :error
Casts a value from the Elixir type to a value that can be embedded in another data structure.
Embedded resources expect to be stored in JSON, so this allows things like UUIDs to be stored as strings in embedded resources instead of binary.
@spec dump_to_native(t(), term(), constraints() | nil) :: {:ok, term()} | {:error, keyword()} | :error
Casts a value from the Elixir type to a value that the data store can persist
Maps to Ecto.Type.dump/2
@spec ecto_type(t()) :: Ecto.Type.t()
Returns the ecto compatible type for an Ash.Type.
If you use Ash.Type
, this is created for you. For builtin types
this may return a corresponding ecto builtin type (atom)
Returns true if the type is an embedded resource
Determines if two values of a given type are equal.
Maps to Ecto.Type.equal?/3
@spec generator( module() | {:array, module()}, constraints() ) :: Enumerable.t()
Returns the StreamData generator for a given type
Gets the load rewrites for a given type, load, calculation and path.
This is used for defining types that support a nested load statement. See the embedded type and union type implementations for examples of how to use this.
@spec get_type(atom() | module() | {:array, atom() | module()}) :: atom() | module() | {:array, atom() | module()}
Gets the type module for a given short name or module
@spec get_type!(atom() | module() | {:array, atom() | module()}) :: atom() | module() | {:array, atom() | module()}
Gets the type module for a given short name or module,
ensures that it is a valid type
Raises
RuntimeError
: If the provided type module is not found or invalid.
Process the old casted values alongside the new casted values.
This is leveraged by embedded types to know if something is being updated or destroyed. This is not called on creates.
Handles the change of a given array of values for an attribute change. Runs after casting.
@spec include_source( t(), Ash.Changeset.t() | Ash.Query.t() | Ash.ActionInput.t(), constraints() ) :: constraints()
@spec init(t(), constraints()) :: {:ok, constraints()} | {:error, Ash.Error.t()}
Initializes the constraints according to the underlying type
@spec load( type :: t(), values :: [term()], load :: Keyword.t(), constraints :: Keyword.t(), context :: load_context() ) :: {:ok, [term()]} | {:error, Ash.Error.t()}
Detects as a best effort if an arbitrary value matches the given type
@spec merge_load( type :: t(), left :: term(), right :: term(), constraints :: Keyword.t(), context :: merge_load_context() | nil ) :: {:ok, [term()]} | :error | {:error, Ash.Error.t()}
Process the old casted values alongside the new uncasted values.
This is leveraged by embedded types to know if something is being updated or destroyed. This is not called on creates.
Prepares a given array of values for an attribute change. Runs before casting.
Applies rewrites to a given value.
This is used for defining types that support a nested load statement. See the embedded type and union type implementations for examples of how to use this.
Returns the list of available type short names
Determines if a type can be compared using the ==
operator.
Returns the underlying storage type (the underlying type of the ecto type of the ash type)