Spark.Dsl.Entity (spark v1.1.55)

Declares a DSL entity.

A DSL entity represents a DSL constructor whose resulting value is a struct. This lets the user create complex objects with (mostly) arbitrary validation rules.

The lifecycle of creating entities is complex, happening as Elixir is compiling the modules in question. Some of the patterns around validating/transforming entities have not yet solidified. If you aren't careful and don't follow the guidelines listed here, you can have subtle and strange bugs during compilation. Anything not isolated to simple value validations should be done in transformers. See Spark.Dsl.Transformer.

An entity has a target indicating which struct will ultimately be built. An entity also has a schema. This schema is used for documentation, and the options are validated against it before continuing on with the DSL.

To create positional arguments to the builder, use args. The values provided to args must also appear in the schema. They become positional arguments in the same order they are listed in the args key.

auto_set_fields will set the provided values into the produced struct (they do not need to be included in the schema).
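
For example, a minimal sketch building on the MyStruct entity shown below (the :source key and :dsl value are assumptions, not part of that example):

@my_entity %Spark.Dsl.Entity{
  name: :my_entity,
  target: MyStruct,
  schema: [my_field: [type: :atom, required: false]],
  # :source is stamped onto every struct this entity produces, even
  # though it is not declared in the schema
  auto_set_fields: [source: :dsl]
}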

transform is a function that takes a created struct and can alter it. This happens immediately after handling the DSL options, and can be useful for setting field values on a struct based on other values in that struct. If you need things that aren't contained in that struct, use a Spark.Dsl.Transformer. The function must return {:ok, new_entity} or {:error, error}, so it can also be used to validate the entity.
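
A minimal sketch of such a transform (the struct, module, and function names are assumptions, and %MyStruct{} is assumed to define both :name and :label):

@my_entity %Spark.Dsl.Entity{
  name: :my_entity,
  target: MyStruct,
  schema: [name: [type: :atom, required: true]],
  transform: {MyModule, :set_label, []}
}

# Runs immediately after the DSL options are handled; derives :label
# from :name and returns the updated struct.
def set_label(my_entity) do
  {:ok, %{my_entity | label: to_string(my_entity.name)}}
end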

entities allows you to specify a keyword list of nested entities. Nested entities are stored on the struct under the corresponding key, and behave the same way as top-level entities.
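
A sketch of nested entities (all names here are illustrative): each child declared inside the parent's do block is collected into the :children field of the parent struct.

@child %Spark.Dsl.Entity{
  name: :child,
  target: Child,
  schema: [name: [type: :atom, required: true]],
  args: [:name]
}

@parent %Spark.Dsl.Entity{
  name: :parent,
  target: Parent,
  schema: [name: [type: :atom, required: true]],
  args: [:name],
  # every `child` declared in the do block becomes a %Child{} stored in
  # the :children field of the produced %Parent{}
  entities: [children: [@child]]
}

Which could be invoked as:

parent :my_parent do
  child :first
  child :second
end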

singleton_entity_keys specifies a set of entity keys (specified above) that should only have a single value. This will be validated and unwrapped into nil | single_value on success.
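
Continuing the nested-entity sketch above (names are illustrative), a singleton key restricts a nested entity to at most one declaration:

@owner %Spark.Dsl.Entity{
  name: :owner,
  target: Owner,
  schema: [name: [type: :atom, required: true]],
  args: [:name]
}

@resource %Spark.Dsl.Entity{
  name: :resource,
  target: Resource,
  schema: [name: [type: :atom, required: true]],
  args: [:name],
  entities: [owner: [@owner]],
  # :owner on %Resource{} is unwrapped to nil or a single %Owner{} rather
  # than a list; declaring more than one `owner` fails validation
  singleton_entity_keys: [:owner]
}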

identifier expresses that a given entity is unique by that field, validated by the DSL.
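
For example (names are illustrative), with identifier: :name, two entities declared with the same :name are rejected during DSL validation:

@step %Spark.Dsl.Entity{
  name: :step,
  target: Step,
  schema: [name: [type: :atom, required: true]],
  args: [:name],
  # duplicate `step` declarations with the same :name are a compile-time error
  identifier: :name
}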

Example

@my_entity %Spark.Dsl.Entity{
  name: :my_entity,
  target: MyStruct,
  schema: [my_field: [type: :atom, required: false]]
}

Once compiled by Spark, entities can be invoked with a keyword list:

my_entity my_field: :value

Or with a do block:

my_entity do
  my_field :value
end

For a full example, see Spark.Dsl.Extension.

Summary

Types

args()
Specifies positional arguments for an Entity.

auto_set_fields()
Set the provided key value pairs in the produced struct. These fields do not need to be included in the Entity's schema.

describe()
User provided documentation.

docs()
Internal field. Not set by user.

entities()
A keyword list of nested entities.

t()

target()
Defines the struct that will be built from this entity definition.

transform()
Specifies a function that will run on the target struct after building.

Types

@type args() :: [atom() | {:optional, atom()} | {:optional, atom(), any()}]

Specifies positional arguments for an Entity.

An entity declared like this:

@entity %Spark.Dsl.Entity{
  name: :entity,
  target: Entity,
  schema: [
    positional: [type: :atom, required: true],
    other: [type: :atom, required: false],
  ],
  args: [:positional]
}

Can be instantiated like this:

entity :positional_argument do
  other :other_argument
end

@type auto_set_fields() :: Keyword.t(any())

Set the provided key value pairs in the produced struct. These fields do not need to be included in the Entity's schema.

@type deprecations() :: Keyword.t(String.t())
@type describe() :: String.t()

User provided documentation.

Documentation provided in a Entity's describe field will be included by Spark in any generated documentation that includes the Entity.

@type docs() :: String.t()

Internal field. Not set by user.

@type entities() :: Keyword.t(t())

A keyword list of nested entities.

@type examples() :: [String.t()]
@type hide() :: [atom()]
@type id() :: term()
@type imports() :: [module()]
@type links() :: Keyword.t([String.t()]) | nil
@type modules() :: [atom()]
@type name() :: atom() | nil
@type no_depend_modules() :: [atom()]
@type recursive_as() :: atom() | nil
@type singleton_entity_keys() :: [atom()]
@type snippet() :: String.t()
@type t() :: %Spark.Dsl.Entity{
  args: args(),
  auto_set_fields: auto_set_fields(),
  deprecations: deprecations(),
  describe: describe(),
  docs: docs(),
  entities: entities(),
  examples: examples(),
  hide: hide(),
  identifier: id(),
  imports: imports(),
  links: links(),
  modules: modules(),
  name: name(),
  no_depend_modules: no_depend_modules(),
  recursive_as: recursive_as(),
  schema: Spark.OptionsHelpers.schema(),
  singleton_entity_keys: singleton_entity_keys(),
  snippet: snippet(),
  target: target(),
  transform: transform()
}
@type target() :: module() | nil

Defines the struct that will be built from this entity definition.

The struct will need to have fields for all nested entities, all fields in the schema, and all keys in auto_set_fields.
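
As a sketch, for the MyStruct entity shown at the top of this page that means:

defmodule MyStruct do
  # one field per schema option; keys for nested entities and
  # auto_set_fields (if any) would also need entries here
  defstruct [:my_field]
end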

@type transform() :: {module(), function :: atom(), args :: [any()]} | nil

Specifies a function that will run on the target struct after building.

@my_entity %Spark.Dsl.Entity{
  name: :my_entity,
  target: MyEntity,
  schema: [
    my_field: [type: :list, required: true]
  ],
  transform: {MyModule, :max_three_items, []}
}

def max_three_items(my_entity) do
  if length(my_entity.my_field) > 3 do
    {:error, "Can't have more than three items"}
  else
    {:ok, my_entity}
  end
end