Spark.Dsl.Extension behaviour (spark v2.7.0)

An extension to the Spark DSL.

This allows configuring custom DSL components whose configurations can then be read back. This guide is still a work in progress, but it should serve as a decent example of what is possible. If anything is unclear or you run into problems, open an issue on GitHub.

The example at the bottom shows how you might build a (somewhat contrived) DSL extension that would be used like so:

defmodule MyApp.Vehicle do
  use Spark.Dsl
end

defmodule MyApp.MyResource do
  use MyApp.Vehicle,
    extensions: [MyApp.CarExtension]

  cars do
    car :ford, :focus, trim: :sedan
    car :toyota, :corolla
  end
end

The extension:

defmodule MyApp.CarExtension do
  @car_schema [
    make: [
      type: :atom,
      required: true,
      doc: "The make of the car"
    ],
    model: [
      type: :atom,
      required: true,
      doc: "The model of the car"
    ],
    type: [
      type: :atom,
      required: true,
      doc: "The type of the car",
      default: :sedan
    ]
  ]

  @car %Spark.Dsl.Entity{
    name: :car,
    describe: "Adds a car",
    examples: [
      "car :ford, :focus"
    ],
    target: MyApp.Car,
    args: [:make, :model],
    schema: @car_schema
  }

  @cars %Spark.Dsl.Section{
    name: :cars, # The DSL constructor will be `cars`
    describe: """
    Configure what cars are available.

    More, deeper explanation. Always have a short one liner explanation,
    an empty line, and then a longer explanation.
    """,
    entities: [
      @car # See `Spark.Dsl.Entity` docs
    ],
    schema: [
      default_manufacturer: [
        type: :atom,
        doc: "The default manufacturer"
      ]
    ]
  }

  use Spark.Dsl.Extension, sections: [@cars]
end
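
The target module referenced above must define a struct whose keys match the entity's schema, since Spark builds an instance of that struct for each entity. A minimal sketch (the struct default for type is cosmetic; the schema default applies regardless):

```elixir
defmodule MyApp.Car do
  # Keys mirror the entity schema: make, model, and type.
  defstruct [:make, :model, type: :sedan]
end
```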

Often, you will need to perform complex validation, or validate based on the configuration of other resources. Due to the nature of building compile time DSLs, there are many restrictions around that process. To support these complex use cases, extensions can include transformers, persisters, and verifiers. These run in the following order:

  1. Transformers — run during compilation, in dependency order (controlled by before?/1 and after?/1). Can read and modify any part of the DSL state. See Spark.Dsl.Transformer.

  2. Persisters — run during compilation, always after all transformers. They implement the same Spark.Dsl.Transformer behaviour but are declared under persisters:. By convention they should only write to the persisted data map via Spark.Dsl.Transformer.persist/3. They support before?/after? ordering relative to other persisters, but any ordering declarations targeting transformers are silently ignored.

  3. Verifiers — run after the module is compiled. Read-only. Do not create compile-time dependencies between modules, so they are safe to use when referencing other Spark-based modules. See Spark.Dsl.Verifier.

All three are provided as options to use:

use Spark.Dsl.Extension,
  sections: [@cars],
  transformers: [MyApp.Transformers.ValidateNoOverlappingMakesAndModels],
  persisters: [MyApp.Persisters.CacheCarCount],
  verifiers: [MyApp.Verifiers.CheckManufacturerExists]
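
As a sketch (the module names and validation logic here are illustrative, not part of Spark), a transformer and a verifier might look like:

```elixir
defmodule MyApp.Transformers.ValidateNoOverlappingMakesAndModels do
  # Runs at compile time; may read and modify DSL state.
  use Spark.Dsl.Transformer

  def transform(dsl_state) do
    cars = Spark.Dsl.Transformer.get_entities(dsl_state, [:cars])

    if cars == Enum.uniq_by(cars, &{&1.make, &1.model}) do
      {:ok, dsl_state}
    else
      {:error, "duplicate make/model combination"}
    end
  end
end

defmodule MyApp.Verifiers.CheckManufacturerExists do
  # Runs after the module is compiled; read-only.
  use Spark.Dsl.Verifier

  def verify(_dsl_state) do
    # Return :ok, or {:error, error} to abort compilation.
    :ok
  end
end
```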

By default, the generated modules have names like __MODULE__.SectionName.EntityName, which could conflict with modules you define yourself. To avoid this, specify the module_prefix option, e.g. __MODULE__.Dsl. Generated modules are then placed under that prefix (producing names like __MODULE__.Dsl.SectionName.EntityName), leaving __MODULE__.SectionName.EntityName free for your own entity struct.
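
For example, the extension above could be defined with a prefix like so (a sketch; the prefix name is illustrative):

```elixir
use Spark.Dsl.Extension,
  sections: [@cars],
  module_prefix: MyApp.CarExtension.Dsl
```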

To expose the configuration of your DSL, define functions that use the helpers like get_entities/2 and get_opt/3. For example:

defmodule MyApp.Cars do
  def cars(resource) do
    Spark.Dsl.Extension.get_entities(resource, [:cars])
  end
end

MyApp.Cars.cars(MyResource)
# [%MyApp.Car{...}, %MyApp.Car{...}]
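
Section options can be exposed the same way. A sketch using get_opt (the module name and fallback value are illustrative):

```elixir
defmodule MyApp.Cars.Info do
  # Reads the default_manufacturer option from the cars section,
  # falling back to :ford if it was not set.
  def default_manufacturer(resource) do
    Spark.Dsl.Extension.get_opt(resource, [:cars], :default_manufacturer, :ford)
  end
end
```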

See the documentation for Spark.Dsl.Section and Spark.Dsl.Entity for more information.

Summary

Functions

  • fetch_persisted/2 - Fetch a value that was persisted while transforming or compiling the resource, e.g. :primary_key

  • get_entities/2 - Get the entities configured for a given section

  • get_opt/5 - Get an option value for a section at a given path.

  • get_opt_anno/3 - Get the annotation for a specific option in a section.

  • get_persisted/3 - Get a value that was persisted while transforming or compiling the resource, e.g. :primary_key

  • get_section_anno/2 - Get the annotation for a section at the given path.

  • validate_and_transform_entity/3 - Validates and transforms an entity structure, ensuring nested entities are properly formatted.

Types

t()

@type t() :: module()

Callbacks

add_extensions()

@callback add_extensions() :: [module()]

explain(map)

(optional)
@callback explain(map()) :: String.t() | nil

module_imports()

@callback module_imports() :: [module()]

persisters()

@callback persisters() :: [module()]

sections()

@callback sections() :: [Spark.Dsl.section()]

transformers()

@callback transformers() :: [module()]

verifiers()

@callback verifiers() :: [module()]

Functions

default_section_config()

@spec default_section_config() :: %{
  section_anno: :erl_anno.anno() | nil,
  entities: list(),
  opts: Keyword.t(),
  opts_anno: Keyword.t(:erl_anno.anno() | nil)
}

doc(sections, depth \\ 1)

See Spark.CheatSheet.doc/2.

doc_index(sections, depth \\ 1)

See Spark.CheatSheet.doc_index/2.

expand_alias(ast, env)

expand_alias_no_require(ast, env)

expand_literals(ast, acc, fun)

fetch_opt(resource, path, value, configurable? \\ false)

fetch_persisted(map, key)

Fetch a value that was persisted while transforming or compiling the resource, e.g. :primary_key

get_entities(map, path)

Get the entities configured for a given section

get_entity_dsl_patches(extensions, section_path)

get_opt(resource, path, value, default \\ nil, configurable? \\ false)

Get an option value for a section at a given path.

Checks to see if it has been overridden via configuration.

get_opt_anno(dsl_state, path, opt_name)

@spec get_opt_anno(map() | module(), atom() | [atom()], atom()) ::
  :erl_anno.anno() | nil

Get the annotation for a specific option in a section.

get_opt_config(resource, path, value)

get_persisted(resource, key, default \\ nil)

Get a value that was persisted while transforming or compiling the resource, e.g. :primary_key

get_recursive_entities_for_path(sections, list)

get_section_anno(dsl_state, path)

@spec get_section_anno(map() | module(), atom() | [atom()]) :: :erl_anno.anno() | nil

Get the annotation for a section at the given path.

macro_env_anno(env, do_block)

@spec macro_env_anno(env :: Macro.Env.t(), do_block :: Macro.t()) :: :erl_anno.anno()

module_concat(values)

monotonic_number(key)

run_transformers(mod, transformers, spark_dsl_config, env)

set_docs(items)

shuffle_opts_to_end(keyword, entity_args, schema, entities, opts)

spark_function_info(arg1)

validate_and_transform_dsl_patches(dsl_patches, module \\ nil)

validate_and_transform_entity(entity, path \\ [], module \\ nil)

Validates and transforms an entity structure, ensuring nested entities are properly formatted.

This function recursively processes a DSL entity and its nested entities, converting single entity values to lists where needed and validating the structure.

Parameters

  • entity - The entity to validate and transform
  • path - The current path in the DSL structure (for error reporting)
  • module - The module context (for error reporting)

Returns

Returns the transformed entity with normalized nested entity structures.