Spark.Dsl.Extension behaviour (spark v2.1.20)

An extension to the Spark DSL.

This allows you to configure custom DSL components whose configuration can then be read back. This guide is still a work in progress, but it should serve as a decent example of what is possible. Open an issue on GitHub if anything is unclear.

The following example shows how you might build a (not very contextually relevant) DSL extension that would be used like so:

defmodule MyApp.Vehicle do
  use Spark.Dsl
end

defmodule MyApp.MyResource do
  use MyApp.Vehicle,
    extensions: [MyApp.CarExtension]

  cars do
    car :ford, :focus, type: :sedan
    car :toyota, :corolla
  end
end

The extension:

defmodule MyApp.CarExtension do
  @car_schema [
    make: [
      type: :atom,
      required: true,
      doc: "The make of the car"
    ],
    model: [
      type: :atom,
      required: true,
      doc: "The model of the car"
    ],
    type: [
      type: :atom,
      required: true,
      doc: "The type of the car",
      default: :sedan
    ]
  ]

  @car %Spark.Dsl.Entity{
    name: :car,
    describe: "Adds a car",
    examples: [
      "car :ford, :focus"
    ],
    target: MyApp.Car,
    args: [:make, :model],
    schema: @car_schema
  }

  @cars %Spark.Dsl.Section{
    name: :cars, # The DSL constructor will be `cars`
    describe: """
    Configure what cars are available.

    More, deeper explanation. Always have a short one liner explanation,
    an empty line, and then a longer explanation.
    """,
    entities: [
      @car # See `Spark.Dsl.Entity` docs
    ],
    schema: [
      default_manufacturer: [
        type: :atom,
        doc: "The default manufacturer"
      ]
    ]
  }

  use Spark.Dsl.Extension, sections: [@cars]
end
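
The target: MyApp.Car above must be a module defining a struct whose keys match the entity's args and schema. It is not shown in the example, so here is a minimal sketch of what such a target module might look like:

defmodule MyApp.Car do
  # The entity target is a plain struct; Spark builds one of these per
  # `car` declaration, filling the fields from the entity's args and options.
  defstruct [:make, :model, :type]
end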

Often, we will need to do complex validation, or to validate based on the configuration of other resources. Due to the nature of building compile-time DSLs, there are many restrictions around that process. To support these complex use cases, extensions can include transformers, which can validate and transform the DSL state after all basic sections and entities have been created. See Spark.Dsl.Transformer for more information. Transformers are provided as an option to use, like so:

use Spark.Dsl.Extension, sections: [@cars], transformers: [
  MyApp.Transformers.ValidateNoOverlappingMakesAndModels
]
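
A transformer is a module that implements Spark.Dsl.Transformer and receives the DSL state at compile time. As a rough sketch, the transformer referenced above might look something like this (the duplicate check and error message are illustrative, not part of Spark):

defmodule MyApp.Transformers.ValidateNoOverlappingMakesAndModels do
  use Spark.Dsl.Transformer

  # Called with the full DSL state after sections/entities are built.
  # Return {:ok, dsl_state} to accept (and optionally modify) it, or
  # {:error, error} to halt compilation with an error.
  def transform(dsl_state) do
    duplicates =
      dsl_state
      |> Spark.Dsl.Transformer.get_entities([:cars])
      |> Enum.frequencies_by(&{&1.make, &1.model})
      |> Enum.filter(fn {_make_and_model, count} -> count > 1 end)

    if duplicates == [] do
      {:ok, dsl_state}
    else
      {:error,
       Spark.Error.DslError.exception(
         path: [:cars],
         message: "the same make and model was declared more than once"
       )}
    end
  end
end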

By default, the generated modules will have names like __MODULE__.SectionName.EntityName, which could conflict with modules you are defining yourself. To avoid this, you can specify the module_prefix option, which prefixes the generated modules with something like __MODULE__.Dsl. The generated module path then becomes __MODULE__.Dsl.SectionName.EntityName, and you can define the entity struct as __MODULE__.SectionName.EntityName without conflicts.
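
For example, a sketch (assuming module_prefix accepts the prefix module directly; check the option docs for the exact shape):

use Spark.Dsl.Extension,
  sections: [@cars],
  module_prefix: __MODULE__.Dsl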

To expose the configuration of your DSL, define functions that use helpers such as get_entities/2 and get_opt/3. For example:

defmodule MyApp.Cars do
  def cars(resource) do
    Spark.Dsl.Extension.get_entities(resource, [:cars])
  end
end

MyApp.Cars.cars(MyApp.MyResource)
# [%MyApp.Car{...}, %MyApp.Car{...}]
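
Section options can be read back the same way. A sketch using get_opt to expose the default_manufacturer option from the cars section (the Info module name and fallback value are illustrative):

defmodule MyApp.Cars.Info do
  # Returns the default_manufacturer option from the cars section,
  # or :unknown if it was never set.
  def default_manufacturer(resource) do
    Spark.Dsl.Extension.get_opt(resource, [:cars], :default_manufacturer, :unknown)
  end
end

MyApp.Cars.Info.default_manufacturer(MyApp.MyResource)
# :unknown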

See the documentation for Spark.Dsl.Section and Spark.Dsl.Entity for more information.

Summary

Callbacks

@callback add_extensions() :: [module()]
@callback explain(map()) :: String.t() | nil
@callback module_imports() :: [module()]
@callback persisters() :: [module()]
@callback sections() :: [Spark.Dsl.section()]
@callback transformers() :: [module()]
@callback verifiers() :: [module()]
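
For example, the verifiers/0 callback lists verifier modules, which run after transformers and check (but do not modify) the final DSL state. A minimal sketch, assuming the Spark.Dsl.Verifier behaviour's verify/1 callback (the module name and rule are illustrative):

defmodule MyApp.Verifiers.RequireAtLeastOneCar do
  use Spark.Dsl.Verifier

  # Return :ok to accept the DSL state, or {:error, error} to fail compilation.
  def verify(dsl_state) do
    case Spark.Dsl.Transformer.get_entities(dsl_state, [:cars]) do
      [] ->
        {:error,
         Spark.Error.DslError.exception(
           path: [:cars],
           message: "at least one car must be defined"
         )}

      _cars ->
        :ok
    end
  end
end

Such a module would be passed to use Spark.Dsl.Extension via the verifiers option, alongside sections and transformers.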

Functions

doc(sections, depth \\ 1)
  See Spark.CheatSheet.doc/2.

doc_index(sections, depth \\ 1)
  See Spark.CheatSheet.doc_index/2.

expand_alias_no_require(ast, env)

expand_literals(ast, acc, fun)

fetch_opt(resource, path, value, configurable? \\ false)

get_entities(resource, path)
  Get the entities configured for a given section.

get_entity_dsl_patches(extensions, section_path)

get_opt(resource, path, value, default \\ nil, configurable? \\ false)
  Get an option value for a section at a given path.
  Checks to see if it has been overridden via configuration.

get_opt_config(resource, path, value)

get_persisted(resource, key, default \\ nil)
  Get a value that was persisted while transforming or compiling the resource, e.g. :primary_key.
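
A sketch of the round trip, assuming a transformer persists a value with Spark.Dsl.Transformer.persist/3 (the :car_count key is illustrative):

# In a transformer:
def transform(dsl_state) do
  {:ok, Spark.Dsl.Transformer.persist(dsl_state, :car_count, 2)}
end

# Reading it back from the compiled resource:
Spark.Dsl.Extension.get_persisted(MyApp.MyResource, :car_count, 0)
# 2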

get_recursive_entities_for_path(sections, list)

import_mods(mods) (macro)

run_transformers(mod, transformers, spark_dsl_config, env)

set_entity_opt(value, escaped_value, type, key) (macro)

set_section_opt(value, escaped_value, type, section_path, extension, field) (macro)

shuffle_opts_to_end(keyword, entity_args, opts)

spark_function_info(arg1)

unimport_mods(mods) (macro)