Spark.Dsl behaviour (spark v1.0.6)

The primary entry point for adding a DSL to a module.

To add a DSL to a module, add use Spark.Dsl, ...options. The options supported by use Spark.Dsl are listed below (a usage sketch follows the list):

  • :single_extension_kinds (list of atom/0) - The extension kinds that are allowed to have a single value. For example: [:data_layer]. The default value is [].

  • :many_extension_kinds (list of atom/0) - The extension kinds that can have multiple values, e.g. [notifiers: [Notifier1, Notifier2]]. The default value is [].

  • :untyped_extensions? (boolean/0) - Whether or not to support an extensions key which contains untyped extensions. The default value is true.

  • :default_extensions (keyword/0) - The extensions that are included by default, e.g. [data_layer: Default, notifiers: [Notifier1]]. If specified by the implementor, defaults for single extension kinds are overwritten, while defaults for many extension kinds are appended to. The default value is [].

  • :opt_schema (keyword/0) - A schema for additional options to accept when calling use YourSpark. The default value is [].
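
For illustration, here is a minimal sketch of a DSL entry-point module using these options. The module names (MyLibrary.Dsl, MyLibrary.Dsl.Core) and the :otp_app option are hypothetical, not part of Spark itself.

    defmodule MyLibrary.Dsl do
      use Spark.Dsl,
        single_extension_kinds: [:data_layer],
        many_extension_kinds: [:notifiers],
        default_extensions: [
          # :extensions is available because :untyped_extensions? defaults to true
          extensions: [MyLibrary.Dsl.Core]
        ],
        opt_schema: [
          otp_app: [type: :atom, doc: "The OTP app to read configuration from"]
        ]
    end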

See the callbacks defined in this module to augment the behavior/compilation of the module getting a DSL.

Schemas/Data Types

Spark DSLs use a superset of NimbleOptions for the schema that makes up sections/entities of the DSL. For more information, see Spark.OptionsHelpers.
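
As a sketch (not taken from the Spark docs), a section schema might look like the following; the :settings section and its option names are hypothetical, and the option types shown are standard NimbleOptions types.

    @settings %Spark.Dsl.Section{
      name: :settings,
      schema: [
        # each key maps to a NimbleOptions-style option definition
        timeout: [type: :pos_integer, default: 5_000, doc: "Request timeout in milliseconds"],
        adapter: [type: :atom, required: true, doc: "The adapter module to use"]
      ]
    }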

Summary

Callbacks

explain/2 - Validate/add options. Those options will be passed to handle_opts and handle_before_compile.

handle_before_compile/1 - Handle options in the context of the module, after all extensions have been processed. Must return a quote block.

handle_opts/1 - Handle options in the context of the module. Must return a quote block.

init/1 - Validate/add options. Those options will be passed to handle_opts and handle_before_compile.

Types

@type entity() :: %Spark.Dsl.Entity{
  args: term(),
  auto_set_fields: term(),
  deprecations: term(),
  describe: term(),
  docs: term(),
  entities: term(),
  examples: term(),
  hide: term(),
  identifier: term(),
  imports: term(),
  links: term(),
  modules: term(),
  name: term(),
  no_depend_modules: term(),
  recursive_as: term(),
  schema: term(),
  snippet: term(),
  target: term(),
  transform: term()
}
@type opts() :: Keyword.t()
@type section() :: %Spark.Dsl.Section{
  auto_set_fields: term(),
  deprecations: term(),
  describe: term(),
  docs: term(),
  entities: term(),
  examples: term(),
  imports: term(),
  links: term(),
  modules: term(),
  name: term(),
  no_depend_modules: term(),
  patchable?: term(),
  schema: term(),
  sections: term(),
  snippet: term()
}
@type t() :: map()

Callbacks

@callback explain(t(), opts()) :: String.t() | nil

Validate/add options. Those options will be passed to handle_opts and handle_before_compile.

@callback handle_before_compile(Keyword.t()) :: Macro.t()

Handle options in the context of the module, after all extensions have been processed. Must return a quote block.
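
As a rough sketch, an implementation in the module that calls use Spark.Dsl might inject code into the using module once all extensions have been processed. The spark_dsl?/0 function defined here is hypothetical.

    def handle_before_compile(_opts) do
      quote do
        # hypothetical convenience function added to the module using the DSL
        def spark_dsl?, do: true
      end
    end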

@callback handle_opts(Keyword.t()) :: Macro.t()

Handle options in the context of the module. Must return a quote block.

If you want to persist anything in the DSL persistence layer, use @persist {:key, value}. It can be called multiple times to persist multiple times.
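
For example, here is a sketch of a handle_opts/1 implementation that persists one of the use options; the :repo option and the :repo persistence key are hypothetical.

    def handle_opts(opts) do
      quote do
        # persist a value in the DSL persistence layer; @persist can be
        # used multiple times to persist multiple values
        @persist {:repo, unquote(opts[:repo])}
      end
    end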

@callback init(opts()) :: {:ok, opts()} | {:error, String.t() | term()}

Validate/add options. Those options will be passed to handle_opts and handle_before_compile.
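
As a sketch, an init/1 implementation that validates and augments the options passed to use; the :repo and :otp_app options and the :my_app atom are hypothetical.

    def init(opts) do
      if opts[:repo] do
        # add a default while keeping anything the caller passed
        {:ok, Keyword.put_new(opts, :otp_app, :my_app)}
      else
        {:error, "the :repo option is required"}
      end
    end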

Functions

handle_fragments(dsl_config, fragments)