# `Spark.Dsl.Extension`
[🔗](https://github.com/ash-project/spark/blob/v2.7.0/lib/spark/dsl/extension.ex#L5)

An extension to the Spark DSL.

This allows configuring custom DSL components, whose configurations
can then be read back. This guide is still a work in progress, but it should
serve as a decent example of what is possible. Open an issue on GitHub if
anything is unclear or not working as expected.

The example at the bottom shows how you might build a (not very contextually
relevant) DSL extension that would be used like so:

    defmodule MyApp.Vehicle do
      use Spark.Dsl
    end

    defmodule MyApp.MyResource do
      use MyApp.Vehicle,
        extensions: [MyApp.CarExtension]

      cars do
        car :ford, :focus, trim: :sedan
        car :toyota, :corolla
      end
    end

The extension:

    defmodule MyApp.CarExtension do
      @car_schema [
        make: [
          type: :atom,
          required: true,
          doc: "The make of the car"
        ],
        model: [
          type: :atom,
          required: true,
          doc: "The model of the car"
        ],
        type: [
          type: :atom,
          required: true,
          doc: "The type of the car",
          default: :sedan
        ]
      ]

      @car %Spark.Dsl.Entity{
        name: :car,
        describe: "Adds a car",
        examples: [
          "car :ford, :focus"
        ],
        target: MyApp.Car,
        args: [:make, :model],
        schema: @car_schema
      }

      @cars %Spark.Dsl.Section{
        name: :cars, # The DSL constructor will be `cars`
        describe: """
        Configure what cars are available.

        More, deeper explanation. Always have a short one liner explanation,
        an empty line, and then a longer explanation.
        """,
        entities: [
          @car # See `Spark.Dsl.Entity` docs
        ],
        schema: [
          default_manufacturer: [
            type: :atom,
            doc: "The default manufacturer"
          ]
        ]
      }

      use Spark.Dsl.Extension, sections: [@cars]
    end

Often, we need to perform complex validation, or validation that depends on the
configuration of other resources. Due to the nature of building compile-time DSLs,
there are many restrictions around that process. To support these complex use cases,
extensions can include `transformers`, `persisters`, and `verifiers`. These run in
the following order:

1. **Transformers** — run during compilation, in dependency order (controlled by `before?/1`
   and `after?/1`). Can read and modify any part of the DSL state. See `Spark.Dsl.Transformer`.

2. **Persisters** — run during compilation, always after all transformers. They implement the
   same `Spark.Dsl.Transformer` behaviour but are declared under `persisters:`. By convention
   they should only write to the persisted data map via `Spark.Dsl.Transformer.persist/3`.
   They support `before?`/`after?` ordering relative to other persisters, but any ordering
   declarations targeting transformers are silently ignored.

3. **Verifiers** — run after the module is compiled. Read-only. Do not create compile-time
   dependencies between modules, so they are safe to use when referencing other Spark-based
   modules. See `Spark.Dsl.Verifier`.

All three are provided as options to `use`:

    use Spark.Dsl.Extension,
      sections: [@cars],
      transformers: [MyApp.Transformers.ValidateNoOverlappingMakesAndModels],
      persisters: [MyApp.Persisters.CacheCarCount],
      verifiers: [MyApp.Verifiers.CheckManufacturerExists]

By default, the generated modules are named `__MODULE__.SectionName.EntityName`,
which could conflict with modules you define yourself. To avoid this, specify the
`module_prefix` option (for example `__MODULE__.Dsl`). The generated modules then
live at `__MODULE__.Dsl.SectionName.EntityName`, leaving you free to define the
entity struct at `__MODULE__.SectionName.EntityName` without conflicts.
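For example, assuming the extension module from earlier is `MyApp.CarExtension`, the `use` line might become (a sketch):

```elixir
use Spark.Dsl.Extension,
  sections: [@cars],
  module_prefix: MyApp.CarExtension.Dsl
```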

To expose the configuration of your DSL, define functions that use the
helpers like `get_entities/2` and `get_opt/3`. For example:

    defmodule MyApp.Cars do
      def cars(resource) do
        Spark.Dsl.Extension.get_entities(resource, [:cars])
      end
    end

    MyApp.Cars.cars(MyResource)
    # [%MyApp.Car{...}, %MyApp.Car{...}]
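Section options can be exposed the same way with `get_opt/3`. A sketch reading the `default_manufacturer` option from the example extension above (the helper name is hypothetical):

```elixir
defmodule MyApp.Cars.Info do
  # Hypothetical helper: reads the `default_manufacturer` option of the
  # `cars` section, returning `nil` when it was never set.
  def default_manufacturer(resource) do
    Spark.Dsl.Extension.get_opt(resource, [:cars], :default_manufacturer)
  end
end
```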

See the documentation for `Spark.Dsl.Section` and `Spark.Dsl.Entity` for more information.

# `t`

```elixir
@type t() :: module()
```

# `add_extensions`

```elixir
@callback add_extensions() :: [module()]
```

# `explain`
*optional* 

```elixir
@callback explain(map()) :: String.t() | nil
```

# `module_imports`

```elixir
@callback module_imports() :: [module()]
```

# `persisters`

```elixir
@callback persisters() :: [module()]
```

# `sections`

```elixir
@callback sections() :: [Spark.Dsl.section()]
```

# `transformers`

```elixir
@callback transformers() :: [module()]
```

# `verifiers`

```elixir
@callback verifiers() :: [module()]
```

# `default_section_config`

```elixir
@spec default_section_config() :: %{
  section_anno: :erl_anno.anno() | nil,
  entities: list(),
  opts: Keyword.t(),
  opts_anno: Keyword.t(:erl_anno.anno() | nil)
}
```

# `doc`

# `doc_index`

# `expand_alias`

# `expand_alias_no_require`

# `expand_literals`

# `fetch_opt`

# `fetch_persisted`

Fetch a value that was persisted while transforming or compiling the resource, e.g. `:primary_key`

# `get_entities`

Get the entities configured for a given section

# `get_entity_dsl_patches`

# `get_opt`

Get an option value for a section at a given path.

Checks to see if it has been overridden via configuration.

# `get_opt_anno`

```elixir
@spec get_opt_anno(map() | module(), atom() | [atom()], atom()) ::
  :erl_anno.anno() | nil
```

Get the annotation for a specific option in a section.

# `get_opt_config`

# `get_persisted`

Get a value that was persisted while transforming or compiling the resource, e.g. `:primary_key`
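For example, reading back a value that a transformer or persister stored with `Spark.Dsl.Transformer.persist/3` (the `:car_count` key here is hypothetical):

```elixir
Spark.Dsl.Extension.get_persisted(MyApp.MyResource, :car_count)
```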

# `get_recursive_entities_for_path`

# `get_section_anno`

```elixir
@spec get_section_anno(map() | module(), atom() | [atom()]) :: :erl_anno.anno() | nil
```

Get the annotation for a section at the given path.

# `macro_env_anno`

```elixir
@spec macro_env_anno(env :: Macro.Env.t(), do_block :: Macro.t()) :: :erl_anno.anno()
```

# `module_concat`

# `monotonic_number`

# `run_transformers`

# `set_docs`

# `shuffle_opts_to_end`

# `spark_function_info`

# `validate_and_transform_dsl_patches`

# `validate_and_transform_entity`

Validates and transforms an entity structure, ensuring nested entities are properly formatted.

This function recursively processes a DSL entity and its nested entities, converting
single entity values to lists where needed and validating the structure.

## Parameters

- `entity` - The entity to validate and transform
- `path` - The current path in the DSL structure (for error reporting)
- `module` - The module context (for error reporting)

## Returns

Returns the transformed entity with normalized nested entity structures.

---

*Consult [api-reference.md](api-reference.md) for complete listing*
