Spark.Dsl.Transformer behaviour (spark v1.0.6)
A transformer manipulates and/or validates the entire DSL state of a resource.
Its transform/1 takes a map, which is just the values/configurations at each point of the DSL. Don't manipulate it directly if possible; instead, use functions like get_entities/3 and replace_entity/5 to manipulate it.
Use the after?/1 and before?/1 callbacks to ensure that your transformer runs either before or after some other transformer.
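For example, a minimal transformer sketch might look like the following (the module names are hypothetical, and the transform here is a no-op):

defmodule MyApp.Transformers.AddTimestamps do
  use Spark.Dsl.Transformer

  # Run only after the (hypothetical) SetDefaults transformer has run.
  def after?(MyApp.Transformers.SetDefaults), do: true
  def after?(_), do: false

  # Receives the DSL state and returns it, optionally modified.
  def transform(dsl_state) do
    {:ok, dsl_state}
  end
end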
Return true in after_compile/0 to have the transformer run in an after_compile hook, but keep in mind that no modifications to the DSL structure will be retained, so there is no real point in modifying the DSL that you return.
Summary
Functions
Add a quoted expression to be evaluated in the DSL module's context.
Saves a value into the dsl config with the given key.
Callbacks
Functions
Add a quoted expression to be evaluated in the DSL module's context.
Use this extremely sparingly. It should almost never be necessary, unless building certain extensions that require the module in question to define a given function.
What you likely want is either one of the DSL introspection functions, like Spark.Dsl.Extension.get_entities/2 or Spark.Dsl.Extension.get_opt/5. If you simply want to store a custom value that can be retrieved easily, or cache some precomputed information onto the resource, use persist/3.
Provide the dsl state, bindings that should be unquote-able, and the quoted block to evaluate in the module. For example, if we wanted to support a resource.primary_key() function that would return the primary key (this is unnecessary, just an example), we might do this:
fields = the_primary_key_fields

dsl_state =
  Transformer.eval(
    dsl_state,
    [fields: fields],
    quote do
      def primary_key() do
        unquote(fields)
      end
    end
  )
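After compilation, the module that used the DSL would then define primary_key/0, returning the fields bound above.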
Saves a value into the dsl config with the given key.
This can be used to precompute some information and cache it onto the resource, or simply store a computed value. It can later be retrieved with Spark.Dsl.Extension.get_persisted/3.
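As a rough sketch (the transformer, resource module, and key names here are hypothetical), a transformer might persist a computed value and read it back later:

# Inside a transformer's transform/1 callback:
def transform(dsl_state) do
  {:ok, Spark.Dsl.Transformer.persist(dsl_state, :primary_key_fields, [:id])}
end

# Elsewhere, once the resource module is compiled:
Spark.Dsl.Extension.get_persisted(MyApp.Resource, :primary_key_fields, [])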