Module cuttlefish_variable

Data Types

variable()

variable() = [string()]

Function Index

extract_first_match/2
filter_by_prefix/2: For Proplist, return the subset of the proplist whose keys start with the given prefix.
format/1: Formats a variable back into its dot-separated version.
fuzzy_matches/2: Given a variable definition "a.b.$c.d", what are the possible values for $c among the keys in Conf = [{Key, Value}]?
is_fuzzy_match/2: Could this fixed variable be a match for the variable definition? e.g. could a.b.c.d =:= a.b.$var.d?
replace_match/2: Replaces the $var in the variable with Sub.
split_on_match/1: Splits a variable definition into Prefix (the segments before the $var), Var (the $var itself), and Suffix (the segments after the $var).
tokenize/1: Like string:tokens(Key, "."), but an escaped dot (\\.) is not treated as a separator.

Function Details

extract_first_match/2

extract_first_match(VariableDef::variable(), Variable::variable()) -> nomatch | [{string(), string()}]
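A usage sketch, inferred from the spec above (the segment names are illustrative, and the exact binding format shown in the comments is an assumption): matching a concrete variable against a definition containing a $-placeholder yields the substitutions, or nomatch.

```erlang
%% Illustrative only; the "riak"/"$bucket" names are made up for this sketch.
cuttlefish_variable:extract_first_match(
    ["riak", "$bucket", "n_val"],   %% VariableDef with a $-placeholder
    ["riak", "photos", "n_val"]).
%% presumably something like [{"$bucket", "photos"}]

cuttlefish_variable:extract_first_match(
    ["riak", "$bucket", "n_val"],
    ["other", "photos", "n_val"]).
%% nomatch -- the fixed segments disagree
```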

filter_by_prefix/2

filter_by_prefix(T::string() | [string()], Proplist::[{[string()], any()}]) -> [{[string()], any()}]

For Proplist, return the subset of the proplist whose keys start with the given prefix, supplied either as a dot-separated string or as an already-tokenized variable.
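A hedged sketch of a call (the keys and values are invented for illustration):

```erlang
%% Illustrative proplist; only entries whose key starts with ["riak"]
%% should survive the filter.
Conf = [{["riak", "ring_size"], 64},
        {["riak", "anti_entropy"], on},
        {["log", "level"], info}],
cuttlefish_variable:filter_by_prefix("riak", Conf).
%% expected to keep the first two entries and drop {["log","level"], info}
```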

format/1

format(Key::variable()) -> string()

Formats a variable back into its dot-separated version. Inverse of tokenize/1.
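A minimal round-trip sketch:

```erlang
%% Joins the segments with dots; the inverse of tokenize/1.
cuttlefish_variable:format(["a", "b", "c"]).
%% "a.b.c"
```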

fuzzy_matches/2

fuzzy_matches(VariableDef::variable(), Conf::cuttlefish_conf:conf()) -> [{string(), any()}]

Given a variable definition such as "a.b.$c.d", what are the possible values for $c among the keys in Conf = [{Key, Value}]?
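A sketch of the idea (conf entries are invented; the exact shape of the result shown in the comment is an assumption based on the [{string(), any()}] return type):

```erlang
%% Two keys match the definition ["a","b","$c","d"], binding $c to "x" and "y";
%% the third key differs in a fixed segment and should not match.
Conf = [{["a", "b", "x", "d"], 1},
        {["a", "b", "y", "d"], 2},
        {["a", "z", "q", "d"], 3}],
cuttlefish_variable:fuzzy_matches(["a", "b", "$c", "d"], Conf).
%% presumably something like [{"$c", "x"}, {"$c", "y"}]
```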

is_fuzzy_match/2

is_fuzzy_match(Variable::variable(), VariableDef::variable()) -> boolean()

Could this fixed Variable be a match for the variable definition VariableDef? e.g. could a.b.c.d =:= a.b.$var.d?
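A sketch following the spec's argument order (the concrete variable first, the definition second):

```erlang
%% ["a","b","c","d"] fits ["a","b","$var","d"] with $var bound to "c".
cuttlefish_variable:is_fuzzy_match(["a", "b", "c", "d"], ["a", "b", "$var", "d"]).
%% true
%% A fixed segment mismatch ("x" vs "b") should fail:
cuttlefish_variable:is_fuzzy_match(["a", "b", "c", "d"], ["a", "x", "$var", "d"]).
%% false
```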

replace_match/2

replace_match(Variable::variable(), Sub::string()) -> variable()

Replaces the $var in Variable with Sub.
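A minimal sketch:

```erlang
%% Substitutes the $-placeholder segment with the given string.
cuttlefish_variable:replace_match(["a", "b", "$var", "d"], "c").
%% ["a", "b", "c", "d"]
```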

split_on_match/1

split_on_match(Variable::variable()) -> {variable(), string(), variable()}

Splits a variable definition into:
* Prefix: the segments before the $var
* Var: the $var itself
* Suffix: the segments after the $var
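A sketch consistent with the {variable(), string(), variable()} return type:

```erlang
%% Everything before the placeholder, the placeholder, everything after.
cuttlefish_variable:split_on_match(["a", "b", "$var", "d", "e"]).
%% {["a", "b"], "$var", ["d", "e"]}
```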

tokenize/1

tokenize(Key::string()) -> variable()

Like string:tokens(Key, "."), but if a dot was escaped, i.e. \\., don't tokenize at that dot.
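A sketch of both cases (note that the Erlang string literal "a.b\\.c" denotes the text a.b\.c):

```erlang
%% Plain dots separate segments:
cuttlefish_variable:tokenize("a.b.c").
%% ["a", "b", "c"]
%% An escaped dot stays inside its segment:
cuttlefish_variable:tokenize("a.b\\.c").
%% ["a", "b.c"]
```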


Generated by EDoc