Spf.Lexer (Spfcheck v0.10.0)
Lexer for SPF strings and explain-strings.
See the collected ABNF.
Summary
Types
- q/0: qualifier = ?+ / ?- / ?~ / ??
- range/0: denotes a token's start..stop-slice in the input string.
- result/0: an ok/error tuple produced by lexing some input.
- token/0: a token represented as a tuple: {type, list, range}.
- type/0: the token's type.
Types
Specs
A lexer is a function that takes a binary and a lexer context, and returns a result/0.
Specs
q() :: 43 | 45 | 126 | 63
qualifier = ?+ / ?- / ?~ / ??
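The integers in the q() spec are simply the codepoints of the four qualifier characters; a quick check in plain Elixir (no library needed):

```elixir
# Elixir character literals evaluate to their codepoints,
# matching the q() spec: 43 | 45 | 126 | 63
IO.inspect([?+, ?-, ?~, ??])
# → [43, 45, 126, 63]
```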
Specs
range() :: Range.t()
A range denotes a token's start..stop-slice in the input string.
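Because a token's range is an ordinary Range.t(), it can be used directly to slice the lexed input back out; a minimal sketch in plain Elixir, using the input from the tokenize_exp example further below:

```elixir
# The :literal token "timestamp" carries range 0..8,
# which slices the matched text out of the original input:
input = "timestamp %{t}"
IO.inspect(String.slice(input, 0..8))
# → "timestamp"
```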
Specs
An ok/error tuple produced by lexing some input
Specs
A token represented as a tuple: {type, list, range}, where:
- type is an atom which denotes the token's type/0
- list may be empty or contain one or more values (including subtokens)
- range is the start..stop-slice in the input string
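Token tuples can be taken apart with ordinary pattern matching; a small sketch, with values copied from the tokenize_spf example further below:

```elixir
# Token shape: {type, list, range}
token = {:a, [?+, {:expand, [?d, -1, false, ["."]], 2..5}, {:cidr, [32, 128], 1..0//-1}], 0..5}

{type, list, range} = token
IO.inspect(type)      # → :a
IO.inspect(range)     # → 0..5
# The first list element is the qualifier codepoint:
IO.inspect(hd(list))  # → 43 (i.e. ?+)
```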
Specs
type() :: :a | :all | :exists | :exp | :include | :ip4 | :ip6 | :mx | :ptr | :redirect | :unknown | :version | :whitespace | :error | :exp_str | :cidr | :expand | :literal
The token's type.
There are several classes of tokens:
- the version: :version
- a mechanism: :a, :all, :exists, :exp, :include, :ip4, :ip6, :mx, :ptr
- a modifier: :exp, :redirect
- an explain string: :exp_str
- an unknown modifier: :unknown
- a syntax error: :error
- whitespace: :whitespace
- a subtoken: :expand, :literal, :cidr
Subtokens may appear as part of another token's value.
:exp_str is produced by tokenizing the explain string. After expanding the domain-spec of the exp=domain-spec modifier into a domain name, that domain's TXT RR is retrieved and tokenized for later expansion into an explanation string. This only happens when the SPF verdict is :fail and the exp modifier is present and has a valid domain name.
The :whitespace token matches both spaces and tab characters, so the lexer can warn when multiple spaces and/or tab characters are used. Using a tab character is technically a syntax error, but this library only warns about it.
The :error token is tried as a last resort and matches any non-space sequence. When matched, it means the SPF string has a syntax error.
Functions
Specs
Returns a lexer result/0 after consuming an explain-string.
An explain-string is the TXT RR value of the domain specified by the domain specification of the exp-modifier, and is basically a series of macro-strings and spaces. This is the only time the c, r, and t macros may be used.
The lexer produces an :error token for character sequences it doesn't know.
Example
iex> {:ok, tokens, _rest, _map} = Spf.Lexer.tokenize_exp("timestamp %{t}")
iex> tokens
[
{:exp_str,
[{:literal, ["timestamp"], 0..8},
{:whitespace, [" "], 9..9},
{:expand, [116, -1, false, ["."]], 10..13}
], 0..13}
]
Specs
Returns a lexer result/0 after consuming an SPF string.
The SPF string can be a full SPF TXT string or a partial string.
The lexer produces a list of the tokens found, including a catch-all :error token for character sequences that were not recognized.
Example
iex> {:ok, tokens, _rest, _map} = Spf.Lexer.tokenize_spf("a:%{d}")
iex> tokens
[{:a, [43, {:expand, [100, -1, false, ["."]], 2..5}, {:cidr, [32, 128], 1..0//-1}], 0..5}]
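In the :expand subtokens of both examples, the first list element appears to be the macro letter stored as a codepoint: 100 here corresponds to ?d (from %{d}), and 116 in the tokenize_exp example corresponds to ?t (from %{t}). A quick standard-library check of that correspondence:

```elixir
# Macro letters appear in :expand tokens as codepoints
IO.inspect({?d, ?t})  # → {100, 116}
# Converting a codepoint back to its letter:
IO.inspect(<<?d>>)    # → "d"
```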