Absinthe.Lexer (absinthe v1.7.0)
Summary
Functions
Parses the given binary as do_tokenize.
Functions
do_tokenize(binary, opts \\ [])

Specs
do_tokenize(binary(), keyword()) ::
  {:ok, [term()], rest, context, line, byte_offset}
  | {:error, reason, rest, context, line, byte_offset}
  when line: {pos_integer(), byte_offset},
       byte_offset: pos_integer(),
       rest: binary(),
       reason: String.t(),
       context: map()
Parses the given binary as do_tokenize.

Returns {:ok, [token], rest, context, position, byte_offset} or
{:error, reason, rest, context, line, byte_offset}, where position
describes the location of the do_tokenize (start position) as
{line, offset_to_start_of_line}.

The column where the error occurred can be inferred from
byte_offset - offset_to_start_of_line.
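
For illustration, a minimal sketch of handling both result shapes and deriving the error column as described above; the query string and variable names are placeholders, not taken from the Absinthe docs:

query = "{ user { id name } }"

case Absinthe.Lexer.do_tokenize(query) do
  {:ok, tokens, _rest, _context, _position, _byte_offset} ->
    tokens

  {:error, reason, _rest, _context, {line, offset_to_start_of_line}, byte_offset} ->
    # Column of the failure, using the formula given above.
    column = byte_offset - offset_to_start_of_line
    {:lex_error, reason, {line, column}}
end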
Options

:byte_offset - the byte offset for the whole binary, defaults to 0
:line - the line and the byte offset into that line, defaults to {1, byte_offset}
:context - the initial context value. It will be converted to a map
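
A hedged sketch of passing these options, for example when lexing a fragment that begins partway through a larger source file; the offsets and the context map below are illustrative values only:

fragment_source = "fragment Basic on User { id }"

Absinthe.Lexer.do_tokenize(fragment_source,
  byte_offset: 120,
  line: {4, 100},
  context: %{source_name: "schema.graphql"}
)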
tokenize(input)

Specs
tokenize(binary()) :: {:ok, [any()]} | {:error, binary(), {integer(), non_neg_integer()}}
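
A minimal usage sketch matching this spec; the input string is a placeholder, and reading the location pair as {line, column} is an interpretation of the spec shape rather than something stated here:

case Absinthe.Lexer.tokenize("{ hello }") do
  {:ok, tokens} ->
    # A flat list of lexer tokens, ready to hand to the parser.
    tokens

  {:error, rest, {line, column}} ->
    # `rest` is the unconsumed input at the point of failure.
    {:error, rest, line, column}
end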