SlackBot.Command (slack_bot_ws v0.1.0-rc.2)
Tokenization helpers for slash commands.
This module provides the lexer that powers SlackBot's slash command grammar DSL. It normalizes whitespace, respects quoted strings, and produces consistent tokens that the grammar matcher consumes.
When to Use This Module
You typically don't need to call these functions directly. The slash/2 macro
handles tokenization automatically. However, this module is useful when:
- Testing your slash grammar patterns
- Building custom command parsers outside the DSL
- Debugging tokenization issues
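For the testing use case, lex/1 can be asserted on directly. A minimal ExUnit sketch (the test module name is illustrative; the assertion mirrors the doctest examples below):

```elixir
defmodule MyBot.CommandLexTest do
  use ExUnit.Case, async: true

  test "quoted strings collapse into a single token" do
    assert SlackBot.Command.lex(~s(/list "John Smith" --active)) ==
             %{command: "list", tokens: ["John Smith", "--active"]}
  end
end
```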
Tokenization Behavior
The lexer handles:
- Whitespace normalization - multiple spaces collapse to single separators
- Quoted strings - "multiple words" become a single token
- Command extraction - a leading /command is separated from the arguments
Examples
iex> SlackBot.Command.lex("/deploy api production")
%{command: "deploy", tokens: ["api", "production"]}
iex> SlackBot.Command.lex("/list \"John Smith\" --active")
%{command: "list", tokens: ["John Smith", "--active"]}
iex> SlackBot.Command.lex(" extra spaces ")
%{command: nil, tokens: ["extra", "spaces"]}
Grammar DSL
The slash grammar DSL uses these tokens under the hood:
slash "/deploy" do
value :service # Matches one token
literal "canary" # Matches the literal word "canary"
repeat do
literal "env"
value :environments
end
handle payload, _ctx do
# payload["parsed"] contains the matched values
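    # For example, a hypothetical input "/deploy api canary env staging"
    # matching the grammar above might yield (shape assumed, not confirmed
    # by this page):
    #   %{"service" => "api", "environments" => ["staging"]}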
end
end
See Also
- Slash Grammar Guide
- The slash/2 macro for defining command grammars
- Example app: examples/basic_bot/
Summary
Functions
Parses the given binary with the do_lex parser.
Tokenizes slash command text, returning the optional command literal and a list of tokens.
Functions
@spec do_lex(binary(), keyword()) ::
        {:ok, [term()], rest, context, line, byte_offset}
        | {:error, reason, rest, context, line, byte_offset}
      when line: {pos_integer(), byte_offset},
           byte_offset: non_neg_integer(),
           rest: binary(),
           reason: String.t(),
           context: map()
Parses the given binary with the do_lex parser.
Returns {:ok, [token], rest, context, position, byte_offset} or
{:error, reason, rest, context, line, byte_offset}, where position
describes the location where do_lex started parsing, as {line, offset_to_start_of_line}.
The column where the error occurred can be inferred from byte_offset - offset_to_start_of_line.
Options
- :byte_offset - the byte offset for the whole binary, defaults to 0
- :line - the line and the byte offset into that line, defaults to {1, byte_offset}
- :context - the initial context value. It will be converted to a map
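A usage sketch based on the return shapes documented above (the input string and the exact tokens produced are assumptions; only the tuple shapes come from this page):

```elixir
case SlackBot.Command.do_lex("/deploy api", line: {1, 0}, byte_offset: 0) do
  {:ok, tokens, rest, _context, _position, _byte_offset} ->
    # tokens is the list of lexed terms; rest is any unconsumed input
    {tokens, rest}

  {:error, reason, _rest, _context, {line, line_offset}, byte_offset} ->
    # the error column can be recovered as byte_offset - line_offset
    {:error, reason, line, byte_offset - line_offset}
end
```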
Tokenizes slash command text, returning the optional command literal and a list of tokens.