Markov (markov v1.3.0)
A trainable Markov-chain text generator. Next-token prediction uses the two previous tokens.
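A minimal end-to-end sketch using only the functions documented below; with a single training sentence the chain has only one path, so generation reproduces it:

```elixir
# Train a chain on one sentence, then generate text from it.
chain = %Markov{} |> Markov.train("hello, world!")
Markov.generate_text(chain)
# With only this one training sample, the output is "hello, world!"
```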
Summary
Functions
Adjusts the probabilities of a batch of connections
Adjusts the probability of one connection
Shifts probabilities if the model has the corresponding flag set
Enables token sanitization on a chain.
When this mode is enabled, the chain doesn't distinguish between similar textual tokens.
This mode can't be disabled once it has been enabled.
Removes a token from all generation paths the chain could produce.
Generates a string of text using the chain
Generates a list of tokens using the chain
Predicts the next state of a chain given the current state.
Trains the chain using text or a list of tokens.
Functions
adjust_batch_probs(params)
Adjusts the probabilities of a batch of connections
adjust_one_prob(param_tensor)
Adjusts the probability of one connection
cond_shift_probs(links, arg2)
Specs
cond_shift_probs(%{required([any()]) => any()}, %Markov{
links: term(),
sanitize_tokens: term(),
shift: term()
}) :: %{required([any()]) => any()}
Shifts probabilities if the model has the corresponding flag set
enable_token_sanitization(chain)
Specs
enable_token_sanitization(%Markov{
links: term(),
sanitize_tokens: term(),
shift: term()
}) ::
%Markov{links: term(), sanitize_tokens: term(), shift: term()}
Enables token sanitization on a chain.
When this mode is enabled, the chain doesn't distinguish between similar textual tokens.
This mode can't be disabled once it has been enabled.
Returns the modified chain.
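Example
This function has no example above, so the following is a hypothetical usage sketch; the exact rule for which tokens count as "similar" is implementation-defined. Sanitization is enabled before training so that matching happens on sanitized tokens from the start:

```elixir
# Enable sanitization (irreversible), then train; similar textual
# tokens are treated as one during lookup.
chain =
  %Markov{}
  |> Markov.enable_token_sanitization()
  |> Markov.train("Hello, world!")
```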
forget_token(chain, token)
Specs
forget_token(
%Markov{links: term(), sanitize_tokens: term(), shift: term()},
any()
) :: %Markov{
links: term(),
sanitize_tokens: term(),
shift: term()
}
Removes a token from all generation paths the chain could produce.
Returns the modified chain.
Example
iex> %Markov{} |>
...> Markov.train("a b c") |>
...> Markov.forget_token("b") |>
...> Markov.generate_text()
"a"
generate_text(chain, state \\ [:start, :start])
Specs
generate_text(%Markov{links: term(), sanitize_tokens: term(), shift: term()}, [
any()
]) ::
String.t()
Generates a string of text using the chain
Optionally assumes the previous two states were [state1, state2]=state.
Returns the generated text.
Example
iex> %Markov{} |> Markov.train("hello, world!") |> Markov.generate_text()
"hello, world!"
iex> %Markov{} |> Markov.train("hello, world!") |>
...> Markov.generate_text([:start, "hello,"])
"world!"
generate_tokens(chain, acc \\ [], state \\ [:start, :start], limit \\ 100)
Specs
generate_tokens(
%Markov{links: term(), sanitize_tokens: term(), shift: term()},
[any()],
[any()],
integer()
) :: [any()]
Generates a list of tokens using the chain
Optionally prepends acc to the result and assumes the previous
two states were [state1, state2] = state. The length of
the resulting token list is limited by limit.
Returns the generated list.
Example
iex> %Markov{} |> Markov.train([:a, :b, :c]) |> Markov.generate_tokens()
[:a, :b, :c]
iex> %Markov{} |> Markov.train([:a, :b, :c]) |>
...> Markov.generate_tokens([], [:a, :b])
[:c]
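The fourth argument caps the output length (default 100). A hedged sketch, assuming the limit is applied to the whole returned list as documented (no expected output shown, since truncation semantics may vary):

```elixir
# Cap the generated token list at 2 tokens, starting from the
# default [:start, :start] state with an empty accumulator.
%Markov{}
|> Markov.train([:a, :b, :c])
|> Markov.generate_tokens([], [:start, :start], 2)
```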
next_state(chain, current)
Specs
Predicts the next state of a chain given the current state.
Note: the current state consists of two tokens.
Returns the next predicted state.
Example
iex> %Markov{} |> Markov.train("1 2 3 4 5") |> Markov.next_state(["2", "3"])
"4"
iex> %Markov{} |> Markov.train("1 2") |> Markov.next_state([:start, :start])
"1"
iex> %Markov{} |> Markov.train([:a, :b, :c]) |> Markov.next_state([:a, :b])
:c
train(chain, text)
Specs
train(
%Markov{links: term(), sanitize_tokens: term(), shift: term()},
String.t() | [any()]
) ::
%Markov{links: term(), sanitize_tokens: term(), shift: term()}
Trains the chain using text or a list of tokens.
Returns the modified chain.
Example
chain = %Markov{}
|> Markov.train("hello, world!")
|> Markov.train("example string number two")
|> Markov.train("hello, Elixir!")
|> Markov.train("fourth string")
chain = %Markov{}
|> Markov.train(["individual tokens", :can_be, 'arbitrary terms'])