Markov (markov v1.0.0)
A trained text generator based on Markov chains. Next-token prediction uses the two previous tokens (a second-order chain).
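To illustrate the idea (this is a hypothetical sketch of the concept, not the library's internal representation), a second-order chain can be modeled as a map from a pair of previous tokens to the list of tokens that followed them during training. The module name `ChainSketch` and its functions are illustrative only:

```elixir
defmodule ChainSketch do
  # Build a map of {token1, token2} => [next_token, ...] from a token list.
  # The input is padded with :start markers and terminated with :end,
  # mirroring the [:start, :start] initial state used by this library.
  def train(tokens) do
    ([:start, :start] ++ tokens ++ [:end])
    |> Enum.chunk_every(3, 1, :discard)
    |> Enum.reduce(%{}, fn [a, b, c], acc ->
      Map.update(acc, {a, b}, [c], &[c | &1])
    end)
  end

  # Predict a next token given the two previous ones, picking uniformly
  # among the observed continuations.
  def next(chain, {a, b}) do
    case chain[{a, b}] do
      nil -> nil
      candidates -> Enum.random(candidates)
    end
  end
end

chain = ChainSketch.train(["hello,", "world!"])
ChainSketch.next(chain, {:start, :start})
# => "hello," (the only observed continuation of the initial state)
```

Text generation then amounts to repeatedly calling `next/2` and shifting the two-token window until `:end` is reached.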
Summary
Functions
generate_text/2: Generates a string of text using the chain.
generate_tokens/3: Generates a list of tokens using the chain.
next_state/2: Predicts the next state of a chain given the current state.
train/2: Trains the chain using text or a list of tokens.
Functions
generate_text(chain, state \\ [:start, :start])
Generates a string of text using the chain.
Optionally takes the previous two states as [state1, state2] = state.
Returns the generated text.
Example
iex> %Markov{} |> Markov.train("hello, world!") |> Markov.generate_text()
"hello, world!"
iex> %Markov{} |> Markov.train("hello, world!") |>
...> Markov.generate_text([:start, "hello,"])
"world!"
generate_tokens(chain, acc \\ [], state \\ [:start, :start])
Generates a list of tokens using the chain.
Optionally prepends acc to the result and takes the previous
two states as [state1, state2] = state.
Returns the generated list.
Example
iex> %Markov{} |> Markov.train([:a, :b, :c]) |> Markov.generate_tokens()
[:a, :b, :c]
iex> %Markov{} |> Markov.train([:a, :b, :c]) |>
...> Markov.generate_tokens([], [:a, :b])
[:c]
next_state(chain, current)
Predicts the next state of a chain given the current state.
Note: the current state consists of two tokens.
Returns the next predicted state.
Example
iex> %Markov{} |> Markov.train("1 2 3 4 5") |> Markov.next_state(["2", "3"])
"4"
iex> %Markov{} |> Markov.train("1 2") |> Markov.next_state([:start, :start])
"1"
iex> %Markov{} |> Markov.train([:a, :b, :c]) |> Markov.next_state([:a, :b])
:c
train(chain, text)
Trains the chain using text or a list of tokens.
Returns the modified chain.
Example
chain = %Markov{}
|> Markov.train("hello, world!")
|> Markov.train("example string number two")
|> Markov.train("hello, Elixir!")
|> Markov.train("fourth string")
chain = %Markov{}
|> Markov.train(["individual tokens", :can_be, 'arbitrary terms'])