Markov (markov v0.1.0)

A Markov-chain-based text generator that is trained on input text. Next-token prediction uses the two previous tokens.
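
For example, a chain can be trained on a few strings and then used to generate text. A minimal usage sketch built from the train/2 and generate_text/1 functions documented below (the output shown is only one possible result, since generation is probabilistic):

    chain =
      %Markov{}
      |> Markov.train("the quick brown fox")
      |> Markov.train("the quick red fox")

    Markov.generate_text(chain)
    # may return "the quick brown fox", "the quick red fox", or a mix of the two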

Summary

Functions

generate_text(chain, acc \\ "", state \\ [:start, :start])
Generates a string of text using the chain.

next_state(chain, current)
Predicts the next state of a chain given the current state.

train(chain, text)
Trains the chain using text.

Functions


generate_text(chain, acc \\ "", state \\ [:start, :start])

Specs

generate_text(%Markov{links: term()}, acc :: String.t(), any()) :: String.t()

Generates a string of text using the chain.

Optionally prepends acc to the result and assumes the two previous states were [state1, state2] = state.

Returns the generated text.

Example

iex> %Markov{} |> Markov.train("hello, world!") |> Markov.generate_text()
"hello, world!"

iex> %Markov{} |> Markov.train("hello, world!") |>
...> Markov.generate_text("", [:start, "hello,"])
"world!"

next_state(chain, current)

Specs

next_state(%Markov{links: term()}, any()) :: any()

Predicts the next state of a chain given the current state.

Note: the current state consists of two tokens.

Returns the next predicted state.

Example

iex> %Markov{} |> Markov.train("1 2 3 4 5") |> Markov.next_state(["2", "3"])
"4"

iex> %Markov{} |> Markov.train("1 2") |> Markov.next_state([:start, :start])
"1"

train(chain, text)

Specs

train(%Markov{links: term()}, String.t()) :: %Markov{links: term()}

Trains the chain using text.

Returns the modified chain.

Example

chain =
  %Markov{}
  |> Markov.train("hello, world!")
  |> Markov.train("example string number two")
  |> Markov.train("hello, Elixir!")
  |> Markov.train("fourth string")