Gralkor.Distill (gralkor_ex v2.1.3)


Render a list of conversation turns into an episode body suitable for ingesting into the knowledge graph.

Each turn that contains a "behaviour" message gets distilled by the configured LLM into a first-person past-tense summary and rendered as {agent_name}: (behaviour: {summary}) before the assistant text. Turns without behaviour skip the LLM entirely.
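For instance, a distilled turn might render like this (agent name and content are hypothetical, matching the example later on this page):

```
Susu: (behaviour: I weighed the user's tone before answering.)
Susu: hello, how can I help?
```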

Distillation per turn is best-effort: any failure (LLM error, exception) drops the behaviour line for that turn and preserves the user/assistant text — the surrounding turns still produce output.
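The per-turn fallback can be sketched as a small helper (a sketch only; the function name and return shapes are assumed, not the module's actual internals):

```elixir
# Best-effort distillation for one behaviour message.
# Any {:error, _} result or raised exception yields :skip,
# so only that turn's behaviour line is dropped.
defp distill_behaviour(text, distill_fn) do
  case distill_fn.(text) do
    {:ok, summary} -> {:ok, summary}
    {:error, _reason} -> :skip
  end
rescue
  _exception -> :skip
end
```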

Turns with behaviour are distilled in parallel via Task.async_stream.
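A minimal sketch of that parallel pass, assuming a hypothetical `distill_turn/2` helper that renders one turn:

```elixir
# Distill turns concurrently while preserving input order;
# a timed-out or crashed task contributes an empty string
# instead of failing the whole transcript.
turns
|> Task.async_stream(fn turn -> distill_turn(turn, distill_fn) end,
  ordered: true,
  on_timeout: :kill_task
)
|> Enum.map(fn
  {:ok, rendered} -> rendered
  {:exit, _reason} -> ""
end)
|> Enum.join("\n")
```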

See ex-format-transcript in gralkor/TEST_TREES.md.

Summary

Functions

Schema for the structured-output response the LLM returns when distilling a behaviour-containing turn.

Render turns (a list of turns; each turn a list of canonical Messages) into the episode body string.

Types

distill_fn()

@type distill_fn() :: (String.t() -> {:ok, String.t()} | {:error, term()}) | nil
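A conforming distill_fn is any one-arity function with this contract. For illustration, one backed by a hypothetical LLM client (the `MyApp.LLM.complete/1` call is an assumption, not part of gralkor_ex):

```elixir
# Takes the behaviour text, returns {:ok, summary} or {:error, term}.
distill_fn = fn behaviour_text ->
  case MyApp.LLM.complete(behaviour_text) do
    {:ok, summary} when is_binary(summary) -> {:ok, summary}
    other -> {:error, {:unexpected_response, other}}
  end
end
```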

turn()

@type turn() :: [Gralkor.Message.t()]
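A turn is just a list of canonical messages. A hypothetical example (the Gralkor.Message field names here are assumed for illustration):

```elixir
turn = [
  %Gralkor.Message{role: :user, content: "hello"},
  %Gralkor.Message{role: :behaviour, content: "internal deliberation notes"},
  %Gralkor.Message{role: :assistant, content: "hi there"}
]
```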

Functions

distill_schema()

@spec distill_schema() :: keyword()

Schema for the structured-output response the LLM returns when distilling a behaviour-containing turn.
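The returned keyword list might plausibly look like the following; the exact keys depend on the structured-output library in use and are assumptions here:

```elixir
# Hypothetical shape of the structured-output schema.
[
  type: :object,
  properties: [
    summary: [
      type: :string,
      description: "First-person past-tense summary of the behaviour"
    ]
  ],
  required: [:summary]
]
```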

format_transcript(turns, distill_fn, agent_name)

@spec format_transcript([turn()], distill_fn(), String.t()) :: String.t()

Render turns (a list of turns; each turn a list of canonical Messages) into the episode body string.

distill_fn is the LLM caller used to summarise behaviour messages. Pass nil to skip distillation entirely (behaviour lines are silently omitted).

agent_name is required and must be non-blank; it labels assistant and behaviour lines (e.g. "Susu: hello", "Susu: (behaviour: thought)").
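Putting it together, a typical call looks like this (the `turns` and `distill_fn` bindings are assumed to be built as shown earlier on this page):

```elixir
# With an LLM caller: behaviour messages are summarised.
body = Gralkor.Distill.format_transcript(turns, distill_fn, "Susu")

# Without one: pass nil and behaviour lines are silently omitted.
body_plain = Gralkor.Distill.format_transcript(turns, nil, "Susu")
```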