Arcana.Agent.Answerer.LLM (Arcana v1.3.3)
LLM-based answer generator.
Uses the configured LLM to generate answers from retrieved context.
This is the default answerer used by Agent.answer/2.
## Usage
```elixir
# With Agent pipeline (uses ctx.llm automatically)
ctx
|> Agent.search()
|> Agent.answer()

# Directly
{:ok, answer} = Arcana.Agent.Answerer.LLM.answer(
  "What is Elixir?",
  chunks,
  llm: &my_llm/1
)
```

## Custom Prompts
```elixir
Agent.answer(ctx,
  prompt: fn question, chunks ->
    # Join chunk texts with blank lines so the question and the
    # retrieved context don't run together in the final prompt.
    context = Enum.map_join(chunks, "\n\n", & &1.text)
    "Answer: " <> question <> "\nContext: " <> context
  end
)
```
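The `llm:` option shown above takes a one-arity function. A minimal sketch of such a callback, assuming it receives the fully built prompt string and returns the completion as a plain string (`MyApp.LLM` is a hypothetical module name; verify the exact return contract against the Arcana docs for your version):

```elixir
defmodule MyApp.LLM do
  # Hypothetical callback for the `llm:` option. This sketch assumes
  # Arcana passes the fully built prompt and expects the completion
  # back as a string.
  def call(prompt) when is_binary(prompt) do
    # A real implementation would call a model API here and return
    # the response text. This stub just echoes the prompt size.
    "(stub) received #{byte_size(prompt)}-byte prompt"
  end
end
```

It can then be wired in with the capture syntax, e.g. `Agent.answer(ctx, llm: &MyApp.LLM.call/1)`.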