# `Arcana.Agent.Answerer.LLM`
[🔗](https://github.com/georgeguimaraes/arcana/blob/main/lib/arcana/agent/answerer/llm.ex#L1)

LLM-based answer generator.

Uses the configured LLM to generate answers from retrieved context.
This is the default answerer used by `Agent.answer/2`.

## Usage

    # With Agent pipeline (uses ctx.llm automatically)
    ctx
    |> Agent.search()
    |> Agent.answer()

    # Directly
    {:ok, answer} = Arcana.Agent.Answerer.LLM.answer(
      "What is Elixir?",
      chunks,
      llm: &my_llm/1
    )
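
The `llm:` option above takes a one-arity function such as `&my_llm/1`. As a minimal sketch, it could receive the rendered prompt string and return the model's reply (the exact return shape and the `MyApp.LLMClient.complete/1` helper are assumptions for illustration, not part of this API):

    # Hypothetical LLM callback: takes the prompt string, returns the
    # generated answer, e.g. by delegating to your provider's HTTP client.
    def my_llm(prompt) do
      MyApp.LLMClient.complete(prompt)
    end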

## Custom Prompts

    Agent.answer(ctx,
      prompt: fn question, chunks ->
        context = Enum.map_join(chunks, "\n", & &1.text)
        "Answer: " <> question <> "\n\nContext: " <> context
      end
    )

---

*Consult [api-reference.md](api-reference.md) for complete listing*
