Arcana.Ask (Arcana v1.3.3)
RAG (Retrieval Augmented Generation) question answering.
This module handles the core ask workflow:
- Search for relevant context chunks
- Build a prompt with the context
- Call the LLM for an answer
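The three steps above can be sketched as a plain function. This is an illustrative sketch only, not Arcana's actual implementation; `RagSketch`, `search_fn`, and `llm_fn` are hypothetical stand-ins for the library's retrieval and LLM calls:

```elixir
defmodule RagSketch do
  # Illustrative three-step RAG workflow: retrieve, build prompt, generate.
  def ask(question, search_fn, llm_fn) do
    # 1. Search for relevant context chunks
    context = search_fn.(question)
    # 2. Build a prompt with the context
    prompt = "Context:\n" <> Enum.join(context, "\n") <> "\n\nQuestion: " <> question
    # 3. Call the LLM for an answer
    {:ok, llm_fn.(prompt), context}
  end
end
```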
Usage
```elixir
{:ok, answer, context} = Arcana.ask("What is X?",
  repo: MyApp.Repo,
  llm: "openai:gpt-4o-mini"
)
```
Functions
Asks a question using retrieved context from the knowledge base.
Performs a search to find relevant chunks, then passes them along with the question to an LLM for answer generation.
Options
- `:repo` - The Ecto repo to use (required)
- `:llm` - Any type implementing the `Arcana.LLM` protocol (required)
- `:limit` - Maximum number of context chunks to retrieve (default: 5)
- `:source_id` - Filter context to a specific source
- `:threshold` - Minimum similarity score for context (default: 0.0)
- `:mode` - Search mode: `:semantic` (default), `:fulltext`, or `:hybrid`
- `:collection` - Filter to a specific collection
- `:collections` - Filter to multiple collections
- `:prompt` - Custom prompt function: `fn question, context -> system_prompt_string end`
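A `:prompt` function receives the question and the retrieved context and returns the system prompt string. The sketch below shows one way to ground the prompt in the retrieved chunks; it assumes each chunk exposes a `:text` field, which is a guess at the shape Arcana passes, so verify against the actual chunk struct before using it:

```elixir
defmodule MyPrompts do
  # Hypothetical custom prompt builder. Assumes each context chunk has a
  # :text field (an assumption, not confirmed by the Arcana docs).
  def grounded(question, context) do
    context_text =
      context
      |> Enum.map(& &1.text)
      |> Enum.join("\n---\n")

    """
    Answer using only the context below.
    If the answer is not in the context, say so.

    Context:
    #{context_text}

    Question: #{question}
    """
  end
end
```

It would then be passed as `prompt: &MyPrompts.grounded/2`.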
Examples
```elixir
# Basic usage
{:ok, answer, context} = Arcana.ask("What is Elixir?",
  repo: MyApp.Repo,
  llm: "openai:gpt-4o-mini"
)

# With custom prompt (the context argument is unused here, hence _context)
{:ok, answer, _} = Arcana.ask("Summarize the docs",
  repo: MyApp.Repo,
  llm: my_llm,
  prompt: fn question, _context ->
    "Be concise. Question: #{question}"
  end
)
```