Langfuse.Prompt (Langfuse v0.1.0)
Fetch, cache, and compile prompts from Langfuse.
Prompts in Langfuse enable version-controlled prompt management. This module provides functions to fetch prompts, compile them with variables, and link them to generations for tracking which prompt version was used.
Prompt Types
Langfuse supports two prompt types:
- `:text` - Simple string prompts with `{{variable}}` placeholders
- `:chat` - List of message maps with role/content structure
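To make the distinction concrete, the two template shapes can be sketched as plain data (the literal values below are illustrative, not library output):

```elixir
# A :text template is a single string with {{variable}} placeholders.
text_template = "Hello {{name}}, let's talk about {{topic}}"

# A :chat template is a list of message maps, each with a role and content.
chat_template = [
  %{"role" => "system", "content" => "You are a helpful assistant."},
  %{"role" => "user", "content" => "Tell me about {{topic}}."}
]
```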
Fetching Prompts
Fetch prompts by name, with optional version or label:
{:ok, prompt} = Langfuse.Prompt.get("my-prompt")
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", version: 2)
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", label: "production")

Compiling Prompts
Substitute variables in prompt templates:
{:ok, prompt} = Langfuse.Prompt.get("greeting")
compiled = Langfuse.Prompt.compile(prompt, %{name: "Alice"})

Linking to Generations
Track which prompt version was used in a generation:
{:ok, prompt} = Langfuse.Prompt.get("chat-template")
generation = Langfuse.generation(trace,
name: "completion",
model: "gpt-4",
prompt_name: prompt.name,
prompt_version: prompt.version
)

Caching
Prompts are cached by default for 60 seconds. Configure TTL per request:
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", cache_ttl: 300_000)

Use `fetch/2` to bypass the cache entirely.
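The freshness decision works roughly like this (a minimal sketch, not the library's actual cache implementation):

```elixir
defmodule CacheSketch do
  @default_ttl 60_000

  # An entry is fresh if fewer than ttl milliseconds have elapsed
  # since it was inserted.
  def fresh?(inserted_at_ms, now_ms, ttl \\ @default_ttl) do
    now_ms - inserted_at_ms < ttl
  end
end

CacheSketch.fresh?(0, 30_000)          # => true  (30s old, default 60s TTL)
CacheSketch.fresh?(0, 90_000)          # => false (expired)
CacheSketch.fresh?(0, 90_000, 300_000) # => true  (longer cache_ttl)
```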
Summary
Functions
Compiles a prompt by substituting variables.
Fetches a prompt directly from Langfuse without caching.
Fetches a prompt from Langfuse with caching.
Invalidates a cached prompt by name.
Clears all cached prompts.
Returns prompt metadata for linking to generations.
Types
@type prompt_type() :: :text | :chat
Prompt type: `:text` or `:chat`.
@type t() :: %Langfuse.Prompt{
        config: map() | nil,
        labels: [String.t()],
        name: String.t(),
        prompt: String.t() | [map()],
        tags: [String.t()],
        type: prompt_type(),
        version: pos_integer()
      }
A prompt struct containing all prompt attributes.
The :prompt field contains the template content: a string for text
prompts, or a list of message maps for chat prompts.
Functions
Compiles a prompt by substituting variables.
For text prompts, replaces {{variable}} patterns in the template string.
For chat prompts, replaces variables in each message's content field.
Variable names in the template must match the keys in the variables map. Keys can be atoms or strings.
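The substitution described above can be sketched roughly as follows (an illustrative reimplementation, not the library's actual code):

```elixir
defmodule CompileSketch do
  # Replace each {{key}} with the matching value from the variables map.
  # Both string and atom keys are checked; unmatched placeholders are
  # left in place. (String.to_atom is acceptable in a sketch; real code
  # should be careful with untrusted input.)
  def substitute(template, variables) when is_binary(template) do
    Regex.replace(~r/\{\{(\w+)\}\}/, template, fn whole, key ->
      case Map.get(variables, key) || Map.get(variables, String.to_atom(key)) do
        nil -> whole
        value -> to_string(value)
      end
    end)
  end
end

CompileSketch.substitute("Hello {{name}}", %{name: "Alice"})
# => "Hello Alice"
```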
Examples
# Text prompt with template: "Hello {{name}}, let's talk about {{topic}}"
compiled = Langfuse.Prompt.compile(text_prompt, %{name: "Alice", topic: "weather"})
# => "Hello Alice, let's talk about weather"
# Chat prompt with system message containing {{user_name}}
compiled = Langfuse.Prompt.compile(chat_prompt, %{user_name: "Bob"})
# => [%{"role" => "system", "content" => "You are helping Bob"}, ...]
# Using atom keys
compiled = Langfuse.Prompt.compile(prompt, %{name: "Alice"})
# Using string keys
compiled = Langfuse.Prompt.compile(prompt, %{"name" => "Alice"})
Fetches a prompt directly from Langfuse without caching.
Use this when you need the latest version and want to bypass the cache.
Options
- `:version` - Specific version number to fetch.
- `:label` - Label to fetch (e.g., "production", "latest").
Examples
{:ok, prompt} = Langfuse.Prompt.fetch("my-prompt")
{:ok, prompt} = Langfuse.Prompt.fetch("my-prompt", version: 3)
Fetches a prompt from Langfuse with caching.
Returns the cached prompt if available and not expired. Otherwise, fetches from the API and caches the result.
Options
- `:version` - Specific version number to fetch.
- `:label` - Label to fetch (e.g., "production", "latest").
- `:cache_ttl` - Cache TTL in milliseconds. Defaults to 60,000 (1 minute).
- `:fallback` - Fallback prompt struct or template to use if fetch fails. Can be a `%Langfuse.Prompt{}` struct or a string template.
Examples
{:ok, prompt} = Langfuse.Prompt.get("my-prompt")
prompt.name
#=> "my-prompt"
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", version: 2)
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", label: "production")
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", cache_ttl: 300_000)
# With fallback prompt struct
fallback = %Langfuse.Prompt{
name: "my-prompt",
version: 0,
type: :text,
prompt: "Default template {{name}}",
labels: [],
tags: []
}
{:ok, prompt} = Langfuse.Prompt.get("my-prompt", fallback: fallback)
# With fallback template string (creates text prompt)
{:ok, prompt} = Langfuse.Prompt.get("my-prompt",
fallback: "Default template {{name}}"
)

Errors
Returns `{:error, :not_found}` if the prompt does not exist and no fallback is provided.
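Calling code typically pattern-matches on the result. In this sketch, `result` stands in for a real `get/2` return value, and the hard-coded default is illustrative, not part of the library:

```elixir
# Handle the :not_found tuple explicitly by falling back to a default
# template string.
result = {:error, :not_found}

template =
  case result do
    {:ok, prompt} -> prompt.prompt
    {:error, :not_found} -> "Hello {{name}}"
  end
# => "Hello {{name}}"
```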
Invalidates a cached prompt by name.
Removes all cached versions and labels for the given prompt name.
Use this when you know a prompt has been updated in Langfuse and want
to force a fresh fetch on the next get/2 call.
Options
- `:version` - Only invalidate a specific version.
- `:label` - Only invalidate a specific label.
Examples
iex> Langfuse.Prompt.invalidate("my-prompt")
:ok
iex> Langfuse.Prompt.invalidate("my-prompt", version: 2)
:ok
iex> Langfuse.Prompt.invalidate("my-prompt", label: "production")
:ok
@spec invalidate_all() :: :ok
Clears all cached prompts.
Use this to force fresh fetches for all prompts. This is useful when deploying new prompt versions across the board.
Examples
iex> Langfuse.Prompt.invalidate_all()
:ok
@spec link_meta(t()) :: %{prompt_name: String.t(), prompt_version: pos_integer()}
Returns prompt metadata for linking to generations.
The returned map can be merged into generation options to track which prompt version was used.
Examples
{:ok, prompt} = Langfuse.Prompt.get("my-prompt")
meta = Langfuse.Prompt.link_meta(prompt)
# => %{prompt_name: "my-prompt", prompt_version: 2}
# Use with generation
generation = Langfuse.generation(trace,
[name: "completion", model: "gpt-4"] ++ Map.to_list(meta)
)