Chat template formatting using llama.cpp's Jinja template engine.
Converts a list of chat messages into a formatted prompt string
using the model's embedded chat template. Uses the full Jinja engine
from llama.cpp's common library, which supports `enable_thinking` and
arbitrary `chat_template_kwargs`.
Examples
{:ok, prompt} = LlamaCppEx.Chat.apply_template(model, [
%{role: "system", content: "You are helpful."},
%{role: "user", content: "Hi!"}
])
# Disable thinking (for Qwen3 and similar models)
{:ok, prompt} = LlamaCppEx.Chat.apply_template(model, messages,
enable_thinking: false
)
Functions
@spec apply_template(LlamaCppEx.Model.t(), [message()], keyword()) :: {:ok, String.t()} | {:error, String.t()}
Applies the model's chat template to a list of messages using the Jinja engine.
Options
* `:add_assistant` - Whether to add the assistant turn prefix. Defaults to `true`.
* `:enable_thinking` - Whether to enable thinking/reasoning mode. Defaults to `true`.
* `:chat_template_kwargs` - Extra template variables as a list of `{key, value}` string tuples. Defaults to `[]`.
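As an illustrative sketch of `:chat_template_kwargs`, here is how extra variables could be forwarded to the template. The `"date_string"` key is a hypothetical example; the variable names a template actually reads depend on the model's embedded chat template.

```elixir
# Hypothetical sketch: forward extra Jinja variables to the chat template.
# "date_string" is an assumed variable name, not guaranteed to exist in any
# given model's template.
{:ok, prompt} =
  LlamaCppEx.Chat.apply_template(model, messages,
    enable_thinking: false,
    chat_template_kwargs: [{"date_string", "2024-06-01"}]
  )
```

Kwargs are passed as string tuples rather than a map so they can be handed to the underlying C++ template engine without guessing at value types.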