LangChain.Utils.ChatTemplates (LangChain v0.2.0)
Functions for converting messages into the various commonly used chat template formats.
Format examples
There are currently no industry standards around models' chat formats. For any given model, its documentation and config may need to be inspected to determine its format, and that format may not be supported at this time.
:inst
<s>[INST] System message. User message. [/INST]
Assistant response
[INST] User message. [/INST]</s>
Assistant response
Note: The :inst format has no concept of a system message. It will be combined with the first user message.
:im_start
<|im_start|>user
User message.<|im_end|>
<|im_start|>assistant
Assistant response.<|im_end|>
<|im_start|>user
User message.<|im_end|>
<|im_start|>assistant
Note: The :im_start format has no concept of a system message. It will be combined with the first user message.
:llama_2
<s>[INST] <<SYS>>
System message.
<</SYS>>
User message [/INST] Assistant response </s><s>[INST] User message. [/INST]
Note: The :llama_2 format supports specific system messages. It is a variation of the :inst format.
:llama_3
<|begin_of_text|>
<|start_header_id|>system<|end_header_id|>
System message.<|eot_id|>
<|start_header_id|>user<|end_header_id|>
User message.<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
Assistant message.<|eot_id|>
:zephyr
<|system|>
System message.</s>
<|user|>
User message.</s>
<|assistant|>
Assistant message.
<|user|>
User message.</s>
Note: The :zephyr format supports specific system messages.
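As a concrete illustration, the sketch below renders the :zephyr format with this module. It assumes the LangChain.Message new_system!/new_user! constructors accept plain strings (as in recent releases); the exact whitespace of the rendered string may differ slightly by version.

    alias LangChain.Message
    alias LangChain.Utils.ChatTemplates

    messages = [
      Message.new_system!("You are a helpful assistant."),
      Message.new_user!("What is the capital of France?")
    ]

    # Because the conversation ends with a user message, the rendered prompt
    # closes with the <|assistant|> header so the model continues from there.
    ChatTemplates.apply_chat_template!(messages, :zephyr)
    # => roughly:
    # <|system|>
    # You are a helpful assistant.</s>
    # <|user|>
    # What is the capital of France?</s>
    # <|assistant|>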
Summary
Functions
Transform a list of messages into a text prompt in the desired format for the LLM.
Validate that the messages are in a supported format. Returns the system message (or uses a default one).
Types
@type chat_format() :: :inst | :im_start | :llama_2 | :llama_3 | :zephyr
Functions
@spec apply_chat_template!([LangChain.Message.t()], chat_format(), opts :: Keyword.t()) :: String.t() | no_return()
Transform a list of messages into a text prompt in the desired format for the LLM.
Options
:add_generation_prompt
- Boolean. Defaults to true when the last message is a user prompt. Depending on the format, when a user message is the last message, the text prompt should end by opening the assistant's portion, triggering the assistant's text generation.
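For example, a hedged usage sketch for this option (the message content is illustrative, and opts are assumed to default to an empty keyword list):

    alias LangChain.Message
    alias LangChain.Utils.ChatTemplates

    messages = [
      Message.new_system!("You are a helpful assistant."),
      Message.new_user!("Summarize this article for me.")
    ]

    # The last message is from the user, so the default appends the assistant's
    # opening tokens (for :im_start, the "<|im_start|>assistant" header).
    # Note: :im_start has no system slot, so the system message is combined
    # with the first user message.
    ChatTemplates.apply_chat_template!(messages, :im_start)

    # Opt out explicitly, e.g. when the text is not being sent for immediate
    # generation.
    ChatTemplates.apply_chat_template!(messages, :im_start, add_generation_prompt: false)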
@spec prep_and_validate_messages([LangChain.Message.t()]) :: {LangChain.Message.t(), LangChain.Message.t(), [LangChain.Message.t()]} | no_return()
Validate that the messages are in a supported format. Returns the system message (or uses a default one).
Returns: {System Message, First User Message, Other Messages}
If there is an issue, an exception is raised. Reasons for an exception:
- Only one system message is allowed and, if included, it must be the first message.
- Non-system messages must begin with a user message.
- Message roles must alternate between user and assistant (user, assistant, user, assistant, etc.).
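A minimal sketch of the happy path, assuming the Message constructors accept plain strings; a list violating the rules above raises an exception rather than returning:

    alias LangChain.Message
    alias LangChain.Utils.ChatTemplates

    messages = [
      Message.new_system!("You are terse."),
      Message.new_user!("Hello!"),
      Message.new_assistant!("Hi."),
      Message.new_user!("How's the weather?")
    ]

    # Roles start with a user message (after the system message) and alternate,
    # so validation succeeds and the three-part tuple is returned.
    {system_msg, first_user_msg, other_messages} =
      ChatTemplates.prep_and_validate_messages(messages)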