LangChain.PromptTemplate (LangChain v0.3.0-rc.0)

Enables defining a prompt, optionally as a template, while delaying its final construction until input values are substituted in later.

It also supports creating a LangChain.Message from a PromptTemplate.

An LLM conversation is made up of a set of messages. PromptTemplates are a tool to help build messages.

# Create a template and convert it to a message
prompt = PromptTemplate.new!(%{text: "My template", role: :user})
%LangChain.Message{} = message = PromptTemplate.to_message(prompt)

PromptTemplates are powerful because they support Elixir's EEx templates, allowing for parameter substitution. This is helpful when we want to prepare a template message now and substitute in information from the user later.

Here's an example of setting up a template with a parameter and later providing the input value.

prompt = PromptTemplate.from_template!("What's a name for a company that makes <%= @product %>?")

# later, format the final text after applying the values.
PromptTemplate.format(prompt, %{product: "colorful socks"})
#=> "What's a name for a company that makes colorful socks?"

Summary

Functions

format(template, inputs \\ %{})
Format the prompt template, substituting the provided inputs for the template's assigns. Returns the formatted text.

format_composed(full_prompt, composed_of, inputs)
Formats a PromptTemplate at two levels. Supports providing a map of composed_of templates that are combined into the full_prompt template.

format_text(text, inputs)
Format template text with inputs, replacing the embedded EEx expressions. Returns the final text.

from_template(text)
Build a PromptTemplate struct from a template string.

from_template!(text)
Build a PromptTemplate struct from a template string and return the struct, or raise an error if invalid.

new(attrs)
Create a new PromptTemplate struct using the attributes.

new!(attrs)
Create a new PromptTemplate struct using the attributes. If invalid, an exception is raised with the reason.

to_content_part(template, inputs \\ %{})
Transform a PromptTemplate to a LangChain.Message.ContentPart of type text. Provide the inputs at the time of transformation to render the final content.

to_content_part!(template, inputs \\ %{})
Transform a PromptTemplate to a LangChain.Message.ContentPart of type text. Provide the inputs at the time of transformation to render the final content. Raises an exception if invalid.

to_message(template, inputs \\ %{})
Transform a PromptTemplate to a LangChain.Message. Provide the inputs at the time of transformation to render the final content.

to_message!(template, inputs \\ %{})
Transform a PromptTemplate to a LangChain.Message. Provide the inputs at the time of transformation to render the final content. Raises an exception if invalid.

to_messages!(prompts, inputs \\ %{})
Transform a list of PromptTemplates into a list of LangChain.Messages. Applies the inputs to the list of prompt templates. If any of the prompt entries are invalid or fail, an exception is raised.

Types

@type t() :: %LangChain.PromptTemplate{inputs: term(), role: term(), text: term()}

Functions

format(template, inputs \\ %{})
@spec format(t(), inputs :: %{required(atom()) => any()}) :: String.t()

Format the prompt template, substituting the provided inputs for the template's assigns. Returns the formatted text.

prompt = PromptTemplate.from_template!("Suggest a good name for a company that makes <%= @product %>?")
PromptTemplate.format(prompt, %{product: "colorful socks"})
#=> "Suggest a good name for a company that makes colorful socks?"

A PromptTemplate supports storing input values on the struct. These could be set when the template is defined. If an input value is not provided when the format function is called, any inputs on the struct will be used.
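
For illustration, here is a sketch of that fallback behavior. It assumes the :inputs field can be set through new!/1 (as the struct type suggests) and that inputs passed to format/2 take precedence over stored ones; the template text and values are made up for this example.

# Inputs stored on the struct act as defaults when none are passed to format/2.
prompt =
  PromptTemplate.new!(%{
    text: "Suggest a good name for a company that makes <%= @product %>?",
    role: :user,
    inputs: %{product: "colorful socks"}
  })

PromptTemplate.format(prompt)
#=> "Suggest a good name for a company that makes colorful socks?"

# Assumed: an input provided at format time overrides the stored default.
PromptTemplate.format(prompt, %{product: "artisanal soap"})
#=> "Suggest a good name for a company that makes artisanal soap?"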

format_composed(full_prompt, composed_of, inputs)
@spec format_composed(
  t(),
  composed_of :: %{required(atom()) => any()},
  inputs :: %{required(atom()) => any()}
) :: String.t()

Formats a PromptTemplate at two levels. Supports providing a map of composed_of templates that are combined into the full_prompt template.

For this example, we'll use an overall template layout like this:

full_prompt =
  PromptTemplate.from_template!(~s(<%= @introduction %>

  <%= @example %>

  <%= @start %>))

This template is made up of 3 more specific templates. Let's start with the introduction sub-template.

introduction_prompt =
  PromptTemplate.from_template!("You are impersonating <%= @person %>.")

The introduction takes a parameter for which person it should impersonate. The desired person is not provided here and will come in later.

Let's next look at the example prompt:

example_prompt =
  PromptTemplate.from_template!(~s(Here's an example of an interaction:
    Q: <%= @example_q %>
    A: <%= @example_a %>))

This defines a sample interaction for the LLM, modeling the pattern we want for the conversation. Again, this template takes parameters for the sample question and answer.

Finally, there is the start section of the overall prompt. In this example, that might be a question a user asks of our impersonating AI.

start_prompt =
  PromptTemplate.from_template!(~s(Now, do this for real!
  Q: <%= @input %>
  A:))

We have the overall template defined and templates that define each of the smaller portions. The format_composed function lets us combine them all and build the complete text to pass to an LLM.

formatted_prompt =
  PromptTemplate.format_composed(
    full_prompt,
    %{
      introduction: introduction_prompt,
      example: example_prompt,
      start: start_prompt
    },
    %{
      person: "Elon Musk",
      example_q: "What's your favorite car?",
      example_a: "Tesla",
      input: "What's your favorite social media site?"
    }
  )

We provide the PromptTemplate for the overall prompt, then a map of sub-template prompts that are composed into it.

Finally, we provide a map of values for all the parameters that still need them. For this example, this is what the final prompt presented to the LLM looks like:

~s(You are impersonating Elon Musk.

Here's an example of an interaction:
Q: What's your favorite car?
A: Tesla

Now, do this for real!
Q: What's your favorite social media site?
A:)

Using a setup like this, we can easily swap out who we are impersonating and allow the user to interact with that persona.

With everything defined, this is all it takes to now talk with an Abraham Lincoln impersonation:

formatted_prompt =
  PromptTemplate.format_composed(
    full_prompt,
    %{
      introduction: introduction_prompt,
      example: example_prompt,
      start: start_prompt
    },
    %{
      person: "Abraham Lincoln",
      example_q: "What is your nickname?",
      example_a: "Honest Abe",
      input: "What is one of your favorite pastimes?"
    }
  )

format_text(text, inputs)
@spec format_text(text :: String.t(), inputs :: %{required(atom()) => any()}) ::
  String.t()

Format template text with inputs, replacing the embedded EEx expressions. The final replaced text is returned.

Operates directly on text to apply the inputs. This does not take the PromptTemplate struct.

PromptTemplate.format_text("Hi! My name is <%= @name %>.", %{name: "Jose"})
#=> "Hi! My name is Jose."

from_template(text)

@spec from_template(text :: String.t()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Build a PromptTemplate struct from a template string.

Shortcut function for building a user prompt.

{:ok, prompt} = PromptTemplate.from_template("Suggest a good name for a company that makes <%= @product %>?")

from_template!(text)

@spec from_template!(text :: String.t()) :: t() | no_return()

Build a PromptTemplate struct from a template string and return the struct, or raise an error if invalid.

Shortcut function for building a user prompt.

prompt = PromptTemplate.from_template!("Suggest a good name for a company that makes <%= @product %>?")

new(attrs)

@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}

Create a new PromptTemplate struct using the attributes.

new!(attrs)

@spec new!(attrs :: map()) :: t() | no_return()

Create a new PromptTemplate struct using the attributes. If invalid, an exception is raised with the reason.
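
As a quick sketch of the difference between new/1 and new!/1 (the attribute values here are only illustrative):

# new/1 returns an :ok/:error tuple suitable for pattern matching.
{:ok, prompt} = PromptTemplate.new(%{text: "Hello, <%= @name %>!", role: :user})

# new!/1 returns the struct directly and raises if the attributes are invalid.
prompt = PromptTemplate.new!(%{text: "Hello, <%= @name %>!", role: :user})
PromptTemplate.format(prompt, %{name: "Jose"})
#=> "Hello, Jose!"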

to_content_part(template, inputs \\ %{})
@spec to_content_part(t(), input :: %{required(atom()) => any()}) ::
  {:ok, LangChain.Message.ContentPart.t()} | {:error, Ecto.Changeset.t()}

Transform a PromptTemplate to a LangChain.Message.ContentPart of type text. Provide the inputs at the time of transformation to render the final content.
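
For example, a sketch of rendering a template into a text ContentPart; the template text and inputs are illustrative:

prompt = PromptTemplate.from_template!("Summarize the following text: <%= @text %>")

# Returns {:ok, %LangChain.Message.ContentPart{}} with the rendered text as its content.
{:ok, part} = PromptTemplate.to_content_part(prompt, %{text: "A long article about socks."})

# The bang variant returns the ContentPart directly and raises if invalid.
part = PromptTemplate.to_content_part!(prompt, %{text: "A long article about socks."})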

to_content_part!(template, inputs \\ %{})
@spec to_content_part!(t(), input :: %{required(atom()) => any()}) ::
  LangChain.Message.ContentPart.t() | no_return()

Transform a PromptTemplate to a LangChain.Message.ContentPart of type text. Provide the inputs at the time of transformation to render the final content. Raises an exception if invalid.

to_message(template, inputs \\ %{})
@spec to_message(t(), input :: %{required(atom()) => any()}) ::
  {:ok, LangChain.Message.t()} | {:error, Ecto.Changeset.t()}

Transform a PromptTemplate to a LangChain.Message. Provide the inputs at the time of transformation to render the final content.
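
For example, a sketch of rendering a system template into a message; the role and input values shown are illustrative:

prompt =
  PromptTemplate.new!(%{
    role: :system,
    text: "You are a helpful assistant for <%= @company %>."
  })

# Returns {:ok, %LangChain.Message{role: :system}} with the rendered text as its content.
{:ok, message} = PromptTemplate.to_message(prompt, %{company: "Acme"})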

to_message!(template, inputs \\ %{})
@spec to_message!(t(), input :: %{required(atom()) => any()}) ::
  LangChain.Message.t() | no_return()

Transform a PromptTemplate to a LangChain.Message. Provide the inputs at the time of transformation to render the final content. Raises an exception if invalid.

to_messages!(prompts, inputs \\ %{})
@spec to_messages!(
  [t() | LangChain.Message.t() | String.t()],
  inputs :: %{required(atom()) => any()}
) ::
  [LangChain.Message.t()] | no_return()

Transform a list of PromptTemplates into a list of LangChain.Messages. Applies the inputs to the list of prompt templates. If any of the prompt entries are invalid or fail, an exception is raised.
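
For example, a sketch of building a conversation's message list; the templates and input values are illustrative:

messages =
  PromptTemplate.to_messages!(
    [
      PromptTemplate.new!(%{role: :system, text: "You are impersonating <%= @person %>."}),
      PromptTemplate.new!(%{role: :user, text: "<%= @question %>"})
    ],
    %{person: "Abraham Lincoln", question: "What is your nickname?"}
  )
# Returns a list of %LangChain.Message{} structs with the inputs applied to each template.
# Per the spec, existing LangChain.Message structs and plain strings may also appear in the list.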