Changelog

v0.2.0 (2024-04-30)

For LLMs that support it (verified with OpenAI's ChatGPT and Anthropic's models), a user message can now contain multiple ContentParts, making it "multi-modal". Images and text can be combined into a single message, so interactions about the images are now possible.
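As a sketch of what this enables (the constructor names such as ContentPart.text!/1 and ContentPart.image_url!/1 are assumptions based on the ContentPart type named above, not confirmed by this changelog):

```elixir
alias LangChain.Message
alias LangChain.Message.ContentPart

# A single user message combining text and an image.
# NOTE: the helper names below are illustrative assumptions.
message =
  Message.new_user!([
    ContentPart.text!("What is shown in this image?"),
    ContentPart.image_url!("https://example.com/photo.jpg")
  ])
```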

Changed:

  • The :function and :function_call roles were removed. The equivalent of a function_call is now expressed by an :assistant role message making one or more ToolCall requests. A :function message was the system's answer to a function call; that answer now lives in the :tool role.
  • The :tool role was added. A tool message contains one or more ToolResult messages.
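A hypothetical sketch of the new flow (the struct fields and helper names below are assumptions; the changelog itself only names the roles, ToolCall, and ToolResult):

```elixir
alias LangChain.Message
alias LangChain.Message.{ToolCall, ToolResult}

# The assistant requests a tool call, replacing the old :function_call role.
assistant_msg =
  Message.new_assistant!(%{
    tool_calls: [ToolCall.new!(%{call_id: "call_1", name: "get_weather"})]
  })

# The tool's answer (previously the :function role) is now a :tool message
# carrying one or more ToolResults.
tool_msg =
  Message.new_tool_result!(%{
    tool_results: [ToolResult.new!(%{tool_call_id: "call_1", content: "Sunny, 22°C"})]
  })
```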

v0.1.10 (2024-03-07)


v0.1.9 (2024-02-29) - The Leap Release!

This adds support for Bumblebee as a Chat model, making it easy to have conversations with Llama 2, Mistral, and Zephyr LLMs.

See the documentation in LangChain.ChatModels.ChatBumblebee for getting started.

NOTE: At this time, none of these models support function calling, so that feature is not available with Bumblebee.

This release includes an experimental change for better support of streamed responses that are broken up over multiple messages from services like ChatGPT and others.

Other library dependency requirements were relaxed, making it easier to support different versions of libraries like req and nx.

v0.1.8 (2024-02-16)

Breaking change: RoutingChain's required values changed. Previously, default_chain was assigned an %LLMChain{} to return when no more specific route matched.

This was renamed to default_route, and it now expects a %PromptRoute{} to be provided.

Here's how to make the change:

  selected_route =
    RoutingChain.new!(%{
      llm: ChatOpenAI.new(%{model: "gpt-3.5-turbo", stream: false}),
      input_text: user_input_text,
      routes: routes,
      default_route: PromptRoute.new!(%{name: "DEFAULT", chain: fallback_chain})
    })
    |> RoutingChain.evaluate()

The default_chain option was renamed to default_route and now expects a PromptRoute to be provided. The example above shows a sample default route with an optional fallback_chain.

Previously, the returned value from RoutingChain.evaluate/1 was a selected_chain; it now returns the selected_route.

Why was this changed?

This was changed to make it easier to use a PromptRoute when there isn't an associated %LLMChain{} for it. The application may only need to know which route was selected.

As part of this, a %PromptRoute{}'s description and chain fields are no longer required.


v0.1.7 (2024-01-18)

v0.1.6 (2023-12-12)

  • Fix for correct usage of new Req retry setting. PR #57

v0.1.5 (2023-12-11)

  • Upgraded Req to v0.4.8. It contains a needed retry fix for certain situations.
  • Fixed OpenAI returning "Unrecognized request argument supplied: api_key". PR #54

v0.1.4 (2023-12-11)

v0.1.3 (2023-12-01)

v0.1.2 (2023-10-26)

v0.1.1 (2023-10-10)

Minor update release.

  • added "update_custom_context" to LLMChain
  • added support for setting the OpenAI-Organization header in requests
  • fixed data extraction chain and improved the prompt
  • made ChatGPT response tests more robust

v0.1.0 (2023-09-18)

Initial release when published to hex.pm.