# LangChain v0.5.1 - Table of Contents

## Guides

- [README](readme.md)
- [Changelog](changelog.md)
- Notebooks
  - [Getting Started](getting_started.md)
  - [Executing Custom Elixir Functions](custom_functions.md)
  - [Images: Generating context-specific descriptions](context-specific-image-descriptions.md)

## Modules

- [LangChain.ChatModels.ChatOrq](LangChain.ChatModels.ChatOrq.md): Chat adapter for the orq.ai Deployments API.
- [LangChain.ChatModels.ReasoningOptions](LangChain.ChatModels.ReasoningOptions.md): Embedded schema for OpenAI reasoning configuration options.
- [LangChain.NativeTool](LangChain.NativeTool.md): Represents built-in tools available from AI/LLM services that can be used within the LangChain framework.
- [LangChain.Telemetry](LangChain.Telemetry.md): Telemetry events for LangChain.
- [LangChain.Tools.DeepResearch.ResearchResult.Source](LangChain.Tools.DeepResearch.ResearchResult.Source.md)
- [LangChain.Tools.DeepResearch.ResearchResult.ToolCall](LangChain.Tools.DeepResearch.ResearchResult.ToolCall.md)
- [LangChain.Tools.DeepResearch.ResearchResult.Usage](LangChain.Tools.DeepResearch.ResearchResult.Usage.md)
- [LangChain.Utils.AwsEventstreamDecoder](LangChain.Utils.AwsEventstreamDecoder.md): Decodes AWS messages in the application/vnd.amazon.eventstream content-type. Ignores the headers because, on Bedrock, every message carries the same content-type, event-type, and message-type headers.
- [LangChain.Utils.BedrockStreamDecoder](LangChain.Utils.BedrockStreamDecoder.md)
- [LangChain.Utils.Parser.LLAMA_3_1_CustomToolParser](LangChain.Utils.Parser.LLAMA_3_1_CustomToolParser.md)
- [LangChain.Utils.Parser.LLAMA_3_2_CustomToolParser](LangChain.Utils.Parser.LLAMA_3_2_CustomToolParser.md)
- Chat Models
  - [LangChain.ChatModels.ChatAnthropic](LangChain.ChatModels.ChatAnthropic.md): Module for interacting with [Anthropic models](https://docs.anthropic.com/claude/docs/models-overview#claude-3-a-new-generation-of-ai).
  - [LangChain.ChatModels.ChatBumblebee](LangChain.ChatModels.ChatBumblebee.md): Represents a chat model hosted by Bumblebee and accessed through an `Nx.Serving`.
  - [LangChain.ChatModels.ChatDeepSeek](LangChain.ChatModels.ChatDeepSeek.md): Module for interacting with [DeepSeek models](https://www.deepseek.com/).
  - [LangChain.ChatModels.ChatGoogleAI](LangChain.ChatModels.ChatGoogleAI.md): Parses and validates inputs for making a request to the Google AI Chat API.
  - [LangChain.ChatModels.ChatGrok](LangChain.ChatModels.ChatGrok.md): Module for interacting with [xAI's Grok models](https://docs.x.ai/docs/models).
  - [LangChain.ChatModels.ChatMistralAI](LangChain.ChatModels.ChatMistralAI.md)
  - [LangChain.ChatModels.ChatModel](LangChain.ChatModels.ChatModel.md)
  - [LangChain.ChatModels.ChatOllamaAI](LangChain.ChatModels.ChatOllamaAI.md): Represents the [Ollama AI Chat model](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-chat-completion).
  - [LangChain.ChatModels.ChatOpenAI](LangChain.ChatModels.ChatOpenAI.md): Represents the [OpenAI ChatModel](https://platform.openai.com/docs/api-reference/chat/create).
  - [LangChain.ChatModels.ChatOpenAIResponses](LangChain.ChatModels.ChatOpenAIResponses.md): Represents the OpenAI Responses API.
  - [LangChain.ChatModels.ChatPerplexity](LangChain.ChatModels.ChatPerplexity.md): Represents the [Perplexity Chat model](https://docs.perplexity.ai/api-reference/chat-completions).
  - [LangChain.ChatModels.ChatVertexAI](LangChain.ChatModels.ChatVertexAI.md): Parses and validates inputs for making a request to the Google Vertex AI Chat API.
- Chains
  - [LangChain.Chains.DataExtractionChain](LangChain.Chains.DataExtractionChain.md): Defines an LLMChain for performing data extraction from a body of text.
  - [LangChain.Chains.LLMChain](LangChain.Chains.LLMChain.md): Define an LLMChain. This is the heart of the LangChain library (a minimal usage sketch appears at the end of this page).
  - [LangChain.Chains.SummarizeConversationChain](LangChain.Chains.SummarizeConversationChain.md): When an AI conversation has many back-and-forth messages (from user to assistant to user to assistant, etc.), the number of messages and the total token count can be large, which presents several problems.
  - [LangChain.Chains.TextToTitleChain](LangChain.Chains.TextToTitleChain.md): A convenience chain for turning a user's prompt text into a summarized title for the anticipated conversation.
- Messages
  - [LangChain.Message](LangChain.Message.md): Models a complete `Message` for a chat LLM.
  - [LangChain.Message.ContentPart](LangChain.Message.ContentPart.md): Models a `ContentPart`. ContentParts are now used for multi-modal support in both messages and tool results. This enables richer responses, allowing text, images, files, and thinking blocks to be combined in a single message or tool result.
  - [LangChain.Message.ToolCall](LangChain.Message.ToolCall.md): Represents an LLM's request to use a tool. It specifies the tool to execute and may provide arguments for the tool to use.
  - [LangChain.Message.ToolResult](LangChain.Message.ToolResult.md): Represents the result of running a requested tool. The LLM requests a tool use through a `ToolCall`; a `ToolResult` returns the answer or result from the application back to the AI.
  - [LangChain.MessageDelta](LangChain.MessageDelta.md): Models a "delta" message from a chat LLM. A delta is a small chunk, or piece, of a much larger complete message. A series of deltas is used to construct the complete message.
  - [LangChain.MessageProcessors.JsonProcessor](LangChain.MessageProcessors.JsonProcessor.md): A built-in Message processor that processes a received Message for JSON contents.
  - [LangChain.PromptTemplate](LangChain.PromptTemplate.md): Enables defining a prompt, optionally as a template, while delaying its final construction until input values are substituted in.
  - [LangChain.TokenUsage](LangChain.TokenUsage.md): Contains token usage information returned from an LLM.
- Functions
  - [LangChain.Function](LangChain.Function.md): Defines a "function" that can be provided to an LLM for the LLM to optionally execute and pass argument data to.
  - [LangChain.FunctionParam](LangChain.FunctionParam.md): Define a function parameter as a struct. Used to generate the expected JSONSchema data for describing one or more arguments being passed to a `LangChain.Function`.
- Callbacks
  - [LangChain.Callbacks](LangChain.Callbacks.md): Defines the structure of callbacks and provides utilities for executing them.
  - [LangChain.Chains.ChainCallbacks](LangChain.Chains.ChainCallbacks.md): Defines the callbacks fired by an LLMChain and LLM module.
- Routing
  - [LangChain.Chains.RoutingChain](LangChain.Chains.RoutingChain.md): Runs a router based on a user's initial prompt to determine which of the given options best matches. If there is no good match, the value "DEFAULT" is returned.
  - [LangChain.Routing.PromptRoute](LangChain.Routing.PromptRoute.md): Defines a route or direction a prompting interaction with an LLM can take.
- Images
  - [LangChain.Images](LangChain.Images.md): Functions for working with `LangChain.GeneratedImage` files.
  - [LangChain.Images.GeneratedImage](LangChain.Images.GeneratedImage.md): Represents a generated image where we have either the base64-encoded contents or a temporary URL to it.
  - [LangChain.Images.OpenAIImage](LangChain.Images.OpenAIImage.md): Represents the [OpenAI Images API endpoint](https://platform.openai.com/docs/api-reference/images) for working with DALL-E-2 and DALL-E-3.
- Text Splitter
  - [LangChain.TextSplitter.CharacterTextSplitter](LangChain.TextSplitter.CharacterTextSplitter.md): The `CharacterTextSplitter` is a length-based text splitter that divides text on specified characters, producing consistent chunk sizes.
  - [LangChain.TextSplitter.LanguageSeparators](LangChain.TextSplitter.LanguageSeparators.md): Separator lists for programming languages and Markdown. Useful with `LangChain.TextSplitter.RecursiveCharacterTextSplitter`.
  - [LangChain.TextSplitter.RecursiveCharacterTextSplitter](LangChain.TextSplitter.RecursiveCharacterTextSplitter.md): The `RecursiveCharacterTextSplitter` is the recommended splitter for generic text. It splits the text based on a list of characters, trying each character in sequence until the text is split into small enough chunks.
- Tools
  - [LangChain.Tools.Calculator](LangChain.Tools.Calculator.md): Defines a Calculator tool for performing basic math calculations.
  - [LangChain.Tools.DeepResearch](LangChain.Tools.DeepResearch.md): Defines an OpenAI Deep Research tool for conducting comprehensive research on complex topics.
  - [LangChain.Tools.DeepResearch.ResearchRequest](LangChain.Tools.DeepResearch.ResearchRequest.md): Represents a Deep Research request sent to the OpenAI API.
  - [LangChain.Tools.DeepResearch.ResearchResult](LangChain.Tools.DeepResearch.ResearchResult.md): Represents the final result of a completed Deep Research request.
  - [LangChain.Tools.DeepResearch.ResearchStatus](LangChain.Tools.DeepResearch.ResearchStatus.md): Represents the status of a Deep Research request.
  - [LangChain.Tools.DeepResearchClient](LangChain.Tools.DeepResearchClient.md): HTTP client for the OpenAI Deep Research API.
- Utils
  - [LangChain.Config](LangChain.Config.md): Utility that handles interaction with the application's configuration.
  - [LangChain.Gettext](LangChain.Gettext.md): A module providing internationalization with a gettext-based API.
  - [LangChain.Utils](LangChain.Utils.md): Collection of helpful utilities, mostly for internal use.
  - [LangChain.Utils.BedrockConfig](LangChain.Utils.BedrockConfig.md): Configuration for AWS Bedrock.
  - [LangChain.Utils.ChainResult](LangChain.Utils.ChainResult.md): Module to help when working with the results of a chain.
  - [LangChain.Utils.ChatTemplates](LangChain.Utils.ChatTemplates.md): Functions for converting messages into the various commonly used chat template formats.
- Exceptions
  - [LangChain.LangChainError](LangChain.LangChainError.md): Exception used for raising LangChain-specific errors.
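
To give the module listing some context, here is a minimal sketch of running an `LLMChain` against a chat model, using the `LangChain.Chains.LLMChain`, `LangChain.ChatModels.ChatOpenAI`, `LangChain.Message`, and `LangChain.Utils.ChainResult` modules listed above. The model name and configuration are placeholders, and return shapes can vary between releases, so treat this as illustrative rather than canonical; see the Getting Started guide for the authoritative walkthrough.

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message
alias LangChain.Utils.ChainResult

# Build a chain around an OpenAI chat model, add a user message, and run it.
# Assumes OpenAI API credentials are already configured for the application;
# "gpt-4o-mini" is a placeholder model name.
{:ok, updated_chain} =
  %{llm: ChatOpenAI.new!(%{model: "gpt-4o-mini"})}
  |> LLMChain.new!()
  |> LLMChain.add_message(Message.new_user!("Summarize what LangChain does in one sentence."))
  |> LLMChain.run()

# ChainResult extracts the chain's final answer as a plain string.
{:ok, answer} = ChainResult.to_string(updated_chain)
IO.puts(answer)
```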