LangChain.ChatModels.ChatOrq (LangChain v0.4.0)
Chat adapter for the orq.ai Deployments API.
Supports both non-streaming and streaming responses. Streaming uses SSE and terminates with the "[DONE]" sentinel; the non-streaming and streaming request URLs are configured through the endpoint and stream_endpoint fields.
Security:
- HTTP Bearer token (Authorization: Bearer ...). Configure the API key via the application env :langchain, :orq_key.
Required body field:
- key: Deployment key to invoke.
Messages:
- Accepted roles: developer | system | user | assistant | tool
- Content supports text, image_url, file, and input_audio parts.
- Assistant messages may include tool_calls; tool results are sent with role: tool and a tool_call_id.
Notes:
- Azure is not supported.
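A minimal sketch of configuring the key and building the adapter. The environment variable name and the "my-deployment" deployment key are placeholders.

    alias LangChain.ChatModels.ChatOrq

    # The Bearer token is read from the :langchain, :orq_key application env.
    Application.put_env(:langchain, :orq_key, System.fetch_env!("ORQ_API_KEY"))

    # `key` is the required deployment key (see "Required body field" above).
    {:ok, chat} = ChatOrq.new(%{key: "my-deployment", stream: false})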
Summary
Functions
Calls the orq.ai API, passing the configured ChatOrq struct together with either a single message or a list of messages to act as the prompt.
Convert a ContentPart to the expected map of data for the API.
Convert a list of ContentParts to the expected map of data for the API.
Convert a list of ContentParts to a string for tool results. ORQ API expects tool result content to be a string, not an array.
Convert content to a list of ContentParts. Content may be a string or already a list of ContentParts.
Convert content to a single ContentPart for MessageDelta. Content may be a string or already a ContentPart.
Decode a streamed response (SSE). Delegates to ChatOpenAI-compatible decoder.
Return the params formatted for an API request.
Set up a ChatOrq client configuration.
Set up a ChatOrq client configuration and return it, or raise an error if it is invalid.
Restores the model from the config.
Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be a more fundamental problem with the request rather than a service issue, and the call should not be retried or fall back to another service.
Generate a config map that can later restore the model's configuration.
Types
@type t() :: %LangChain.ChatModels.ChatOrq{
        api_key: term(),
        callbacks: term(),
        context: term(),
        documents: term(),
        endpoint: term(),
        extra_params: term(),
        file_ids: term(),
        inputs: term(),
        invoke_options: term(),
        key: term(),
        knowledge_filter: term(),
        messages_passthrough: term(),
        metadata: term(),
        model: term(),
        prefix_messages: term(),
        receive_timeout: term(),
        stream: term(),
        stream_endpoint: term(),
        thread: term(),
        verbose_api: term()
      }
Functions
Calls the orq.ai API, passing the configured ChatOrq struct together with either a single message or a list of messages to act as the prompt.
Optionally pass in a list of tools available to the LLM to request execution in the response. The tools schema is not sent to orq; tool messages are included in the messages list.
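A hedged sketch of a call, assuming call/3 follows the usual ChatModel shape in this library (struct, messages, tools) and using a placeholder deployment key:

    alias LangChain.ChatModels.ChatOrq
    alias LangChain.Message

    {:ok, chat} = ChatOrq.new(%{key: "my-deployment"})

    # The tools list is empty here; its schema is not forwarded to orq anyway.
    {:ok, response} =
      ChatOrq.call(
        chat,
        [
          Message.new_system!("You are a terse assistant."),
          Message.new_user!("Say hello.")
        ],
        []
      )

    # `response` holds the assistant reply (a Message, or a list when streaming).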
Convert a ContentPart to the expected map of data for the API.
Convert a list of ContentParts to the expected map of data for the API.
Convert a list of ContentParts to a string for tool results. ORQ API expects tool result content to be a string, not an array.
Convert content to a list of ContentParts. Content may be a string or already a list of ContentParts.
Convert content to a single ContentPart for MessageDelta. Content may be a string or already a ContentPart.
Decode a streamed response (SSE). Delegates to ChatOpenAI-compatible decoder.
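A sketch of feeding a raw SSE chunk through the decoder, assuming the ChatOpenAI-compatible {data, buffer} tuple shape; the sample payload is fabricated:

    alias LangChain.ChatModels.ChatOrq

    # A raw SSE chunk as delivered by the HTTP client (abbreviated sample).
    chunk = ~s|data: {"choices":[{"delta":{"content":"Hi"}}]}\n\n|

    # The second element carries any incomplete event left over from the
    # previous chunk; the decoder returns parsed maps plus the new buffer.
    {parsed, buffer} = ChatOrq.decode_stream({chunk, ""})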
@spec for_api(
        struct(),
        LangChain.Message.t()
        | LangChain.Message.ToolCall.t()
        | LangChain.Message.ToolResult.t()
        | LangChain.Message.ContentPart.t()
        | LangChain.Function.t()
      ) :: %{required(String.t()) => any()} | [%{required(String.t()) => any()}]

@spec for_api(
        t()
        | LangChain.Message.t()
        | LangChain.Message.ToolCall.t()
        | LangChain.Message.ToolResult.t()
        | LangChain.Message.ContentPart.t(),
        message :: [map()],
        LangChain.ChatModels.ChatModel.tools()
      ) :: %{required(atom()) => any()}
Return the params formatted for an API request.
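A sketch of building the request params directly. It assumes for_api/3 accepts LangChain.Message structs for the message list, as with the other chat models in this library:

    alias LangChain.ChatModels.ChatOrq
    alias LangChain.Message

    {:ok, chat} = ChatOrq.new(%{key: "my-deployment"})

    # Returns a plain map ready to be JSON-encoded, including the required
    # `key` field and the converted messages.
    params = ChatOrq.for_api(chat, [Message.new_user!("Hello!")], [])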
@spec new(attrs :: map()) :: {:ok, t()} | {:error, Ecto.Changeset.t()}
Set up a ChatOrq client configuration.
Set up a ChatOrq client configuration and return it, or raise an error if it is invalid.
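A short sketch of the two constructors; the bang variant name (new!/1) is assumed from the raise-on-invalid description:

    alias LangChain.ChatModels.ChatOrq

    # Returns {:ok, %ChatOrq{}} or {:error, %Ecto.Changeset{}}.
    {:ok, chat} = ChatOrq.new(%{key: "my-deployment"})

    # Raises when the attributes are invalid.
    chat = ChatOrq.new!(%{key: "my-deployment"})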
Restores the model from the config.
@spec retry_on_fallback?(LangChain.LangChainError.t()) :: boolean()
Determine if an error should be retried. If true, a fallback LLM may be used. If false, the error is understood to be a more fundamental problem with the request rather than a service issue, and the call should not be retried or fall back to another service.
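An illustrative check, assuming LangChain.LangChainError exposes a message field; the error value is fabricated for the example:

    alias LangChain.ChatModels.ChatOrq
    alias LangChain.LangChainError

    # A transient, service-side failure is a candidate for retrying on a
    # fallback LLM; a malformed request is not.
    error = %LangChainError{message: "timeout"}

    if ChatOrq.retry_on_fallback?(error) do
      # switch to the fallback model here
    end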
Generate a config map that can later restore the model's configuration.
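A round-trip sketch; serialize_config/1 and restore_from_map/1 are assumed names, following the descriptions above and the pattern used by the other chat models in this library:

    alias LangChain.ChatModels.ChatOrq

    {:ok, chat} = ChatOrq.new(%{key: "my-deployment"})

    # Persist the configuration (for example, to a database) ...
    config = ChatOrq.serialize_config(chat)

    # ... and later restore a working struct from it.
    {:ok, restored} = ChatOrq.restore_from_map(config)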