LangChain.Tools.DeepResearchClient (LangChain v0.4.0)
HTTP client for OpenAI Deep Research API.
This module handles the three main operations for OpenAI Deep Research:
- Creating a research request
- Checking request status
- Retrieving completed results
Uses the same authentication and HTTP client patterns as other OpenAI integrations
in LangChain, reusing the existing :openai_key configuration.
Configuration
The endpoint can be customized for alternative providers (Azure, OpenRouter, etc.):
# In config.exs or runtime.exs
config :langchain, :deep_research_endpoint, "https://custom-endpoint.com/v1/responses"

If not configured, it defaults to OpenAI's standard endpoint.
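Since the module reuses the existing :openai_key configuration, the API key is set the same way as for other OpenAI integrations. A minimal sketch (the exact config shape is an assumption based on the :openai_key convention mentioned above):

```
# config/runtime.exs
import Config

# Reuse the same key as the other OpenAI integrations in LangChain.
config :langchain, :openai_key, System.fetch_env!("OPENAI_API_KEY")
```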
Summary
Functions
Checks the status of a deep research request.
Creates a new deep research request.
Retrieves the results of a completed deep research request.
Functions
Checks the status of a deep research request.
Parameters
- request_id: The ID returned from create_research/2
- endpoint: Optional custom endpoint URL
Returns
- {:ok, status_map} containing status information
- {:error, reason} on failure
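A polling sketch for this function. The function name check_status/1 and the shape of the status map are assumptions, since only the return tuple is documented above:

```
alias LangChain.Tools.DeepResearchClient

# Hypothetical call: check on a previously created request.
case DeepResearchClient.check_status(request_id) do
  {:ok, %{"status" => "completed"}} -> :done
  {:ok, %{"status" => status}} -> {:in_progress, status}
  {:error, reason} -> {:error, reason}
end
```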
Creates a new deep research request.
Parameters
- query: The research question or topic
- options: Optional parameters including:
  - :model - The model to use (defaults to "o3-deep-research-2025-06-26")
  - :system_message - Optional guidance for research approach
  - :max_tool_calls - Maximum number of tool calls to make
  - :summary - Summary mode: "auto" or "detailed" (defaults to "auto")
  - :include_code_interpreter - Include code interpreter tool (defaults to true)
  - :endpoint - Custom endpoint URL (defaults to "https://api.openai.com/v1/responses")
Returns
- {:ok, request_id} on success
- {:error, reason} on failure
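A minimal usage sketch, assuming an :openai_key is configured; the query and option values are illustrative:

```
alias LangChain.Tools.DeepResearchClient

# Start a research run; returns quickly with a request ID rather
# than blocking until the research finishes.
{:ok, request_id} =
  DeepResearchClient.create_research(
    "What are the tradeoffs between LSM trees and B-trees?",
    model: "o3-deep-research-2025-06-26",
    summary: "auto",
    max_tool_calls: 50
  )
```

The asynchronous design matters here: deep research runs can take minutes, so the ID is returned immediately and the status/result functions are used to follow up.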
Retrieves the results of a completed deep research request.
Parameters
- request_id: The ID of the completed research request
- endpoint: Optional custom endpoint URL
Returns
- {:ok, result_map} containing the research findings and metadata
- {:error, reason} on failure
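The three operations compose into a create-poll-fetch loop. A sketch, where check_status/1 and get_results/1 are assumed names for the status and retrieval functions documented above, and the "status" key in the status map is also an assumption:

```
alias LangChain.Tools.DeepResearchClient

# Poll until the research completes, then fetch the results.
defp await_results(request_id) do
  case DeepResearchClient.check_status(request_id) do
    {:ok, %{"status" => "completed"}} ->
      DeepResearchClient.get_results(request_id)

    {:ok, _still_in_progress} ->
      Process.sleep(5_000)
      await_results(request_id)

    {:error, reason} ->
      {:error, reason}
  end
end
```

In production code this would typically add a timeout or maximum attempt count rather than recursing indefinitely.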