ExOpenAI.Threads (ex_openai.ex v1.8.0)
Modules for interacting with the threads group of OpenAI APIs
API Reference: https://platform.openai.com/docs/api-reference/threads
Summary
Functions
Cancels a run that is in_progress.
Create a message.
Create a run.
Create a thread.
Create a thread and run it in one request.
Deletes a message.
Delete a thread.
Retrieve a message.
Retrieves a run.
Retrieves a run step.
Retrieves a thread.
Returns a list of messages for a given thread.
Returns a list of run steps belonging to a run.
Returns a list of runs belonging to a thread.
Modifies a message.
Modifies a run.
Modifies a thread.
When a run has the status "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request.
Functions
@spec cancel_run(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.RunObject.t()} | {:error, any()}
Cancels a run that is in_progress.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/cancel
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
run_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec create_message(String.t(), [map()] | String.t(), :assistant | :user, base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t(), metadata: ExOpenAI.Components.Metadata.t(), attachments: [%{file_id: String.t(), tools: [map()]}], stream_to: (... -> any()) | pid() ) :: {:ok, ExOpenAI.Components.MessageObject.t()} | {:error, any()}
Create a message.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/messages
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
content
role: The role of the entity that is creating the message. Allowed values include:
user: Indicates the message is sent by an actual user and should be used in most cases to represent user-generated messages.
assistant: Indicates the message is generated by the assistant. Use this value to insert messages from the assistant into the conversation.
Optional Arguments:
stream_to: "PID or function of where to stream content to"
attachments: "A list of files attached to the message, and the tools they should be added to."
metadata
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec create_run(String.t(), String.t(),
        base_url: String.t(),
        openai_organization_key: String.t(),
        openai_api_key: String.t(),
        truncation_strategy: nil | ExOpenAI.Components.TruncationObject.t(),
        top_p: float(),
        tools: [map()],
        tool_choice: nil | ExOpenAI.Components.AssistantsApiToolChoiceOption.t(),
        temperature: float(),
        stream: boolean(),
        response_format: ExOpenAI.Components.AssistantsApiResponseFormatOption.t(),
        reasoning_effort: ExOpenAI.Components.ReasoningEffort.t(),
        parallel_tool_calls: ExOpenAI.Components.ParallelToolCalls.t(),
        model: ExOpenAI.Components.AssistantSupportedModels.t() | String.t(),
        metadata: ExOpenAI.Components.Metadata.t(),
        max_prompt_tokens: integer(),
        max_completion_tokens: integer(),
        instructions: String.t(),
        additional_messages: [ExOpenAI.Components.CreateMessageRequest.t()],
        additional_instructions: String.t(),
        "include[]": list(),
        stream_to: (... -> any()) | pid()
      ) :: {:ok, ExOpenAI.Components.RunObject.t()} | {:error, any()}
Create a run.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
assistant_id: The ID of the assistant to use to execute this run.
Optional Arguments:
stream_to: "PID or function of where to stream content to"
include[]
additional_instructions: "Appends additional instructions at the end of the instructions for the run. This is useful for modifying the behavior on a per-run basis without overriding other instructions."
additional_messages: "Adds additional messages to the thread before creating the run."
instructions: "Overrides the instructions of the assistant. This is useful for modifying the behavior on a per-run basis."
max_completion_tokens: "The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status incomplete. See incomplete_details for more info."
max_prompt_tokens: "The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status incomplete. See incomplete_details for more info."
metadata
model: "The ID of the Model to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used."
parallel_tool_calls
reasoning_effort
response_format
stream: "If true, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a data: [DONE] message."
temperature: "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic."
Example: 1
tool_choice
tools: "Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis."
top_p: "An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or temperature but not both."
Example: 1
truncation_strategy
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec create_thread( base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.ThreadObject.t()} | {:error, any()}
Create a thread.
Endpoint: https://api.openai.com/v1/threads
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec create_thread_and_run(String.t(),
        base_url: String.t(),
        openai_organization_key: String.t(),
        openai_api_key: String.t(),
        truncation_strategy: nil | ExOpenAI.Components.TruncationObject.t(),
        top_p: float(),
        tools: [map()],
        tool_resources: %{
          code_interpreter: %{file_ids: [String.t()]},
          file_search: %{vector_store_ids: [String.t()]}
        },
        tool_choice: nil | ExOpenAI.Components.AssistantsApiToolChoiceOption.t(),
        thread: ExOpenAI.Components.CreateThreadRequest.t(),
        temperature: float(),
        stream: boolean(),
        response_format: ExOpenAI.Components.AssistantsApiResponseFormatOption.t(),
        parallel_tool_calls: ExOpenAI.Components.ParallelToolCalls.t(),
        model:
          (:"gpt-3.5-turbo-16k-0613" | :"gpt-3.5-turbo-0125" | :"gpt-3.5-turbo-1106"
           | :"gpt-3.5-turbo-0613" | :"gpt-3.5-turbo-16k" | :"gpt-3.5-turbo"
           | :"gpt-4-32k-0613" | :"gpt-4-32k-0314" | :"gpt-4-32k" | :"gpt-4-0613"
           | :"gpt-4-0314" | :"gpt-4" | :"gpt-4-vision-preview" | :"gpt-4-1106-preview"
           | :"gpt-4-turbo-preview" | :"gpt-4-0125-preview" | :"gpt-4-turbo-2024-04-09"
           | :"gpt-4-turbo" | :"gpt-4.5-preview-2025-02-27" | :"gpt-4.5-preview"
           | :"gpt-4o-mini-2024-07-18" | :"gpt-4o-mini" | :"gpt-4o-2024-05-13"
           | :"gpt-4o-2024-08-06" | :"gpt-4o-2024-11-20" | :"gpt-4o")
          | String.t(),
        metadata: ExOpenAI.Components.Metadata.t(),
        max_prompt_tokens: integer(),
        max_completion_tokens: integer(),
        instructions: String.t(),
        stream_to: (... -> any()) | pid()
      ) :: {:ok, ExOpenAI.Components.RunObject.t()} | {:error, any()}
Create a thread and run it in one request.
Endpoint: https://api.openai.com/v1/threads/runs
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
assistant_id: The ID of the assistant to use to execute this run.
Optional Arguments:
stream_to: "PID or function of where to stream content to"
instructions: "Override the default system message of the assistant. This is useful for modifying the behavior on a per-run basis."
max_completion_tokens: "The maximum number of completion tokens that may be used over the course of the run. The run will make a best effort to use only the number of completion tokens specified, across multiple turns of the run. If the run exceeds the number of completion tokens specified, the run will end with status incomplete. See incomplete_details for more info."
max_prompt_tokens: "The maximum number of prompt tokens that may be used over the course of the run. The run will make a best effort to use only the number of prompt tokens specified, across multiple turns of the run. If the run exceeds the number of prompt tokens specified, the run will end with status incomplete. See incomplete_details for more info."
metadata
model: "The ID of the Model to be used to execute this run. If a value is provided here, it will override the model associated with the assistant. If not, the model associated with the assistant will be used."
parallel_tool_calls
response_format
stream: "If true, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a data: [DONE] message."
temperature: "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic."
Example: 1
thread
tool_choice
tool_resources: "A set of resources that are used by the assistant's tools. The resources are specific to the type of tool. For example, the code_interpreter tool requires a list of file IDs, while the file_search tool requires a list of vector store IDs."
tools: "Override the tools the assistant can use for this run. This is useful for modifying the behavior on a per-run basis."
top_p: "An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or temperature but not both."
Example: 1
truncation_strategy
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec delete_message(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.DeleteMessageResponse.t()} | {:error, any()}
Deletes a message.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}
Method: DELETE
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
message_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec delete_thread(String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.DeleteThreadResponse.t()} | {:error, any()}
Delete a thread.
Endpoint: https://api.openai.com/v1/threads/{thread_id}
Method: DELETE
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec get_message(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.MessageObject.t()} | {:error, any()}
Retrieve a message.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
message_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec get_run(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.RunObject.t()} | {:error, any()}
Retrieves a run.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
run_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec get_run_step(String.t(), String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t(), "include[]": list(), stream_to: (... -> any()) | pid() ) :: {:ok, ExOpenAI.Components.RunStepObject.t()} | {:error, any()}
Retrieves a run step.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/steps/{step_id}
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
run_id
step_id
Optional Arguments:
stream_to: "PID or function of where to stream content to"
include[]
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec get_thread(String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.ThreadObject.t()} | {:error, any()}
Retrieves a thread.
Endpoint: https://api.openai.com/v1/threads/{thread_id}
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec list_messages(String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t(), run_id: String.t(), before: String.t(), after: String.t(), order: String.t(), limit: integer(), stream_to: (... -> any()) | pid() ) :: {:ok, ExOpenAI.Components.ListMessagesResponse.t()} | {:error, any()}
Returns a list of messages for a given thread.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/messages
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
Optional Arguments:
stream_to: "PID or function of where to stream content to"
limit
order
after
before
run_id
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec list_run_steps(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t(), "include[]": list(), before: String.t(), after: String.t(), order: String.t(), limit: integer(), stream_to: (... -> any()) | pid() ) :: {:ok, ExOpenAI.Components.ListRunStepsResponse.t()} | {:error, any()}
Returns a list of run steps belonging to a run.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/steps
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
run_id
Optional Arguments:
stream_to: "PID or function of where to stream content to"
limit
order
after
before
include[]
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec list_runs(String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t(), before: String.t(), after: String.t(), order: String.t(), limit: integer(), stream_to: (... -> any()) | pid() ) :: {:ok, ExOpenAI.Components.ListRunsResponse.t()} | {:error, any()}
Returns a list of runs belonging to a thread.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs
Method: GET
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
Optional Arguments:
stream_to: "PID or function of where to stream content to"
limit
order
after
before
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec modify_message(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.MessageObject.t()} | {:error, any()}
Modifies a message.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
message_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec modify_run(String.t(), String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.RunObject.t()} | {:error, any()}
Modifies a run.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
run_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
@spec modify_thread(String.t(), base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t() ) :: {:ok, ExOpenAI.Components.ThreadObject.t()} | {:error, any()}
Modifies a thread.
Endpoint: https://api.openai.com/v1/threads/{thread_id}
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
Optional Arguments:
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"
submit_tool_ouputs_to_run(thread_id, run_id, tool_outputs, opts \\ [])
@spec submit_tool_ouputs_to_run( String.t(), String.t(), [%{output: String.t(), tool_call_id: String.t()}], base_url: String.t(), openai_organization_key: String.t(), openai_api_key: String.t(), stream: boolean(), stream_to: (... -> any()) | pid() ) :: {:ok, ExOpenAI.Components.RunObject.t()} | {:error, any()}
When a run has the status "requires_action" and required_action.type is submit_tool_outputs, this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request.
Endpoint: https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/submit_tool_outputs
Method: POST
Docs: https://platform.openai.com/docs/api-reference/threads
Required Arguments:
thread_id
run_id
tool_outputs: A list of tools for which the outputs are being submitted.
Optional Arguments:
stream_to: "PID or function of where to stream content to"
stream: "If true, returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state with a data: [DONE] message."
openai_api_key: "OpenAI API key to pass directly. If this is specified, it will override the api_key config value."
openai_organization_key: "OpenAI organization key to pass directly. If this is specified, it will override the organization_key config value."
base_url: "Which API endpoint to use as base, defaults to https://api.openai.com/v1"