Rag.Ai.Codex (rag v0.3.4)
Codex/OpenAI provider implementation using codex_sdk.
This provider supports text generation with advanced reasoning capabilities and structured output. It does NOT support embeddings; use the Gemini provider for embeddings.
Note: This module is only available when codex_sdk is installed.
Examples
```elixir
# Basic text generation
provider = Codex.new(%{})
{:ok, response} = Codex.generate_text(provider, "Hello!", [])

# Structured output
{:ok, response} = Codex.generate_text(provider, "...", output_schema: schema)
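```

A fuller structured-output sketch. The shape of the schema and of the response are assumptions here: these docs do not specify the schema format, so a JSON-Schema-style map is used for illustration only.

```elixir
alias Rag.Ai.Codex

provider = Codex.new(%{})

# Assumed: a JSON-Schema-style map describing the expected output structure.
schema = %{
  type: "object",
  properties: %{
    answer: %{type: "string"},
    confidence: %{type: "number"}
  },
  required: ["answer"]
}

case Codex.generate_text(provider, "Summarize the document.", output_schema: schema) do
  {:ok, response} -> IO.inspect(response, label: "structured response")
  {:error, reason} -> IO.inspect(reason, label: "generation failed")
end
```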
Summary
Functions
Returns the cost per 1K tokens as {input_cost, output_cost} in USD.
Returns the maximum context window in tokens.
Returns whether this provider supports embeddings.
Returns whether this provider supports tool calling.
Functions
Returns the cost per 1K tokens as {input_cost, output_cost} in USD.
@spec max_context_tokens() :: pos_integer()
Returns the maximum context window in tokens.
@spec supports_embeddings?() :: boolean()
Returns whether this provider supports embeddings.
@spec supports_tools?() :: boolean()
Returns whether this provider supports tool calling.
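The introspection functions above can be combined to guard against unsupported operations and to estimate request cost before calling the provider. A minimal sketch: the name `cost_per_1k_tokens/0` is an assumption (its @spec is missing from this excerpt; only the description survives), and the token counts are illustrative.

```elixir
alias Rag.Ai.Codex

# Guard: this provider does not do embeddings, so route those elsewhere.
unless Codex.supports_embeddings?() do
  IO.puts("Codex has no embeddings; use the Gemini provider for embedding calls")
end

# Estimate cost in USD from token counts.
# Assumed function name: cost_per_1k_tokens/0 (returns {input_cost, output_cost}).
{input_cost, output_cost} = Codex.cost_per_1k_tokens()
input_tokens = 1_200
output_tokens = 300
estimate = input_tokens / 1000 * input_cost + output_tokens / 1000 * output_cost
IO.puts("estimated request cost: $#{Float.round(estimate, 4)}")
```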