ReqLLM.Model (ReqLLM v1.0.0-rc.5)
Represents an AI model configuration for ReqLLM.
This module provides a simplified model structure focused on essential fields needed for AI interactions: provider information, model name, and runtime parameters like temperature and token limits.
Examples
# Create a model with 3-tuple format (preferred)
{:ok, model} = ReqLLM.Model.from({:anthropic, "claude-3-5-sonnet", temperature: 0.7})
# Create a model with legacy 2-tuple format
{:ok, model} = ReqLLM.Model.from({:anthropic, model: "claude-3-5-sonnet", temperature: 0.7})
# Create a model from string specification
{:ok, model} = ReqLLM.Model.from("anthropic:claude-3-5-sonnet")
# Create a model directly
model = ReqLLM.Model.new(:anthropic, "claude-3-sonnet", temperature: 0.5, max_tokens: 1000)
Summary
Functions
Gets the default model for a provider spec.
Creates a model from various input formats.
Creates a model from input, raising an exception on error.
Loads full metadata from JSON files for enhanced model creation.
Creates a new model with the specified provider and model name.
Parses a provider string to a valid provider atom.
Validates that a model struct has required fields.
Returns a model with sensible defaults for missing metadata fields.
Loads a model with full metadata from the models_dev directory.
Types
@type limit() :: %{context: non_neg_integer(), output: non_neg_integer()}
@type modality() :: :text | :audio | :image | :video | :pdf
@type t() :: %ReqLLM.Model{
        capabilities: capabilities() | nil,
        cost: cost() | nil,
        limit: limit() | nil,
        max_retries: non_neg_integer() | nil,
        max_tokens: non_neg_integer() | nil,
        modalities: %{input: [modality()], output: [modality()]} | nil,
        model: String.t(),
        provider: atom()
      }
An AI model configuration
Functions
Gets the default model for a provider spec.
Falls back to the first available model if no default is specified.
Parameters
spec - Provider spec struct with :default_model and :models fields
Returns
The default model string, or nil if no models are available.
Examples
iex> spec = %{default_model: "gpt-4", models: %{"gpt-3.5" => %{}, "gpt-4" => %{}}}
iex> ReqLLM.Model.default_model(spec)
"gpt-4"
iex> spec = %{default_model: nil, models: %{"model-a" => %{}, "model-b" => %{}}}
iex> ReqLLM.Model.default_model(spec)
"model-a"
iex> spec = %{default_model: nil, models: %{}}
iex> ReqLLM.Model.default_model(spec)
nil
@spec from(t() | {atom(), String.t(), keyword()} | {atom(), keyword()} | String.t()) :: {:ok, t()} | {:error, term()}
Creates a model from various input formats.
Supports:
- Existing Model struct (returned as-is)
- 3-tuple format: {provider, model, opts} where provider is an atom, model is a string, and opts is a keyword list
- 2-tuple format (legacy): {provider, opts} where provider is an atom and opts is a keyword list with a :model key
- String format: "provider:model" (e.g., "anthropic:claude-3-5-sonnet")
Examples
# From existing struct
{:ok, model} = ReqLLM.Model.from(%ReqLLM.Model{provider: :anthropic, model: "claude-3-5-sonnet"})
# From 3-tuple format (preferred)
{:ok, model} = ReqLLM.Model.from({:anthropic, "claude-3-5-sonnet", max_tokens: 1000})
# From 2-tuple format (legacy support)
{:ok, model} = ReqLLM.Model.from({:anthropic, model: "claude-3-5-sonnet", max_tokens: 1000,
capabilities: %{tool_call: true}})
# From string specification
{:ok, model} = ReqLLM.Model.from("anthropic:claude-3-sonnet")
Creates a model from input, raising an exception on error.
Examples
iex> model = ReqLLM.Model.from!("anthropic:claude-3-haiku-20240307")
iex> {model.provider, model.model, model.max_tokens}
{:anthropic, "claude-3-haiku-20240307", 4096}
Loads full metadata from JSON files for enhanced model creation.
Delegates to ReqLLM.Model.Metadata.load_full_metadata/1.
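For illustration, a minimal usage sketch: it assumes the delegate is exposed as ReqLLM.Model.load_full_metadata/1, accepts a "provider:model" string, and returns {:ok, metadata} or {:error, reason}. None of those details are confirmed here.
# Assumed call shape and return tuple; illustrative only
case ReqLLM.Model.load_full_metadata("anthropic:claude-3-5-sonnet") do
  {:ok, metadata} -> metadata
  {:error, reason} -> {:error, reason}
end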
Creates a new model with the specified provider and model name.
Parameters
provider - The provider atom (e.g., :anthropic)
model - The model name string (e.g., "gpt-4", "claude-3-sonnet")
opts - Optional keyword list of parameters
Options
:max_tokens - Maximum tokens the model can generate (defaults to the model's output limit)
:max_retries - Maximum retry attempts (default: 3)
:limit - Token limits map with :context and :output keys
:modalities - Input/output modalities map with lists of supported types
:capabilities - Model capabilities like :reasoning, :tool_call, :temperature, :attachment
:cost - Pricing information with :input and :output cost per 1K tokens. Optional :cached_input cost per 1K tokens (defaults to the :input rate if not specified)
Examples
iex> ReqLLM.Model.new(:anthropic, "claude-3-5-sonnet")
%ReqLLM.Model{provider: :anthropic, model: "claude-3-5-sonnet", max_tokens: nil, max_retries: 3}
iex> ReqLLM.Model.new(:anthropic, "claude-3-sonnet", max_tokens: 1000)
%ReqLLM.Model{provider: :anthropic, model: "claude-3-sonnet", max_tokens: 1000, max_retries: 3}
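A fuller sketch of new/3 using the metadata options above. The :limit, :modalities, and :capabilities shapes follow this module's type definitions; the atom-keyed :cost map and all numeric values are illustrative assumptions, not published model data.
model =
  ReqLLM.Model.new(:anthropic, "claude-3-5-sonnet",
    max_tokens: 2048,
    limit: %{context: 200_000, output: 8192},
    modalities: %{input: [:text, :image], output: [:text]},
    capabilities: %{reasoning: false, tool_call: true, temperature: true, attachment: true},
    # Assumed cost shape: :input/:output (optional :cached_input) per 1K tokens
    cost: %{input: 3.0, output: 15.0}
  )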
Parses a provider string to a valid provider atom.
Delegates to ReqLLM.Metadata.parse_provider/1.
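A minimal usage sketch; the {:ok, atom} return shape shown here is an assumption based on the description above, not confirmed by these docs.
# Assumed return shape; illustrative only
{:ok, :anthropic} = ReqLLM.Model.parse_provider("anthropic")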
Validates that a model struct has required fields.
Examples
iex> model = %ReqLLM.Model{provider: :anthropic, model: "claude-3-5-sonnet", max_tokens: 4096, max_retries: 3}
iex> ReqLLM.Model.valid?(model)
true
iex> ReqLLM.Model.valid?(%{provider: :anthropic, model: "claude-3-5-sonnet"})
false
Returns a model with sensible defaults for missing metadata fields.
This helper fills in common defaults for models that don't have complete metadata.
Examples
iex> model = ReqLLM.Model.new(:anthropic, "claude-3-5-sonnet")
iex> ReqLLM.Model.with_defaults(model).capabilities
%{reasoning: false, tool_call: false, temperature: true, attachment: false}
Loads a model with full metadata from the models_dev directory.
This is useful for capability verification and other scenarios requiring detailed model information beyond what's needed for API calls.
Examples
{:ok, model_with_metadata} = ReqLLM.Model.with_metadata("anthropic:claude-3-sonnet")
model_with_metadata.cost
#=> %{"input" => 3.0, "output" => 15.0, ...}