LLMAgent.Providers.OpenAI (llm_agent v0.2.0)
OpenAI provider implementation for LLMAgent.
This module implements the LLM provider interface for OpenAI, handling API calls, response parsing, and error handling specific to OpenAI's API.
Summary
Functions
Sends a completion request to OpenAI's API.
Generates embeddings for the provided text using OpenAI's API.
Functions
Sends a completion request to OpenAI's API.
Parameters
params - A map with parameters for the request:
model - The model to use (e.g., "gpt-4")
messages - The conversation history
tools - Available tools for function calling
temperature - Controls randomness (0.0 to 2.0)
max_tokens - Maximum tokens to generate
Returns
{:ok, response} on success, {:error, reason} on failure.
Examples
iex> params = %{
...> model: "gpt-4",
...> messages: [%{role: "user", content: "Hello"}],
...> max_tokens: 500
...> }
iex> {:ok, response} = LLMAgent.Providers.OpenAI.completion(params)
iex> is_map(response)
true
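Because completion/1 returns a tagged tuple, callers typically branch on the result with case. A minimal sketch of a caller (the exact shape of response and the possible error reasons are assumptions here, not guarantees of this module):

```elixir
# Hypothetical caller: pattern-match the documented return values.
params = %{
  model: "gpt-4",
  messages: [%{role: "user", content: "Summarize this text."}],
  temperature: 0.2,
  max_tokens: 500
}

case LLMAgent.Providers.OpenAI.completion(params) do
  {:ok, response} ->
    # `response` is a map; its exact shape is provider-specific.
    IO.inspect(response, label: "completion")

  {:error, reason} ->
    # e.g. a rate limit, network failure, or invalid model name
    IO.inspect(reason, label: "completion failed")
end
```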
Generates embeddings for the provided text using OpenAI's API.
Parameters
params - A map with parameters for the request:
model - The embedding model to use (e.g., "text-embedding-ada-002")
input - The text to generate embeddings for
Returns
{:ok, embeddings} on success, {:error, reason} on failure.
Examples
iex> params = %{
...> model: "text-embedding-ada-002",
...> input: "Hello, world!"
...> }
iex> {:ok, embeddings} = LLMAgent.Providers.OpenAI.embedding(params)
iex> is_list(embeddings)
true
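A common use of embeddings is comparing two texts by cosine similarity. A minimal sketch, assuming embedding/1 returns a flat list of floats for a single input string (an assumption; the actual return shape may nest one vector per input):

```elixir
# Hypothetical helper: embed a string and return its vector.
embed = fn text ->
  {:ok, vec} =
    LLMAgent.Providers.OpenAI.embedding(%{
      model: "text-embedding-ada-002",
      input: text
    })

  vec
end

# Cosine similarity between two equal-length float lists.
cosine = fn a, b ->
  dot = Enum.zip(a, b) |> Enum.map(fn {x, y} -> x * y end) |> Enum.sum()
  norm = fn v -> :math.sqrt(Enum.sum(Enum.map(v, &(&1 * &1)))) end
  dot / (norm.(a) * norm.(b))
end

cosine.(embed.("cat"), embed.("kitten"))
```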