# Changelog

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## 0.4.0

Major refactoring to make the codebase more modular and maintainable, and simplify the API.

### Breaking Changes

- Removed `:client` option from `create/1`
- Removed `Responses.Helpers` module - cost is now calculated automatically for models with known pricing
- Removed image helpers for now until a better API can be designed
- Removed `delete/2`, `get/2`, and `list_input_items/2` - use the low-level `request/1` function instead
- Removed `parse/2` - use `create/1` with a `:schema` option instead
- Removed `collect_stream/1`
- Changed schema syntax - replaced `Schema.object/2`, `Schema.string/1`, etc. with simple maps (see the sketch after this list)
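For a rough picture of the new syntax, a schema that would previously have been assembled with `Schema.object/2` and `Schema.string/1` is now written as a plain map. The type notation below is an illustrative assumption; see the `Schema` module docs for the authoritative format.

```elixir
# A plain map now stands in for the removed Schema.object/2 and
# Schema.string/1 builders. The type notation here is an illustrative
# assumption, not taken from the docs.
schema = %{
  name: :string,
  age: :number
}
```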
### Added

- `create/2` can be used to follow up on a previous response
- `create!/1` and `create!/2` raise an error if the response is not successful
- `run/2` and `run!/2` functions for automatic function calling loops (see the first sketch after this list)
  - Takes a list of available functions and automatically handles function calls
  - Continues conversation until a final response without function calls is received
  - Returns a list of all responses generated during the conversation
  - Supports both map and keyword list function definitions
- `list_models/0` and `list_models/1` to return the list of available OpenAI models
- `:schema` option to `create/1` to specify the schema of the response body (see the second sketch after this list)
- `Schema.build_output/1` to build the output schema for the response body
- `Schema.build_function/3` to build the function schema for tool calling
- `:stream` callback function option to `create/1` that will be called on every event (also shown in the second sketch)
- `json_events/1` helper to `Responses.Stream` for streaming JSON events with Jaxon
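First sketch: the automatic function calling loop. Only `run/2` itself comes from the list above; the top-level `OpenAI.Responses` module name, the `:input` option, the argument order, and the keys in the function definition (`:name`, `:parameters`, `:callback`) are assumptions made for illustration.

```elixir
alias OpenAI.Responses

# Hypothetical function definition; the keys run/2 actually expects
# (:name, :parameters, :callback, ...) are assumptions for illustration.
weather_fn = %{
  name: "get_weather",
  description: "Return the current weather for a city",
  parameters: %{city: :string},
  callback: fn %{"city" => city} -> "It is sunny in #{city}" end
}

# run/2 keeps the conversation going, executing requested function calls,
# until a response without function calls arrives, and returns every
# response generated along the way (argument order is assumed).
responses = Responses.run([input: "What's the weather in Paris?"], [weather_fn])
List.last(responses)
```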
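Second sketch: the `:schema` and `:stream` options on `create/1`. The `{:ok, response}` return shape is inferred from the new `create!/1` bang variant; the `:input` option name, the `parsed` field, and the event shape passed to the callback are assumptions.

```elixir
alias OpenAI.Responses

# Structured output via the new :schema option, using a map-based schema.
{:ok, response} =
  Responses.create(
    input: "Give me a user profile as JSON",
    schema: %{name: :string, age: :number}
  )

IO.inspect(response.parsed) # field name is an assumption

# Streaming via the new :stream callback, invoked on every event.
Responses.create(
  input: "Tell me a short story",
  stream: fn event ->
    # Event shape and expected return value are assumptions;
    # inspect events to see what actually arrives.
    IO.inspect(event)
  end
)
```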
### Changed

- `create/1` now returns a `Response` struct containing the generated response body, text, parsed JSON, and cost information (see the sketch after this list)
- `text_deltas/1` has moved to `Responses.Stream`
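The relocated `text_deltas/1` and the new `Response` struct might be used roughly as follows. The struct field names (`text`, `parsed`, `cost`), the `:input` option, and the existence of a `stream/1` counterpart to `create/1` are assumptions inferred from the descriptions above.

```elixir
alias OpenAI.Responses

{:ok, response} = Responses.create(input: "Say hello")

# Field names are inferred from the changelog wording, not from the docs.
IO.puts(response.text)      # generated text
IO.inspect(response.parsed) # parsed JSON (when a :schema was given)
IO.inspect(response.cost)   # cost info for models with known pricing

# text_deltas/1 now lives in Responses.Stream; it is assumed here to take
# an event stream and emit text chunks as they arrive.
Responses.stream(input: "Tell me a joke")
|> Responses.Stream.text_deltas()
|> Enum.each(&IO.write/1)
```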
## 0.3.0

### Breaking Changes

- Removed positional parameters from the `create`, `stream`, and `parse` functions. Use keyword lists or maps for parameters instead.
  Instead of `OpenAIResponses.create("model", "prompt")`, use `OpenAIResponses.create(model: "model", prompt: "prompt")`.
- Changed the parse response format. Instead of the parsed response, `parse` now returns a map with the following keys (see the sketch after this list):
  - `parsed`: the parsed result
  - `raw_response`: the raw response received from OpenAI, which can be used to e.g. calculate the cost
  - `token_usage`: information about the token usage
- Removed the Config module. Configuration handling has been streamlined. (Commit: 88b812e)
- Removed the parse_stream function. It was not working as intended and has been removed in this release. (Commit: 9723ccb)
- Removed the explicit instructions message in parse. This should now be handled by the developer. (Commit: f8addc0)
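Handling the changed `parse` return value might look like the sketch below. Only the three returned keys come from the list above; the module name is written as in the example above, and the arguments passed to `parse` (including the omitted schema) are illustrative assumptions.

```elixir
# The three keys come from the changelog entry above; the arguments are
# illustrative (a schema would also be required in practice).
%{parsed: parsed, raw_response: _raw, token_usage: usage} =
  OpenAIResponses.parse(
    model: "gpt-4o",
    prompt: "Extract the user's name as JSON"
  )

IO.inspect(parsed)
IO.inspect(usage)
```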
### Added

- Cost calculations for various models, including 4.1 models and chatgpt-4o-latest. (Commits: 2985d94, 70d68ad, 4f2cdc3)
- Option to log requests for debugging or monitoring purposes. (Commit: d4d56b5)
### Changed

- Increased the default timeout to 60 seconds. (Commit: 4a01b66)
### Fixed

- Fixed tests across multiple commits to ensure functionality and improve coverage. (Commits: c78858c, 18c535a, 804522e, c594688, d637a73)

### Documentation

- Updated documentation to reflect new features, response usage, and tutorial content. (Commits: d1bc204, f07d990, 03031e6, a4ac509)