glopenai


A sans-IO OpenAI API client for Gleam, ported from the Rust async-openai library.

Sans-IO means glopenai builds HTTP requests and parses HTTP responses, but never sends or receives anything itself. You bring your own HTTP client (gleam_httpc, hackney, fetch, etc.) and call it between the request and response functions. This makes the library transport-agnostic, easy to test, and free of runtime dependencies beyond the Gleam standard library.

Installation

gleam add glopenai

Quick start

import gleam/httpc
import gleam/io
import gleam/option.{Some}
import glopenai/chat
import glopenai/config

pub fn main() {
  let cfg = config.new(api_key: "sk-...")

  // 1. Build the request
  let request =
    chat.new_create_request(model: "gpt-4o-mini", messages: [
      chat.system_message("You are a helpful assistant."),
      chat.user_message("What is the capital of France?"),
    ])
    |> chat.with_max_completion_tokens(256)

  let http_request = chat.create_request(cfg, request)

  // 2. Send it with any HTTP client
  let assert Ok(http_response) = httpc.send(http_request)

  // 3. Parse the response
  let assert Ok(response) = chat.create_response(http_response)

  case response.choices {
    [choice, ..] ->
      case choice.message.content {
        Some(content) -> io.println(content)
        _ -> Nil
      }
    _ -> Nil
  }
}

Every API module follows the same *_request / *_response pattern. Multipart endpoints (file uploads, upload parts) return Request(BitArray) instead of Request(String) – use httpc.send_bits for those.
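For instance, a multipart file upload differs from the chat flow only in the send step. This is a hedged sketch: `file.create_request` and `file.create_response` are assumed names following the `*_request` / `*_response` pattern described above, not confirmed signatures; check the generated module docs for the real builders.

```gleam
import gleam/httpc
import glopenai/file

// Multipart endpoints build a Request(BitArray) instead of a
// Request(String)...
let http_request = file.create_request(cfg, upload)

// ...so they are sent with httpc.send_bits rather than httpc.send.
let assert Ok(http_response) = httpc.send_bits(http_request)

// Response parsing is unchanged: feed the raw response back in.
let assert Ok(response) = file.create_response(http_response)
```

Everything else — config, error handling, the request/response split — stays the same as in the quick start.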

Origin

glopenai is based on async-openai, the most complete OpenAI client library for Rust. Types, field names, and API coverage were mapped as faithfully as possible from the Rust source, adapted to Gleam conventions (custom types instead of serde enums, Result instead of panics, builder functions instead of derive_builder macros).

API coverage

Available now

| Module | API | Highlights |
| --- | --- | --- |
| chat | Chat Completions | Messages, tools, streaming, web search, structured output |
| response | Responses API | 25 input item types, 20 output types, 48 stream events, tools |
| model | Models | List, retrieve, delete |
| embedding | Embeddings | String, array, token, and multi-input variants |
| moderation | Moderations | Text, image, and multi-modal input |
| image | Image Generation | 8 sizes, 5 models, URL and base64 responses |
| audio | Audio (TTS) | 13 voices, 6 output formats |
| file | Files | List, retrieve, delete, content, and multipart upload |
| completion | Completions (legacy) | 4 prompt variants, logprobs, streaming |
| fine_tuning | Fine-tuning | Jobs, events, checkpoints, DPO/reinforcement methods |
| batch | Batch API | Create, retrieve, cancel, list, JSONL helpers |
| vector_store | Vector Stores | Stores, files, batches, search with filters, chunking strategies |
| chatkit | ChatKit | Sessions, threads, items |
| upload | Uploads | Create, add part (multipart), complete, cancel |
| webhook | Webhooks | 15 event types, HMAC-SHA256 signature verification |
| config | Configuration | OpenAI and Azure endpoints, custom headers |
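Because every module above shares the request/response shape, switching endpoints is mostly a matter of swapping the module. A sketch for embeddings, assuming the `embedding` module mirrors `chat` (the builder names `new_create_request`, `create_request`, and `create_response` are assumptions based on that pattern, not verified API):

```gleam
import gleam/httpc
import glopenai/config
import glopenai/embedding

pub fn embed(cfg: config.Config) {
  // 1. Build the request (hypothetical builder name).
  let request =
    embedding.new_create_request(
      model: "text-embedding-3-small",
      input: "Hello, world",
    )
  let http_request = embedding.create_request(cfg, request)

  // 2. Send with any HTTP client.
  let assert Ok(http_response) = httpc.send(http_request)

  // 3. Parse the response.
  embedding.create_response(http_response)
}
```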

Not yet ported

| Module | Notes |
| --- | --- |
| Assistants | Deprecated upstream but still used; large surface area |
| Video (Sora) | Create, edit, extend, remix |
| Containers | Container management |
| Skills | Skill definitions |
| Evals | Evaluation framework |
| Admin | Users, projects, API keys, audit logs, invites, roles, usage, and more |
| Realtime | WebSocket-based; needs a different transport abstraction |
| Audio transcription/translation | Waiting on multipart integration |
| Image edit/variation | Waiting on multipart integration |
| SSE parsing helper | Per-module stream parsers exist; generic helper planned |

Compatible APIs

glopenai works with any API that follows the OpenAI HTTP contract. Use config.with_api_base to point at a different endpoint:

let cfg =
  config.new(api_key: "...")
  |> config.with_api_base("http://localhost:11434/v1")  // Ollama

Azure OpenAI is also supported via config.new_azure.
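A sketch of the Azure setup. The parameter list of config.new_azure is an assumption here (Azure OpenAI typically requires a resource endpoint, a deployment name, and an API version); consult the config module docs for the actual signature.

```gleam
import glopenai/config

// Hypothetical parameters -- verify against the config module docs.
let cfg =
  config.new_azure(
    api_key: "...",
    api_base: "https://my-resource.openai.azure.com",
    deployment_id: "my-gpt-4o-deployment",
    api_version: "2024-06-01",
  )
```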

Dependencies

gleam_stdlib >= 0.44.0
gleam_json   >= 3.1.0
gleam_http   >= 4.3.0

No HTTP client dependency. Bring your own.

Development

gleam build   # Compile
gleam test    # Run the test suite

Runnable examples live in dev/example/. Run them with:

OPENAI_API_KEY=sk-... gleam run -m example/chat

License

MIT
