# gleamstral
A Gleam client library for the Mistral AI API, providing type-safe access to Mistral’s powerful language models, embeddings, and agents.
## Overview
Gleamstral enables Gleam applications to seamlessly integrate with Mistral AI’s capabilities, including:
- Chat completions with various Mistral models (Large, Small, Ministral, etc.)
- Function/tool calling support
- Embeddings generation
- Agent API integration
- Image analysis with Pixtral models
Further documentation can be found at https://hexdocs.pm/gleamstral.
## Installation

```sh
gleam add gleamstral
```
## Getting an API key
You can get an API key for free from Mistral La Plateforme.
## Quick Example
```gleam
import gleam/httpc
import gleam/io
import gleam/list
import gleamstral/chat
import gleamstral/client
import gleamstral/message
import gleamstral/model

pub fn main() {
  // Create a Mistral client with your API key
  let client = client.new("your-mistral-api-key")

  // Set up a chat client and adjust its temperature
  let chat_client = chat.new(client) |> chat.set_temperature(0.7)

  // Define one or more messages to send to the model
  let messages = [
    message.UserMessage(message.TextContent("Explain briefly what Gleam is")),
  ]

  // Send the request and decode the model's response
  let assert Ok(response) =
    chat_client
    |> chat.complete_request(model.MistralSmall, messages)
    |> httpc.send
  let assert Ok(response) = chat.handle_response(response)

  let assert Ok(choice) = list.first(response.choices)
  let assert message.AssistantMessage(content, _, _) = choice.message

  io.println("Response: " <> content) // "Gleam is a very cool language [...]"
}
```
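The example above uses `let assert` for brevity, which crashes the program on any failure. In application code you may prefer to pattern match on each `Result` instead. A minimal sketch (the `print_reply` helper is hypothetical, and the exact error type returned by `chat.handle_response` may render differently under `string.inspect`):

```gleam
import gleam/io
import gleam/list
import gleam/string
import gleamstral/chat
import gleamstral/message

// Hypothetical helper: print the assistant's reply, or a diagnostic on failure
pub fn print_reply(http_response) {
  case chat.handle_response(http_response) {
    Ok(completion) ->
      case list.first(completion.choices) {
        Ok(choice) ->
          case choice.message {
            message.AssistantMessage(content, _, _) -> io.println(content)
            _ -> io.println("Unexpected message variant")
          }
        Error(Nil) -> io.println("The model returned no choices")
      }
    Error(error) -> io.println("Request failed: " <> string.inspect(error))
  }
}
```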
## Key Features
### Chat Completions with Vision
Gleamstral supports multimodal inputs, allowing you to send both text and images to vision-enabled models like Pixtral:
```gleam
// Create a message containing both text and an image URL
let messages = [
  message.UserMessage(
    message.MultiContent([
      message.Text("What's in this image?"),
      message.ImageUrl("https://gleam.run/images/lucy/lucy.svg"),
    ]),
  ),
]

// Use a Pixtral model for image analysis
let assert Ok(response) =
  chat.new(client)
  |> chat.complete_request(model.PixtralLarge, messages)
  |> httpc.send
let assert Ok(response) = chat.handle_response(response)

// Get the first choice from the response
let assert Ok(choice) = list.first(response.choices)
let assert message.AssistantMessage(content, _, _) = choice.message

io.println("Response: " <> content) // "This is a picture of the cute star Lucy"
```
### Agent API
Access Mistral’s Agent API to utilize pre-configured agents for specific tasks:
```gleam
// Get your agent ID from the Mistral console
let agent_id = "your-agent-id"

// Call the agent with your agent ID and messages
let assert Ok(response) =
  agent.new(client)
  |> agent.complete_request(agent_id, messages)
  |> httpc.send
let assert Ok(response) = agent.handle_response(response)
```
### Embeddings Generation
Generate vector embeddings for text to enable semantic search, clustering, or other vector operations:
```gleam
// Generate embeddings for a text input
let assert Ok(response) =
  embeddings.new(client)
  |> embeddings.create_request(model.MistralEmbed, ["Your text to embed"])
  |> httpc.send
let assert Ok(response) = embeddings.handle_response(response)
```
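Once you have the embedding vectors, the vector operations mentioned above are plain list folds. As an illustration (this helper is not part of gleamstral), cosine similarity between two equal-length vectors could be written as:

```gleam
import gleam/float
import gleam/list

// Cosine similarity between two equal-length embedding vectors.
// Purely illustrative; not part of the gleamstral API.
pub fn cosine_similarity(a: List(Float), b: List(Float)) -> Float {
  let dot =
    list.map2(a, b, fn(x, y) { x *. y })
    |> list.fold(0.0, fn(acc, v) { acc +. v })
  let norm = fn(v) {
    let sum_of_squares = list.fold(v, 0.0, fn(acc, x) { acc +. x *. x })
    // Safe: a sum of squares is never negative
    let assert Ok(n) = float.square_root(sum_of_squares)
    n
  }
  dot /. { norm(a) *. norm(b) }
}
```

Two texts with similar meaning should yield embeddings whose cosine similarity is close to 1.0.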
### Tool/Function Calling
Define tools that the model can use to call functions in your application:
```gleam
// Define a tool with its parameter schema
let weather_tool =
  tool.new(
    "get_weather",
    "Get the current weather in a location",
    [#("location", String, True, "The city and country, e.g. Seoul, South Korea")],
  )

// Attach the tool to a chat client and send the request
let assert Ok(response) =
  chat.new(client)
  |> chat.set_tools([weather_tool])
  |> chat.complete_request(model.MistralSmall, messages)
  |> httpc.send
let assert Ok(response) = chat.handle_response(response)
```
## Examples
The `examples/` directory contains several ready-to-use examples demonstrating the library's capabilities:

- `agent.gleam`: Shows how to use the Mistral Agent API
- `basic_tool.gleam`: Demonstrates tool/function calling functionality
- `embeddings.gleam`: Illustrates how to generate and use embeddings
- `image_analysis.gleam`: Shows how to perform image analysis with Pixtral models
- `json_object.gleam`: Example of structured JSON output from the model
To run any example:
```sh
cd examples
gleam run -m example_name  # e.g., gleam run -m agent
```
Note: You'll need to set your Mistral API key in an `.env` file or as an environment variable.
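For example, assuming the examples read a variable named `MISTRAL_API_KEY` (the variable name here is an assumption; check the example source for the exact name), the `.env` file could look like:

```
# .env — variable name is an assumption; check the example source
MISTRAL_API_KEY=your-mistral-api-key
```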
## Roadmap
- Decouple the HTTP client from the library
- Add support and example for streaming responses
- Add support for structured outputs (JSON, JSON Schema, etc.)
- Add more tests and documentation
## Contributing
Contributions are welcome! Please open an issue or submit a pull request.
## License
This project is licensed under the MIT License.