This guide walks you through your first steps with the Ollixir Elixir client.
## Prerequisites

### Install Ollama

Download Ollama from https://ollama.com/download.
### Start Ollama Server

```shell
ollama serve
```

### Pull a Model

```shell
ollama pull llama3.2
```

Browse models at https://ollama.com/search.

For the examples in this guide, also pull:

```shell
ollama pull nomic-embed-text
ollama pull llava
ollama pull deepseek-r1:1.5b
```
## Add to Your Project

```elixir
# mix.exs
def deps do
  [{:ollixir, "~> 0.1.1"}]
end
```

## Your First Chat
Start an interactive session with `iex -S mix`, then:

```elixir
iex> client = Ollixir.init()
iex> {:ok, response} = Ollixir.chat(client,
...>   model: "llama3.2",
...>   messages: [%{role: "user", content: "Hello!"}]
...> )
iex> response["message"]["content"]
"Hello! How can I help you today?"
```

## Understanding Responses
Chat responses include:

| Field | Description |
|---|---|
| `message` | The assistant's response |
| `done` | Whether generation is complete |
| `model` | Model used |
| `total_duration` | Total time in nanoseconds |
| `eval_count` | Tokens generated |
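Because `total_duration` is reported in nanoseconds, you can combine it with `eval_count` to estimate generation speed. A minimal sketch, using a hypothetical response map (the values below are made up for illustration, not real Ollama output):

```elixir
# Hypothetical response map; the field names match the table above,
# but the values are invented for this example.
response = %{
  "model" => "llama3.2",
  "done" => true,
  "total_duration" => 2_400_000_000,  # nanoseconds
  "eval_count" => 120                 # tokens generated
}

# Convert nanoseconds to seconds, then divide tokens by elapsed time.
seconds = response["total_duration"] / 1_000_000_000
tokens_per_second = response["eval_count"] / seconds
IO.puts("#{Float.round(tokens_per_second, 1)} tokens/s")
# prints "50.0 tokens/s"
```

Real values will vary per run and per model; treat this as a rough throughput estimate rather than a benchmark.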
## Typed Responses (Optional)

If you prefer response structs instead of maps:

```elixir
{:ok, response} = Ollixir.chat(client,
  model: "llama3.2",
  messages: [%{role: "user", content: "Hello!"}],
  response_format: :struct
)

IO.puts(response.message.content)
```

## Next Steps
- Streaming Guide - Real-time responses
- Tool Use Guide - Function calling
- Thinking Mode - Reasoning and chain-of-thought
- Embeddings - Semantic search and RAG
- Multimodal - Vision and image analysis
- Cloud API - Web search and cloud models
- Ollama Server Setup - Local and cloud configuration
- Examples - Working code samples