Examples demonstrating Ollixir features, designed to run against a real Ollama server.

Run everything

./examples/run_all.sh

The runner will:

  • Verify the Ollama server is reachable (default: http://localhost:11434).
  • Pull required models if the ollama CLI is available (llama3.2, nomic-embed-text, llava, deepseek-r1:1.5b, codellama:7b-code).
  • Run every example except the MCP stdio server, including the interactive chat history example and the LiveView module compile check.
  • Skip cloud/web examples if OLLAMA_API_KEY is not set (invalid keys report 401/403).
  • Print a prompt instead of running the multimodal examples if a compatible model (llava) is not installed.
  • Run the thinking examples against deepseek-r1:1.5b, which supports think.
  • Skip custom model creation unless RUN_CREATE_MODEL=1 (it writes a new local model).
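The reachability check in the first step can be sketched with a plain curl against Ollama's /api/tags endpoint (the OLLAMA_HOST override variable is an assumption here; run_all.sh may use a different mechanism):

```shell
#!/bin/sh
# Preflight: confirm an Ollama server answers before running any examples.
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"

if curl -fsS --max-time 2 "$OLLAMA_HOST/api/tags" >/dev/null 2>&1; then
  echo "Ollama reachable at $OLLAMA_HOST"
else
  echo "Ollama not reachable at $OLLAMA_HOST (is 'ollama serve' running?)"
fi
```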

Run ./examples/run_all.sh --help to see optional skips.

Run a single example

elixir examples/basic/chat.exs

Example map

Basic operations

Streaming

Conversations

Tools / Function Calling

Structured Output

Multimodal

Thinking Mode

Embeddings

Model Management

Web (Cloud API)

MCP (Model Context Protocol)

  • mcp_server.exs - MCP stdio server exposing web_search/web_fetch (requires an API key). Works with any MCP client that supports stdio (Cursor, Claude Desktop, Cline, Continue, Open WebUI).
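Wiring the server into a client typically means registering it as a stdio command. A hedged sketch of such a config, following the mcpServers convention used by Claude Desktop and Cursor — the file path and server name are assumptions; adjust the path to wherever mcp_server.exs lives in this repo:

```json
{
  "mcpServers": {
    "ollixir-web": {
      "command": "elixir",
      "args": ["examples/mcp_server.exs"],
      "env": { "OLLAMA_API_KEY": "your-key-here" }
    }
  }
}
```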

HuggingFace Integration

Advanced Patterns

Requirements

  • Elixir 1.15+
  • Ollama installed and running (ollama serve)
  • Models pulled (ollama pull llama3.2, ollama pull nomic-embed-text, ollama pull llava, ollama pull deepseek-r1:1.5b, ollama pull codellama:7b-code)
  • For web examples: create a key at https://ollama.com/settings/keys and set OLLAMA_API_KEY
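The skip behavior for the web examples amounts to a presence check on that environment variable; a minimal sketch:

```shell
#!/bin/sh
# Cloud/web examples need a key from https://ollama.com/settings/keys.
if [ -z "${OLLAMA_API_KEY:-}" ]; then
  echo "OLLAMA_API_KEY not set: skipping web/cloud examples"
else
  echo "OLLAMA_API_KEY set: running web/cloud examples"
fi
```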

To pull everything at once:

./examples/install_models.sh
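install_models.sh is not reproduced here, but given the model list in the requirements it presumably amounts to a pull loop; a guarded sketch (the model names come from the list above, everything else is an assumption):

```shell
#!/bin/sh
# Pull each model the examples need; degrade gracefully if the CLI is missing.
MODELS="llama3.2 nomic-embed-text llava deepseek-r1:1.5b codellama:7b-code"

for model in $MODELS; do
  if command -v ollama >/dev/null 2>&1; then
    ollama pull "$model"
  else
    echo "ollama CLI not found; pull $model manually"
  fi
done
```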