Omni.Providers.Ollama (Omni v0.1.1)

Provider implementation for Ollama, using the Ollama Chat API. Use this Provider to chat with pretty much any local and open model (Llama 3, Mistral, Gemma, and many more).

Base URL

By default the Ollama Provider uses the base URL of "http://localhost:11434/api". If you need to change this, pass the :base_url option to Omni.init/2:

iex> Omni.init(:ollama, base_url: "https://ollama.mydomain.com/api")
%Omni.Provider{mod: Omni.Providers.Ollama, req: %Req.Request{}}

Functions

Returns the schema for this Provider.

Schema

  • :model (String.t/0) - Required. The Ollama model name.

  • :messages (list of map/0) - Required. List of messages - used to keep a chat memory.

  • :format (String.t/0) - Set the expected format of the response (currently only "json").

  • :stream (boolean/0) - Whether to stream the response. The default value is false.

  • :keep_alive - How long to keep the model loaded in memory after the request (Ollama accepts durations such as "5m").

  • :options - Additional advanced model parameters.
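
To see how these options fit together, here is a minimal sketch of a complete chat request. It assumes Omni.generate/2 is the blocking request function and that it returns an {:ok, response} tuple with Ollama's chat payload passed through; the function name and the response shape are assumptions about the wider Omni API rather than anything this page documents.

iex> provider = Omni.init(:ollama)
iex> # assumed API: generate/2 takes the provider plus the schema options above
iex> Omni.generate(provider, model: "llama3", messages: [
...>   %{role: "user", content: "Why is the sky blue?"}
...> ])
{:ok, %{"message" => %{"role" => "assistant", "content" => "..."}}}

Setting stream: true (or using a streaming counterpart such as Omni.stream/2, if one exists) would deliver the response in chunks rather than as a single message.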