API Reference llm_composer v0.8.0
Modules
LlmComposer
Responsible for interacting with a language model to perform chat-related operations, such as running completions and executing functions based on model responses. The module handles user messages, generates responses, and automatically executes functions as needed.
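A minimal usage sketch of the main module. The function name `simple_chat/2`, the `%LlmComposer.Settings{}` fields shown, and the `main_response` field on the result are assumptions about this library's API, not confirmed by this page; check the module docs before relying on them.

```elixir
# Hypothetical example: assumes LlmComposer.simple_chat/2 exists and that
# Settings accepts these fields; verify against the actual module docs.
settings = %LlmComposer.Settings{
  model: LlmComposer.Models.OpenAI,
  model_opts: [model: "gpt-4o-mini"],
  system_prompt: "You are a helpful assistant."
}

{:ok, response} = LlmComposer.simple_chat(settings, "Hello, what can you do?")

# Assumed shape of the parsed response struct.
IO.inspect(response.main_response)
```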
Cache behaviour that allows alternative cache implementations to be plugged in.
Basic ETS-backed cache implementation.
Custom error definitions.
Defines a struct for representing a callable function within the context of a language model interaction.
Helper struct for function call actions.
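The callable-function struct described above might be defined along these lines. The field names (`mod`, `name`, `function`, `description`, `schema`) are assumptions about the `LlmComposer.Function` struct; the JSON-schema shape follows common LLM function-calling conventions.

```elixir
# Hypothetical sketch: field names are assumptions, not confirmed by this page.
function = %LlmComposer.Function{
  mod: MyApp.Tools,            # module that implements the function
  function: :calculator,       # local function to invoke on a function call
  name: "calculator",
  description: "Evaluates a basic arithmetic expression",
  schema: %{
    type: "object",
    properties: %{
      expression: %{type: "string", description: "e.g. \"1 + 2\""}
    },
    required: ["expression"]
  }
}
```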
Provides helper functions for the LlmComposer module, particularly for managing function calls and handling language model responses.
Helper module for setting up the Tesla HTTP client and its options.
Module for parsing and conveniently handling LLM responses.
Module that represents an arbitrary message for any LLM.
Behaviour definition for LLM models.
Model implementation for Amazon Bedrock.
Model implementation for Ollama.
Model implementation for OpenAI.
Model implementation for OpenRouter.
Defines the settings for configuring chat interactions with a language model.
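A sketch of what configuring such settings could look like, here targeting the Ollama model implementation listed above. The field names `auto_exec_functions` and `functions` (and the exact `model_opts` keys) are assumptions about this struct; treat this as illustrative, not authoritative.

```elixir
# Hypothetical configuration; field names beyond :model/:system_prompt are
# assumptions about the Settings struct.
%LlmComposer.Settings{
  model: LlmComposer.Models.Ollama,
  model_opts: [model: "llama3.1"],
  system_prompt: "You are a concise assistant.",
  auto_exec_functions: true,
  # a list of %LlmComposer.Function{} structs defined elsewhere
  functions: [my_function]
}
```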