API Reference fnord v0.7.19


Modules

When a file or other input is too large for the model's context window, this module can be used to process the input in chunks. It automatically modifies the supplied agent prompt to include instructions for accumulating a response across multiple chunks, based on the context size (maximum context window tokens) associated with the supplied model parameter.
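The accumulation loop described above can be sketched as follows. This is an illustrative reduction only: chunking by character count stands in for real token-aware splitting, and the `summarize` function stands in for the actual model call; none of these names are the module's real API.

```elixir
defmodule ChunkedProcessor do
  # Sketch: split the input into chunks, then fold a running response
  # through each chunk, the way the modified agent prompt instructs the
  # model to accumulate its answer across calls.
  def process(input, chunk_size, summarize) do
    input
    |> chunks(chunk_size)
    |> Enum.reduce("", fn chunk, acc -> summarize.(acc, chunk) end)
  end

  # Naive character-based chunking; the real module works in tokens.
  defp chunks(string, size) do
    string
    |> String.graphemes()
    |> Enum.chunk_every(size)
    |> Enum.map(&Enum.join/1)
  end
end
```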

This agent uses a combination of the reasoning features of the OpenAI o3-mini model as well as its own reasoning process to research and answer the input question.

This module provides an agent that summarizes files' contents in order to generate embeddings for the database and summaries for the user.

This module sends a request to the model and handles the response. It is able to handle tool calls and responses.

OpenAI's tokenizer uses regexes that are not compatible with Erlang's regex engine. There are a couple of modules available on Hex, but all of them require a working Python installation, access to rustc, a number of external dependencies, and some env flags set to allow them to compile.

This module splits a string into chunks by token count, while accounting for other data that may accompany the chunk to an API endpoint with a limited token budget.
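A minimal sketch of budget-aware splitting. The module and function names are hypothetical, and the ~4-characters-per-token heuristic is an assumption standing in for the ported tokenizer; the point is only that tokens reserved for the rest of the payload shrink each chunk's budget.

```elixir
defmodule TokenSplitter do
  # Rough heuristic: ~4 characters per token (assumption, not the real
  # tokenizer, which must be ported from OpenAI's regex-based one).
  @chars_per_token 4

  # Splits `text` so each chunk fits in `max_tokens` minus the tokens
  # already reserved for other data in the API request.
  def split(text, max_tokens, reserved_tokens \\ 0) do
    budget_chars = max(max_tokens - reserved_tokens, 1) * @chars_per_token

    text
    |> String.graphemes()
    |> Enum.chunk_every(budget_chars)
    |> Enum.map(&Enum.join/1)
  end
end
```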

This module defines the behaviour for tool calls. Defining a new tool requires implementing the spec/0 and call/2 functions.
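A sketch of what implementing the behaviour looks like. The behaviour module name `Tool`, the callback typespecs, and the `EchoTool` example are all assumptions for illustration; only the `spec/0` and `call/2` callback names come from the text above.

```elixir
defmodule Tool do
  # Hypothetical stand-in for the real behaviour module.
  @callback spec() :: map()
  @callback call(args :: map(), opts :: keyword()) ::
              {:ok, String.t()} | {:error, term()}
end

defmodule EchoTool do
  @behaviour Tool

  # Describes the tool to the LLM so it knows when and how to call it.
  @impl true
  def spec do
    %{name: "echo", description: "Returns its input text unchanged."}
  end

  # Executes the tool call with the arguments supplied by the LLM.
  @impl true
  def call(args, _opts) do
    {:ok, Map.get(args, "text", "")}
  end
end
```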

Cmd

Converses with the default AI agent.

Fnord is a code search tool that uses OpenAI's embeddings API to index and search code files.

Frobs are external tool call integrations. They allow users to define external actions that can be executed by the LLM while researching the user's query.

Git

Module for interacting with git.

This behaviour wraps the AI-powered operations used by Cmd.Index to allow overrides for testing. See impl/0.

Conversations are stored per project in the project's store dir, under conversations/. Each conversation is given a UUID identifier and stored as a JSON file with the keys
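The layout described above can be sketched as a path helper. This is an assumption-laden illustration: the function name and the exact shape of the store dir are hypothetical; only the `conversations/` subdirectory and the `<uuid>.json` naming come from the description.

```elixir
defmodule ConversationStore do
  # Builds the on-disk path for a conversation: each one lives in the
  # project's store dir under conversations/, named by its UUID.
  def conversation_path(store_dir, uuid) do
    Path.join([store_dir, "conversations", uuid <> ".json"])
  end
end
```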

UI