AI.Splitter (fnord v0.5.8)

This module splits a string into chunks by token count while accounting for other data that must accompany each chunk to an API endpoint with a limited token budget.

For example, the search entry agent may process a file so large that it must be split into three slices just to fit into the payload of an API call. To retain context between chunks, the agent essentially reduces over the file, carrying forward information from previous chunks to generate a final summary. That means each slice must be sized not only by its own token count, but must also leave room for the bespoke data added to the payload as the agent's "accumulator". A sketch of this flow appears below.
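The following is a minimal sketch of how that reduce-over-the-file loop might be driven. The return shapes of `new/3` and `next_chunk/2` (a `{chunk, splitter}` tuple with a `:done` marker on exhaustion), the 8,000-token limit, and the `summarize_chunk/2` helper are assumptions made for illustration, not the documented API.

```elixir
defmodule SummarizeExample do
  # Summarize a large file by feeding it to the model one slice at a time,
  # threading the running summary through as the splitter's "bespoke" input.
  def summarize(file_contents, model) do
    splitter = AI.Splitter.new(file_contents, 8_000, model)
    reduce_chunks(splitter, "")
  end

  defp reduce_chunks(splitter, summary_so_far) do
    # The accumulated summary is passed as the bespoke input so the splitter
    # can subtract its token count from the space available for the next chunk.
    case AI.Splitter.next_chunk(splitter, summary_so_far) do
      {:done, _splitter} ->
        summary_so_far

      {chunk, splitter} ->
        # Hypothetical agent call that merges the previous summary with the new chunk.
        summary = summarize_chunk(chunk, summary_so_far)
        reduce_chunks(splitter, summary)
    end
  end

  defp summarize_chunk(chunk, acc) do
    # Placeholder for the API request; here we just append a snippet of the chunk.
    acc <> "\n" <> String.slice(chunk, 0, 100)
  end
end
```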

Summary

Functions

new(input, max_tokens, model)

next_chunk(tok, bespoke_input)