AI.Accumulator (fnord v0.5.5)
When a file or other input is too large for the model's context window, this
module can be used to process the input in chunks. It automatically modifies
the supplied agent prompt to include instructions for accumulating a response
across multiple chunks, based on the max_tokens parameter supplied to the
get_response function.
Note that while this module makes use of AI.Completion, it does NOT share the
same interface and cannot be used for long-running conversations, since it
does not accept a list of messages as its input.
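A minimal usage sketch of the chunked flow described above. Only get_response and max_tokens are confirmed by these docs; the remaining option names (model, prompt, input) and the return shape are hypothetical placeholders for illustration:

```elixir
# Hypothetical sketch: read a file too large for one context window and
# let the accumulator process it chunk by chunk. Option names other than
# :max_tokens are assumptions, not the documented API.
{:ok, response} =
  AI.Accumulator.get_response(
    model: "gpt-4o",
    prompt: "Summarize the contents of this file.",
    input: File.read!("large_file.txt"),
    # Controls how the input is split and how the accumulation
    # instructions are added to the agent prompt.
    max_tokens: 4096
  )
```

Because the module accepts a single input rather than a message list, each call is a one-shot accumulation pass, not a turn in a conversation.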