AI.Accumulator (fnord v0.8.0)
When a file or other input is too large for the model's context window, this
module may be used to process the input in chunks. It automatically modifies
the supplied agent prompt to include instructions for accumulating a response
across multiple chunks, based on the `context` (max context window tokens)
parameter of the supplied model.
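
Conceptually, the flow looks like the sketch below. This is an illustrative
outline of the chunk-and-accumulate pattern, not the module's actual
implementation: `complete_fn` stands in for a call to the model (which the
real module makes via AI.Completion), and chunking by line count is an
assumption for brevity, whereas the real module sizes chunks by the model's
token limit.

```elixir
defmodule AccumulatorSketch do
  # Split the input, then feed each chunk to the model along with the
  # response accumulated so far, so a single answer is refined across
  # the whole input.
  def accumulate(input, prompt, complete_fn, lines_per_chunk \\ 200) do
    input
    |> String.split("\n")
    |> Enum.chunk_every(lines_per_chunk)
    |> Enum.map(&Enum.join(&1, "\n"))
    |> Enum.reduce("", fn chunk, acc ->
      complete_fn.("""
      #{prompt}

      Accumulated response so far:
      #{acc}

      Next chunk of input:
      #{chunk}
      """)
    end)
  end
end
```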
Note that while this makes use of the AI.Completion
module, it does NOT
have the same interface and cannot be used for long-running conversations, as
it does not accept a list of messages as its input.