Never exceed the maximum token count of OpenAI's chat models when using the
async_openai Rust crate.
chat-splitter
splits chats into 'outdated' and 'recent' messages.
You can split by
both
maximum message count and
maximum chat completion token count.
We use tiktoken_rs
for counting tokens.
Here's a basic example:
```rust
// Get all your previously stored chat messages...
let mut stored_messages = /* get_stored_messages()? */;

// ...and split into 'outdated' and 'recent',
// where 'recent' always fits the context size.
let (outdated_messages, recent_messages) =
    ChatSplitter::default().split(&stored_messages);
```
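Conceptually, the split keeps the most recent messages whose combined token
count still fits a budget, and everything older becomes 'outdated'. Here's a
minimal, self-contained sketch of that idea in plain Rust; the
`split_at_budget` function, and the per-message token counts, are hypothetical
stand-ins, not part of the chat-splitter API:

```rust
/// Split messages into (outdated, recent), where 'recent' is the longest
/// suffix whose token counts sum to at most `budget`.
/// Each message is paired with a hypothetical, precomputed token count.
fn split_at_budget<'a>(
    messages: &[(&'a str, usize)],
    budget: usize,
) -> (Vec<&'a str>, Vec<&'a str>) {
    let mut total = 0;
    let mut split = messages.len();
    // Walk backwards from the newest message, accumulating tokens
    // until the budget would be exceeded.
    for (i, &(_, tokens)) in messages.iter().enumerate().rev() {
        if total + tokens > budget {
            break;
        }
        total += tokens;
        split = i;
    }
    let (outdated, recent) = messages.split_at(split);
    (
        outdated.iter().map(|&(m, _)| m).collect(),
        recent.iter().map(|&(m, _)| m).collect(),
    )
}

fn main() {
    let messages = [("hello", 5), ("how are you", 8), ("fine thanks", 7)];
    // With a budget of 16 tokens, only the two newest messages fit.
    let (outdated, recent) = split_at_budget(&messages, 16);
    println!("outdated: {outdated:?}, recent: {recent:?}");
}
```

In the real crate, token counting is handled for you via tiktoken_rs, so you
never compute these counts by hand.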
For a more detailed example,
see `examples/chat.rs`.
Contributions to chat-splitter
are welcome!
If you find a bug or have a feature request,
please submit an issue.
If you'd like to contribute code,
please feel free to submit a pull request.