
Implement memory for chat and process apps #444

Open · anada10 opened this issue Aug 13, 2024 · 0 comments
Labels: enhancement (New feature or request)

anada10 (Contributor) commented Aug 13, 2024

Problem
When models hit their context limit, apps stop working.
We need a way for a chat or process app to keep going without losing the context of the earlier conversation.

Solution
Long-term memory with agents.

We need two implementations (a rough sketch of the memory idea follows the list below):

  1. For chat apps we can use
  2. For process apps we can use
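
This isn't a decision on any particular framework; it's just a minimal sketch of what summarization-based long-term memory could look like for the chat side. The `summarize` callable, the token budget, and the size of the recent-message window are all placeholders, not proposals.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Message:
    role: str      # "user" or "assistant"
    content: str


@dataclass
class LongTermMemory:
    """Keeps the prompt under a token budget by folding old turns into a running summary.

    `summarize` stands in for whatever LLM call we end up using; it takes the existing
    summary plus the messages being evicted and returns an updated summary.
    """
    summarize: Callable[[str, List[Message]], str]
    max_tokens: int = 3000
    keep_recent: int = 6                 # always keep the last N raw messages
    summary: str = ""
    messages: List[Message] = field(default_factory=list)

    def _estimate_tokens(self, text: str) -> int:
        # Rough heuristic (~4 chars per token); a real tokenizer would replace this.
        return len(text) // 4

    def add(self, role: str, content: str) -> None:
        self.messages.append(Message(role, content))
        self._compact()

    def _compact(self) -> None:
        total = self._estimate_tokens(self.summary) + sum(
            self._estimate_tokens(m.content) for m in self.messages
        )
        if total <= self.max_tokens or len(self.messages) <= self.keep_recent:
            return
        # Evict everything except the most recent turns and fold it into the summary.
        evicted = self.messages[: -self.keep_recent]
        self.messages = self.messages[-self.keep_recent:]
        self.summary = self.summarize(self.summary, evicted)

    def build_prompt(self) -> List[dict]:
        """Messages to send to the model: running summary first, then the recent turns."""
        prompt: List[dict] = []
        if self.summary:
            prompt.append({
                "role": "system",
                "content": f"Summary of earlier conversation: {self.summary}",
            })
        prompt.extend({"role": m.role, "content": m.content} for m in self.messages)
        return prompt
```

For process apps the same idea would likely need persistent storage so the summary and recent state survive across runs, but that depends on which framework we pick here.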
anada10 added the enhancement label on Aug 13, 2024