
Add a consistent pool manager #1488

Closed
Sapessii opened this issue Jun 30, 2024 · 1 comment · Fixed by #1758

Comments

@Sapessii

Describe the bug
Fix the lack of a consistent pool manager.
When running MemGPT against Supabase with many users, I hit Supabase's maximum number of client connections.
For reference, a message on Discord: https://discord.com/channels/1161736243340640419/1162177332350558339/1257022675021463654

If there is a better database alternative to use, please let me know!
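
Since the error is the pooler running out of client connections, the usual fix is to route every query through one bounded, process-wide pool instead of opening ad-hoc connections per user or request. A minimal sketch, assuming the server talks to Postgres via SQLAlchemy (the URL below is an SQLAlchemy URL); the credentials, function name, and pool sizes are illustrative:

```python
from sqlalchemy import create_engine, text

# Illustrative placeholder; use the real pooler URL.
DB_URI = "postgresql+pg8000://user:password@aws-0-us-west-1.pooler.supabase.com:6543/postgres"

# One engine per process. Its pool caps total connections at
# pool_size + max_overflow, no matter how many users are active.
engine = create_engine(
    DB_URI,
    pool_size=5,         # steady-state connections kept open
    max_overflow=5,      # temporary burst headroom
    pool_timeout=30,     # wait for a free connection instead of opening more
    pool_pre_ping=True,  # test connections on checkout; replace stale ones
)

def run_query(sql: str):
    # Connections are checked out of the shared pool and returned on exit,
    # so the total stays bounded regardless of user count.
    with engine.connect() as conn:
        return conn.execute(text(sql)).fetchall()
```

Creating a new engine (or raw connection) per user or per request is what makes the connection count scale with the number of users.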

Please describe your setup

  • Deployed with Docker on render.com

MemGPT Config
Just using the standard configuration with this DB URL: postgresql+pg8000://postgres.xxxxx:xxxxxx@aws-0-us-west-1.pooler.supabase.com:6543/postgres
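
Side note on that URL: port 6543 is Supabase's transaction-mode pooler, and "max client connections" is its client-side limit, which is small on the lower tiers. A hedged back-of-the-envelope check, with illustrative numbers, for keeping client-side pools under that limit:

```python
# Illustrative arithmetic, not MemGPT configuration: the worst case is
# every worker filling its pool at once, so keep
# workers * (pool_size + max_overflow) under the pooler's client limit.
max_client_connections = 200  # example; check your Supabase plan's limit
workers = 4                   # e.g. gunicorn/uvicorn worker processes
pool_size, max_overflow = 5, 5

worst_case = workers * (pool_size + max_overflow)
assert worst_case <= max_client_connections, "pool settings exceed pooler limit"
```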

Thank you :)

@zboyles

zboyles commented Aug 5, 2024

This also impacts deployments with only a couple of users: 2 plus the default user account.

I set up a new deployment on Google Cloud Run + Supabase (Nano) and followed the steps below.

  1. Python client:
  • Created 2 users
  2. Browser:
  • Opened 1 browser for each of the 2 user accounts
  3. Python client:
  • Created 2 humans
  • Created 2 personas
  • Created 2 agents
  4. Browser:
  • Sent a message, then saw the "Max client connections reached" error appear in the logs before receiving a response.

Note: When I built the image and ran the container locally, I executed a couple of 'list' commands with the Python client.
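
For anyone reproducing this, a quick way to watch the connections pile up is to count server-side sessions while clicking through the UI. A small sketch (same pooler URL as in the issue, placeholder credentials):

```python
from sqlalchemy import create_engine, text

DB_URI = "postgresql+pg8000://user:password@aws-0-us-west-1.pooler.supabase.com:6543/postgres"
engine = create_engine(DB_URI)

with engine.connect() as conn:
    # pg_stat_activity has one row per session; compare the totals
    # against the plan's max client connections.
    for state, n in conn.execute(
        text("SELECT state, count(*) FROM pg_stat_activity GROUP BY state")
    ):
        print(state, n)
```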
