How to back up a managed Postgres database with 10,000+ tables
We have a managed Postgres database on DigitalOcean with a large number of tables. Running a normal
pg_dump fails because it needs a lock per table, which exceeds whatever max_locks_per_transaction
the managed cluster is configured with, and I do not believe we can change that setting ourselves:
```
pg_dump: WARNING: out of shared memory
pg_dump: SQL command failed
pg_dump: Error message from server: ERROR: out of shared memory
HINT: You might need to increase max_locks_per_transaction.
```
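The only workaround I can think of is splitting the dump into batches of tables so that no single pg_dump run needs more locks than the cluster allows. A rough sketch of what I mean (the connection string, batch size, and file names are placeholders, not anything DigitalOcean provides):

```bash
# Sketch: dump tables in batches so each pg_dump run takes far fewer
# locks than max_locks_per_transaction. DB_URI is a placeholder.
DB_URI="postgres://doadmin:password@host:25060/defaultdb?sslmode=require"
BATCH_SIZE=500

# List all user tables, schema-qualified (format %I quotes names that need it),
# and split the list into files of BATCH_SIZE tables each.
psql "$DB_URI" -Atc \
  "SELECT format('%I.%I', schemaname, tablename) FROM pg_tables
   WHERE schemaname NOT IN ('pg_catalog', 'information_schema');" |
split -l "$BATCH_SIZE" - table_batch_

for batch in table_batch_*; do
  # Build one --table argument per table in this batch.
  args=()
  while read -r tbl; do
    args+=(--table="$tbl")
  done < "$batch"
  pg_dump "$DB_URI" "${args[@]}" -Fc -f "backup_${batch}.dump"
done
```

But each pg_dump run is its own snapshot, so the batches are not mutually consistent, and per-table dumps skip cluster-wide objects, which is why I would rather find a proper way to do a single full backup.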
We also cannot use pg_basebackup for backups instead, because the configured database user/host does not have permission to make a replication connection.
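For reference, this is how I confirmed the role lacks the REPLICATION attribute that pg_basebackup needs (connection string is a placeholder as above):

```bash
# Check whether the current role has the REPLICATION attribute.
psql "$DB_URI" -c \
  "SELECT rolname, rolreplication FROM pg_roles WHERE rolname = current_user;"
```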
So is it literally impossible, using normal Postgres commands, to back up an entire DigitalOcean managed database once it exceeds a certain number of tables?