How to back up a managed Postgres database with 10,000+ tables


We have a DigitalOcean managed Postgres database with a large number of tables. Running a normal pg_dump fails because it needs a lock per table, which exceeds whatever max_locks_per_transaction is configured on the managed cluster, and I don't believe we can change that setting ourselves.

pg_dump: WARNING:  out of shared memory
pg_dump: SQL command failed
pg_dump: Error message from server: ERROR:  out of shared memory
HINT:  You might need to increase max_locks_per_transaction.

We also can't use pg_basebackup for backups, since the configured database user/host doesn't have permission to open a replication connection.

So is it literally impossible to back up an entire DigitalOcean managed database with the normal Postgres tools once you go over a certain number of tables?
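
The only workaround I can think of is splitting the dump into batches of tables so each pg_dump run only needs a fraction of the locks. Something like the sketch below (untested; the connection string and batch size are placeholders, and separate runs obviously won't give one consistent snapshot of the whole database):

```bash
#!/usr/bin/env bash
# Rough sketch: dump tables in batches so each pg_dump run only takes
# locks on a few hundred tables at a time, instead of all of them.
# NOTE: separate pg_dump runs are NOT a single consistent snapshot.
set -euo pipefail

DB_URI="postgres://user:pass@host:25060/defaultdb?sslmode=require"  # placeholder
BATCH_SIZE=500

# List all user tables as schema.table, one per line
psql "$DB_URI" -At -c \
  "SELECT quote_ident(schemaname) || '.' || quote_ident(tablename)
     FROM pg_tables
    WHERE schemaname NOT IN ('pg_catalog', 'information_schema');" > tables.txt

# Split the list into batches and dump each batch to its own archive
split -l "$BATCH_SIZE" tables.txt batch_
i=0
for f in batch_*; do
  args=()
  while read -r t; do args+=(-t "$t"); done < "$f"
  pg_dump "$DB_URI" -Fc "${args[@]}" -f "dump_part_$i.dump"
  i=$((i + 1))
done
```

But that feels like a hack for what should be a normal backup.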



Rajkishore J
DigitalOcean Employee
July 21, 2021
Accepted Answer

Hi @nigelgott,

You can open a support ticket with us to increase max_locks_per_transaction; we will be able to help you with this.

Regards, Rajkishore
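
Once the change has been applied, you should be able to confirm the effective value from any session with your normal database user (read-only check; $DB_URI below is a placeholder for your connection string):

```bash
psql "$DB_URI" -c "SHOW max_locks_per_transaction;"
```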