If you have a PostgreSQL database that you want to import into DigitalOcean Managed Databases, you need the following:
An export of the existing database, which you can get using pg_dump or other utilities.
A PostgreSQL database cluster created in your DigitalOcean account.
A database in the database cluster to import your data into. You can use the default database or create a new database.
To export an existing database with pg_dump, you need to specify the connection details (like the admin username and the database name) and redirect the output of the command to save the database dump. The command looks like this:
pg_dump -h use_your_host -U use_your_username -Fc use_your_database > your_dump_file.pgsql
The components of the command are:
The -h flag to specify the IP address or hostname, if using a remote database.
The -U flag to specify the admin user on the existing database.
The -W flag to prompt for the user’s password.
The -Fc flag to specify the custom format for the output file.
The name of the database to dump.
The redirection to save the database dump to a file called your_dump_file.pgsql.
Learn more in PostgreSQL’s SQL Dump documentation.
Export time increases with the size of the database, so a large database will take a while. When the export is complete, you’ll be returned to the command prompt or notified by the client you used.
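Before importing, you can optionally sanity-check the dump. As a sketch (using the dump filename from the export command above), pg_restore's --list mode prints the archive's table of contents without restoring anything:

```shell
# List the contents of the custom-format dump to confirm it is
# readable and contains the expected tables and data
pg_restore --list your_dump_file.pgsql
```

If the command errors out or the listing is empty, re-run the export before continuing.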
To import the database, you need to connect with psql and use the connection URI for the database you want to add the existing data to.
If you want to use the default database and default user, you can use the public network connection string from the cluster’s Overview page. If you want to import with a different user or to a different database, you need to edit the connection URI.
The connection URI is in the following format. In this example, the username, password, and database name are in ALL CAPS. These are the sections you need to substitute in the connection URI for your cluster if you want to use a non-default database or user:
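As an illustration, the URI follows the standard PostgreSQL form; the host and port below are placeholders, so substitute the values shown for your own cluster:

```
postgresql://USERNAME:PASSWORD@your-cluster-host:your-port/DATABASE?sslmode=require
```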
Once you have the connection URI for the database and user you want to use, the command to import the dump file looks like this:
pg_restore -d 'use_your_connection_URI' --jobs 4 use_your_dump_file.pgsql
The --jobs 4 flag tells pg_restore to run up to four concurrent jobs, which shortens the import on multi-core machines. If you have a different number of available CPU cores, you can adjust that number.
If the database you’re importing used multiple users, you can add the --no-owner flag to avoid permissions errors. Even without this flag, the import will complete, but you may see a number of error messages.
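For example, reusing the placeholder connection URI and dump file names from above, the import command with ownership skipped looks like this:

```shell
# Restore without attempting to set object ownership, which avoids
# permissions errors when the source database had multiple users
pg_restore -d 'use_your_connection_URI' --jobs 4 --no-owner use_your_dump_file.pgsql
```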
Learn more in PostgreSQL’s documentation, Restoring the Dump.
Once the import is complete, you can update the connection information in any applications using the database to use the new database cluster.
We also recommend running the PostgreSQL-specific ANALYZE command to generate statistical database information. This helps the query planner optimize the execution plan, which increases the speed at which the database executes SQL queries. Learn more in the PostgreSQL wiki introduction to VACUUM, ANALYZE, EXPLAIN, and COUNT.
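One way to run it, sketched here with the same placeholder connection URI used for the import, is non-interactively through psql:

```shell
# Refresh planner statistics for every table in the database,
# printing progress per table as it runs
psql 'use_your_connection_URI' -c 'ANALYZE VERBOSE;'
```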