# pg_dump

> [!caution]
> TODO

Command-line tool for creating backups of a **single** PostgreSQL database.
Consider using [`pg_dumpall`][pg_dumpall] to create backups of entire clusters, or of global objects like roles and tablespaces.

1. [TL;DR](#tldr)
1. [Further readings](#further-readings)
1. [Sources](#sources)

## TL;DR

It can dump a database in its entirety, or just specific parts of it such as individual tables or schemas.
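A minimal sketch of the common invocations; the host, user, database, table, and schema names are placeholders:

```sh
# Dump the whole 'mydb' database as plain SQL.
pg_dump --host 'localhost' --username 'postgres' --dbname 'mydb' --file 'mydb.sql'

# Dump only specific tables…
pg_dump --dbname 'mydb' --table 'orders' --table 'customers' --file 'orders_customers.sql'

# … or a single schema.
pg_dump --dbname 'mydb' --schema 'reporting' --file 'reporting.sql'

# Dump only object definitions (no table data), or only the data.
pg_dump --dbname 'mydb' --schema-only --file 'mydb.ddl.sql'
pg_dump --dbname 'mydb' --data-only --file 'mydb.data.sql'
```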
It does **not** dump objects like roles, groups, tablespaces and others. Consider using [`pg_dumpall`][pg_dumpall] for those.

It produces sets of SQL statements that can be executed to reproduce the original database's object definitions and table data.
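For example, a plain-format dump can be replayed with `psql`, while the custom format goes through [`pg_restore`][pg_restore]; the database names below are placeholders, and the target database is assumed to already exist:

```sh
# Plain SQL dump, restored by replaying the statements with 'psql'.
pg_dump --dbname 'mydb' --file 'mydb.sql'
psql --dbname 'mydb_copy' --file 'mydb.sql'

# Custom-format dump, restored with 'pg_restore'.
pg_dump --format 'custom' --dbname 'mydb' --file 'mydb.dump'
pg_restore --dbname 'mydb_copy' 'mydb.dump'
```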
Suitable when:

- The database's size is **less** than 100 GB.
  It tends to start giving issues for bigger databases; see the parallel dump sketch after this list.
- One plans to migrate the database's metadata as well as the table data.
- There is a relatively large number of tables to migrate.
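For bigger databases, a directory-format dump taken with parallel jobs usually helps; the job count and names here are just an example:

```sh
# Directory-format dump using 8 parallel worker jobs.
# Parallel dumps require the 'directory' format.
pg_dump --format 'directory' --jobs '8' --dbname 'mydb' --file 'mydb.dump.d'

# Restore it in parallel too.
pg_restore --jobs '8' --dbname 'mydb_copy' 'mydb.dump.d'
```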
> [!important]
> `pg_dump` works better when the database is taken offline, but it **does keep the database available** and will
> **not** prevent users from accessing it.
> Even with other users accessing the database during the backup process, `pg_dump` will **always** produce consistent
> results, since it reads from a single consistent snapshot taken when the dump starts.

## Further readings

- [PostgreSQL]
- [pg_dumpall]
- [pg_restore]

### Sources

- [Documentation]
- [A Complete Guide to pg_dump With Examples, Tips, and Tricks]
- [How to speed up pg_dump when dumping large databases]

[pg_dumpall]: pg_dumpall.md
[pg_restore]: pg_restore.md
[PostgreSQL]: README.md

[Documentation]: https://www.postgresql.org/docs/current/app-pgdump.html
[A Complete Guide to pg_dump With Examples, Tips, and Tricks]: https://www.dbvis.com/thetable/a-complete-guide-to-pg-dump-with-examples-tips-and-tricks/
[How to speed up pg_dump when dumping large databases]: https://postgres.ai/docs/postgres-howtos/database-administration/backup-recovery/how-to-speed-up-pg-dump