MySQL Back-up: Take a mysqldump with each database in its own SQL File

Mattias Geniar, August 24, 2015

Follow me on Twitter as @mattiasgeniar

It’s often very useful to have a couple of MySQL one-liners nearby. This guide will show you how to take a mysqldump of all databases on your server, and write each database to its own SQL file. As a bonus, I’ll show you how to compress the data and import it again, if you ever need to restore from those files.

Take a mysqldump back-up to separate files

To take a back-up, run the mysqldump tool on each available database.

$ mysql -N -e 'show databases' | while read dbname; do mysqldump --complete-insert --routines --triggers --single-transaction "$dbname" > "$dbname".sql; done

The result is a list of all your database files in your current working directory, suffixed with the .sql file extension.

$ ls -alh *.sql

-rw-r--r-- 1 root root  44M Aug 24 22:39 db1.sql
-rw-r--r-- 1 root root  44M Aug 24 22:39 db2.sql
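
Note that 'show databases' also lists MySQL’s own system schemas, which mysqldump may refuse to export. If you’d rather skip those, a variation like the one below filters them out first. The list of schemas to exclude (information_schema, performance_schema, mysql and sys) is an assumption on my part; adjust it to whatever your MySQL version ships with.

$ mysql -N -e 'show databases' | grep -Ev '^(information_schema|performance_schema|mysql|sys)$' | while read dbname; do mysqldump --complete-insert --routines --triggers --single-transaction "$dbname" > "$dbname".sql; done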

If you want to write to a particular directory, like /var/dump/databases/, you can change the output of the command like this.

$ mysql -N -e 'show databases' | while read dbname; do mysqldump --complete-insert --routines --triggers --single-transaction "$dbname" > /var/dump/databases/"$dbname".sql; done
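
If you want to keep more than one back-up around, you can create a dated sub-directory first and dump into that. A small sketch, where the /var/dump/databases path and the date format are just examples:

$ backupdir=/var/dump/databases/$(date +%F); mkdir -p "$backupdir"; mysql -N -e 'show databases' | while read dbname; do mysqldump --complete-insert --routines --triggers --single-transaction "$dbname" > "$backupdir/$dbname".sql; done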

Mysqldump each database and compress the SQL file

If you want to compress the files as you take them, you can run either gzip or bzip2 on the resulting SQL file.

$ mysql -N -e 'show databases' | while read dbname; do mysqldump --complete-insert --routines --triggers --single-transaction "$dbname" > "$dbname".sql; [[ $? -eq 0 ]] && gzip "$dbname".sql; done

The result is again a list of all your databases, but gzip'd to save disk space.

$ ls -alh *.gz

-rw-r--r--  1 root root  30K Aug 24 22:42 db1.sql.gz
-rw-r--r--  1 root root 1.6K Aug 24 22:42 db2.sql.gz

This can save you a significant amount of disk space, at the cost of additional CPU cycles while taking the back-up.
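
If you’d rather not write the uncompressed .sql file to disk at all, you can also pipe mysqldump straight into gzip. A sketch of that variation; keep in mind that, unlike the version above, a failed dump can still leave a truncated .sql.gz behind since there’s no exit-code check on mysqldump:

$ mysql -N -e 'show databases' | while read dbname; do mysqldump --complete-insert --routines --triggers --single-transaction "$dbname" | gzip > "$dbname".sql.gz; done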

Import each .sql file back into MySQL

Now that you have a directory full of database dumps, with the database name in the filename, how can you import them all again?

The following for-loop reads all files, strips the “.sql” part from the filename and imports each file into the database of the same name.

Warning: this overwrites your databases, without prompting for confirmation. Use with caution!

$ for sql in *.sql; do dbname=${sql%.sql}; echo -n "Now importing $dbname ... "; mysql "$dbname" < "$sql"; echo " done."; done

The output will tell you which databases have already been imported.

$ for sql in *.sql; do dbname=${sql%.sql}; echo -n "Now importing $dbname ... "; mysql "$dbname" < "$sql"; echo " done."; done

Now importing db1 ...  done.
Now importing db2 ...  done.
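
If you took the compressed back-ups from earlier, a similar loop can feed them through zcat instead. A sketch, assuming the files still end in .sql.gz and the target databases already exist:

$ for sql in *.sql.gz; do dbname=${sql%.sql.gz}; echo -n "Now importing $dbname ... "; zcat "$sql" | mysql "$dbname"; echo " done."; done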

These are very simple one-liners that come in handy when you’re migrating from server to server.


