How to Backup and Import a Very Large WordPress MySQL Database

I recently had the experience of working with a very large (over 40MB) database for a WPMU + BuddyPress + bbPress site that was just plain too big to export and import with phpMyAdmin.

All I wanted to do was create a simple backup and then a test environment on another domain. You may have experienced this before – all the normal tools stall out, making the usual import/export process a frustrating experience. This is especially common with large WPMU sites that have been around for a little while.

You’ll want to bookmark this reference if you anticipate having to create and import backups of large WordPress Multisite and/or BuddyPress sites.

The solution that works for you will depend entirely on how your server is set up, the size of your database, and what tools you’re comfortable using. Here are a few tricks that you can try to get your backups and imports working. My instructions are based on using PuTTY for Windows, but the process will be similar for other operating systems.

If you have root access to your server, create a backup using SSH

cd to the directory where you want to store your dump file:

cd /var/www/vhosts/yourdomain.com/httpdocs/sqldump

Then tell mysql to dump it out:

mysqldump --add-drop-table -u dbuser -p your_db > mybackup.sql

It will prompt you for your password. Everything should dump out without a problem.
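If disk space or transfer time is a concern, you can also compress the dump as it is created. This is just a sketch of the same command piped through gzip; the filenames are examples:

mysqldump --add-drop-table -u dbuser -p your_db | gzip > mybackup.sql.gz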

*Note: The directory where you’re sending the dump file needs to be writeable by the server. Also, you don’t really want to leave it in that folder permanently. Just drop it there and pull it off your server so that you can manipulate it if you have to.
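One way to pull it down is scp. This is only a sketch, assuming you gzipped the dump as above; the user and paths are examples, and on Windows, PuTTY’s pscp.exe works the same way:

scp user@yourdomain.com:/var/www/vhosts/yourdomain.com/httpdocs/sqldump/mybackup.sql.gz .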

How to Import a Large SQL File

Importing is usually a bit messier. If you’re duplicating this database for testing purposes or moving your site to a new server, a large dump file, even when gzipped, can pose some challenges in the import process. This is the part where things can really stall out, and you need a few tricks up your sleeve. Here are three methods that are recommended for large SQL files.

Method 1: First try SSH

Ordinarily, I would just use phpMyAdmin to import the file. Because it’s so large, the first thing you should do is try importing via SSH:

mysql -u db_user -p your_database < mybackup.sql
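If you kept the dump gzipped from the backup step, you can stream it straight into MySQL rather than unpacking it first. A sketch along the same lines:

zcat mybackup.sql.gz | mysql -u db_user -p your_database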

Common Error when using SSH:

“got a packet bigger than ‘max_allowed_packet’ bytes”

Now what? Depending on the size of your file, running these commands at the MySQL prompt may fix it for you:

mysql> set global max_allowed_packet=1000000000;
mysql> set global net_buffer_length=1000000;
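If you would rather not open an interactive MySQL session, the same settings can be applied in a single line. This is just a sketch, and it assumes your MySQL user has the privileges required for SET GLOBAL (the values are examples, not recommendations):

mysql -u db_user -p -e "SET GLOBAL max_allowed_packet=1000000000; SET GLOBAL net_buffer_length=1000000;"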

Then try importing your database after this. If it still doesn’t work, you might try splitting it up.

Method 2: If SSH fails on import, try splitting up your SQL file

Download a program that will split up your SQL file, such as SQLDumpSplitter3. This is a free SQL dump file splitter that lets you choose how big each chunk will be and automatically cuts and saves the split SQL files. Yes, this is a pain, but it is sometimes the only way to get this accomplished.

SQLDumpSplitter3

Create the splits and then upload them to a directory on your server.

If you want to restore the dump, you have to run yourbackup_DataStructure.sql first because it contains the table structure. After that, you can run the other .sql files, as they contain the data for the now-existing tables. Using SSH, cd to your directory again and import the structure file first:

mysql -u db_user -p db_name < yourbackup_DataStructure.sql

Then your splits:

mysql -u db_user -p db_name < yourbackup_1.sql
mysql -u db_user -p db_name < yourbackup_2.sql
etc…
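If the splitter leaves you with a long list of numbered files, a small shell loop saves some typing. This is only a sketch, assuming the file naming shown above; sort -V keeps yourbackup_10.sql after yourbackup_2.sql on most Linux servers, and you will be prompted for the password on each file unless your credentials are stored in ~/.my.cnf:

for part in $(ls yourbackup_[0-9]*.sql | sort -V); do
  echo "Importing $part"
  mysql -u db_user -p db_name < "$part"
done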

I would only recommend this if your SQL file is easily split into just a few sections. Obviously, if it’s going to take more than a handful, this technique can be time-consuming. I know, because I tried it before I discovered one other method.

Method 3: Use a Script that will stagger the import process

This is my favorite solution, as it was the only one that worked smoothly. I used a script called BigDump.php, a staggered SQL dump importer. It executes only a small part of the huge dump and then restarts itself, with each session picking up where the last one stopped, which keeps you from running into your server’s limits. Instructions for using it are on the BigDump website. Basically, you place your SQL file in a folder on your server along with the bigdump.php file, edit that file with your database information, then visit the page in your browser and start the import. This is a fairly quick process and will save you a lot of time.
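As a rough outline of those steps (the paths are examples and the folder name is arbitrary):

mkdir /var/www/vhosts/yourdomain.com/httpdocs/dump
cp mybackup.sql bigdump.php /var/www/vhosts/yourdomain.com/httpdocs/dump/

Edit the database settings near the top of bigdump.php, then visit http://yourdomain.com/dump/bigdump.php in your browser to start the import. When it finishes, delete that folder, since it contains your credentials and your entire database.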

There are some smaller things that you can do to help reduce the size of your database, which we outlined in an article called 10 Tips For Keeping a Squeaky Clean WordPress Database. Backup your database and look into some of those to trim it down.

I’m sure there are other tricks to try to get past the limits for large files, so please feel free to post in the comments if you know of anything else that I’m forgetting or that might help other WordPress users.
