A trick for exporting & importing large MySQL Databases
When it comes to working with large MySQL databases, it can be a right pain to use phpMyAdmin or even software like Navicat or Sequel Pro. You tend to run into all sorts of issues, from upload limits to server timeouts. The best method I've found is to let the server do the hard work for you with a couple of simple commands. You can run these directly on the server console (not recommended) or over SSH (provided your user has the right permissions).
Exporting a Database
To export a database, use the mysqldump command and specify the username, password, and the database you'd like to dump, along with the location of the dump file. Make sure you add the --add-drop-table option: when the database is imported, the server will drop each table before recreating it and adding the data. This is important because without it, the import will fail with duplicate-key errors if the tables already exist and contain rows with the same unique keys.
mysqldump --add-drop-table -u "root" -p"db_password" db_name > /absolute/path/to/export/sql_file.sql
The -p"db_password" part is not a mistake: the password must come directly after the -p flag, with no space between them. If you add a space, the client will prompt for the password interactively and treat the next argument as a database name.
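You can see why the spacing matters with a quick pure-shell illustration (db_password and db_name are just the placeholder values from the example above):

```shell
# With a space, the shell splits `-p "db_password"` into two separate
# arguments, so the client would prompt for a password and treat
# "db_password" as the database name instead.
set -- -u "root" -p "db_password" db_name
echo "with a space:    $# arguments"   # 5 arguments: password detached from -p
set -- -u "root" -p"db_password" db_name
echo "without a space: $# arguments"   # 4 arguments: password glued to -p
```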
Importing a Database
To import a database, we use the mysql command and define our user, password, the database we'd like to import into, and the absolute path to the SQL file.
mysql -u "db_user" -p"db_password" db_name < /absolute/path/to/import/sql_file.sql
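On a related note, SQL dumps are mostly repetitive text and compress very well. A hedged variation on the two commands (assuming gzip is installed on the server) pipes the dump through gzip on export and back through gunzip on import, so the file on disk stays a fraction of the size:

```shell
# Export: mysqldump writes SQL to stdout, gzip compresses it on the fly,
# so no uncompressed .sql file ever touches the disk.
mysqldump --add-drop-table -u "root" -p"db_password" db_name | gzip > /absolute/path/to/export/sql_file.sql.gz

# Import: gunzip streams the decompressed SQL straight into the mysql client.
gunzip < /absolute/path/to/export/sql_file.sql.gz | mysql -u "db_user" -p"db_password" db_name
```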
That's all there is to it! The next time you're struggling to import that 1.5GB database (looking at you, WordPress!) give the command line a shot. It's not so scary once you get used to it. Just make sure you have a backup :D
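Speaking of backups, one cheap habit worth picking up: check that the dump file actually has content before you import it over a live database. A hypothetical sanity check (the path is taken from the import example above) might look like:

```shell
# A missing or zero-byte dump usually means the export itself failed,
# so bail out before mysql drops and recreates anything.
DUMP=/absolute/path/to/import/sql_file.sql
if [ -s "$DUMP" ]; then
    echo "dump looks sane: $(wc -c < "$DUMP") bytes"
else
    echo "dump file missing or empty -- not importing" >&2
fi
```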