4 Ways to Copy MySQL Database to localhost

Today I spent a few hours copying one of my MySQL databases from a remote server to the localhost on my laptop. The WordPress database is quite large, at 300 MB+. The process was not smooth; I tried a few methods before finally succeeding in exporting and importing the MySQL database.

1. phpMyAdmin

phpMyAdmin was the first idea that came to mind for exporting the database, since I had the application installed on both the remote server and my local computer. In phpMyAdmin on the server, select the database and click the Export tab to save the database to the local machine. There are two export methods: Quick and Custom. Quick simply exports everything in the database; Custom lets you select specific tables to export and adjust a variety of settings. There are also several output formats to choose from. I selected the default format, SQL, and clicked Go.

For a small database, this method may work without issue. In my case, however, it ended with Fatal error: Allowed memory size of xxxxxxx bytes exhausted. This is understandable, as phpMyAdmin is built on PHP, and the settings in php.ini can create a bottleneck when exporting a large database. Here are three settings that can affect exporting a MySQL database.

memory_limit – Default value: 128M.

This sets the maximum amount of memory in bytes that a script is allowed to allocate. It helps prevent poorly written scripts from eating up all available memory on the server. To have no memory limit, set this directive to -1.

post_max_size – Default value: 8M.

This sets the maximum size of POST data allowed, which also affects file uploads. To upload large files, this value must be larger than upload_max_filesize. If the memory limit is enabled by your configure script, memory_limit also affects file uploading. Generally speaking, memory_limit should be larger than post_max_size.

max_execution_time – Default value: 30.

This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. It helps prevent poorly written scripts from tying up the server. When running PHP from the command line, the default is 0.

Of course, changing any of these settings requires access to php.ini on the remote server.
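If you do have that access, a minimal sketch of the adjustments might look like the following (the values are illustrative; size them to your dump, and restart the web server or PHP-FPM afterwards so they take effect):

memory_limit = 512M        ; let the export script allocate more memory
post_max_size = 512M       ; also matters when uploading a dump for import
upload_max_filesize = 512M ; keep this no larger than post_max_size
max_execution_time = 300   ; give the script time to finish

In my case, I left php.ini alone and tried another way.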

2. MySQL Workbench

MySQL Workbench is a visual database design tool that integrates SQL development, administration, database design, creation, and maintenance into a single integrated development environment for the MySQL database system. Download it from the MySQL website.

After installing MySQL Workbench on the local machine, follow these steps to connect to the remote database:

  • In Google, search for “my ip address” or visit whatismyipaddress.com to find your public IP address.
  • In cPanel, click “Remote MySQL” and add that IP address to the whitelist.
  • In MySQL Workbench, click Database->Manage Connections->New. Enter the hostname, username, password, etc. of the remote server to establish a new connection (you can test the connection from the command line first; see the check after this list).
  • In MySQL Workbench, click Server->Export to save the remote database to the local machine.
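
Before configuring Workbench, you can verify that the whitelist entry works with a quick test from a local terminal (the hostname and username are placeholders):

mysql -h yourdomain.com -u username -p -e "SELECT VERSION();"

If this prints the server version, Workbench should be able to connect with the same credentials.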

Similar to phpMyAdmin, there are two options here to export the database.

Export to Dump Project Folder – exports all tables, each to its own file in a folder.
Export to Self-Contained File – exports the selected tables to a single file. This is useful when a certain table is too large.
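
Incidentally, once the remote connection is whitelisted, you can get roughly the same result as the Self-Contained File option with mysqldump run from your local machine (a sketch with placeholder names; see the next section for the command-line workflow):

mysqldump -h yourdomain.com -u username -p database_name > database_name.sql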

3. Command Line

First, get SSH/shell access to the server, for example with PuTTY.
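
From a terminal (PuTTY offers the equivalent on Windows), the connection is simply the following; the username and hostname are placeholders:

ssh username@yourdomain.com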

Export the MySQL database on the command line

To dump the database and gzip it at the same time, use the following command:

mysqldump -u [username] -p [database_name] | gzip > [/path_to_file/database_name].sql.gz
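
For example, with a hypothetical user and database the command might look like this; you will be prompted for the MySQL password before the dump starts:

mysqldump -u dbuser -p wordpress_db | gzip > /home/dbuser/wordpress_db.sql.gz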

Import the MySQL database on the command line

Unzip the gz file:

gzip -d [/path_to_file/database_name].sql.gz

Import the SQL file into the database:

mysql -u username -p database_name < file.sql
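
Alternatively, you can skip the separate decompression step and stream the dump straight into mysql (placeholder names again):

gunzip < /path_to_file/database_name.sql.gz | mysql -u username -p database_name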

4. BigDump

BigDump performs a staggered import of large and very large MySQL dumps (such as phpMyAdmin dumps), even on web servers with a hard runtime limit or running in safe mode. The script executes only a small part of the huge dump and then restarts itself; the next session starts where the last one stopped.

  • Download and unzip bigdump.zip on your PC.
  • Open bigdump.php in a text editor and adjust the database configuration and dump file encoding (a sketch of the configuration is shown after this list).
  • Drop the old tables on the target database if your dump doesn’t contain “DROP TABLE” (use phpMyAdmin).
  • Create a working directory (e.g. dump) on your web server.
  • Upload bigdump.php and the dump files (*.sql or *.gz) via FTP to the working directory (take care of TEXT mode upload for bigdump.php and dump.sql but BINARY mode for dump.gz if uploading from MS Windows).
  • Run bigdump.php from your web browser via a URL like http://www.yourdomain.com/dump/bigdump.php.
  • Now you can select the file to be imported from the listing of your working directory. Click “Start import” to start.
  • BigDump will start every next import session automatically if JavaScript is enabled in your browser.
  • Relax and wait for the script to finish. Do NOT close the browser window!
  • IMPORTANT: Remove bigdump.php and your dump files from your web server.
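
For reference, the database settings you adjust in step 2 sit at the top of bigdump.php and look roughly like this in the version I used (variable names may differ in your copy; the values are placeholders):

$db_server   = 'localhost';     // MySQL host, usually localhost on the target server
$db_name     = 'database_name'; // database to import into
$db_username = 'username';      // MySQL user with write access to that database
$db_password = 'password';      // that user's password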
