How to export and import a large MySQL live database (from an SQL dump)

You need access to the local MySQL service plus the mysql and mysqldump clients. With a really big database you should consider stopping the MySQL service and copying the mysql/data files between instances instead; otherwise … try my holy grail!

Disclaimer

Your data is not sensitive (transfer history, banking records, etc.)
This solution is meant for analytics data
Your dump file is corrupted (I really don’t understand why mysqldump generates invalid SQL files ;/)
You don’t want to shut the service down for long (remember: MySQL blocks tables during an import)

Export database workflow

mysqldump -uuser -ppassword database_name > database_name.sql
tar -zcvf database_name.sql.tar.gz database_name.sql
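
If disk space on the source server is tight, you can skip the intermediate .sql file and compress while dumping; a minimal variant of the same step, with the same placeholder credentials:

# Stream the dump straight through gzip so no uncompressed file hits disk
mysqldump -uuser -ppassword database_name | gzip > database_name.sql.gz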

Transfer the database file between the servers.
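
For the transfer itself rsync is handy because it can resume an interrupted copy; a sketch, assuming a hypothetical target-server host:

# -P keeps partial transfers and shows progress, useful for multi-GB archives
rsync -avP database_name.sql.tar.gz user@target-server:/tmp/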

Import database workflow

Import the database while watching the progress of the import, logging errors, and forcing MySQL to skip errors (continue the import on errors).

Target server configuration: add these lines to my.cnf (more background write threads, and redo-log flushes relaxed to roughly once per second instead of on every commit, trading durability for bulk-load speed):
innodb_write_io_threads = 16
innodb_flush_log_at_trx_commit = 0
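
If you can’t edit my.cnf right away, innodb_flush_log_at_trx_commit is a dynamic variable, so it can be changed at runtime from a privileged account (innodb_write_io_threads, by contrast, only takes effect after a restart); a sketch with the same placeholder credentials:

# Relax redo-log flushing without a restart; set it back to 1 after the import
mysql -uusername -ppassword -e "SET GLOBAL innodb_flush_log_at_trx_commit = 0"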

Extract the dump file and, if you can take the MySQL service down for a few seconds, restart it with the doublewrite buffer disabled:
tar -zxvf database_name.sql.tar.gz
service mysql restart --innodb-doublewrite=0
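
To confirm the flag actually took effect, check the running value before starting the long import (same placeholder credentials):

# Should report OFF while the doublewrite buffer is disabled
mysql -uusername -ppassword -e "SHOW VARIABLES LIKE 'innodb_doublewrite'"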

Import the data (pv draws a progress bar based on the file size; the mysql client’s -f flag keeps going past errors). If you don’t have pv installed, do it now: aptitude install pv
pv database_name.sql | mysql -uusername -ppassword -f -D database_name > mysql.log 2>&1
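
If you kept the dump compressed, pv can still report progress by measuring the compressed stream; a variant under the same assumptions:

# Progress is measured against the .gz size; errors still land in mysql.log
pv database_name.sql.gz | gunzip | mysql -uusername -ppassword -f -D database_name > mysql.log 2>&1

In another terminal, tail -f mysql.log shows the errors being skipped as they happen.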
After the import finishes, restart MySQL again without the --innodb-doublewrite=0 flag:
service mysql restart
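
Finally, if you changed innodb_flush_log_at_trx_commit at runtime, restore durable commits (placeholder credentials again), and revert the my.cnf lines if you only needed them for the import:

# Back to flushing the redo log on every commit
mysql -uusername -ppassword -e "SET GLOBAL innodb_flush_log_at_trx_commit = 1"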