Complete Guide to MySQL Database Export and Import from Command Line

Nov 13, 2025 · Programming

Keywords: MySQL | database_export | mysqldump | command_line | data_backup

Abstract: This comprehensive guide details the complete process of exporting and importing MySQL databases using the mysqldump command-line tool. It covers core scenarios including single database export, multiple database export, specific table export, remote export, and delves into advanced techniques such as compressed exports, user privilege migration, and handling large databases. Through detailed code examples and best practices, users will master essential skills for database backup, migration, and recovery.

Fundamentals of MySQL Database Export

In database management and migration processes, using command-line tools for data export is a fundamental and critical skill. mysqldump is the standard logical backup tool provided by MySQL, capable of exporting database structures and data into SQL text files for easy cross-system transfer and recovery.

Basic Export Commands

For exporting a single database, use the following command format:

mysqldump -u [username] -p [database_name] > db_backup.sql

After executing this command, the system will prompt for the password, avoiding plaintext passwords in command history. The exported SQL file will be saved in the current working directory.

Multiple Database Export

When backing up all databases on the server, use the --all-databases option:

mysqldump -u [username] -p --all-databases > all_db_backup.sql

This command exports every database the user can access into a single SQL file, including the mysql system database; note that information_schema and performance_schema are skipped by default.

Specific Table Export

For cases where only specific tables from a database need to be exported, specify table names after the database name:

mysqldump -u [username] -p [database_name] [table1] [table2] > table_backup.sql

This method is particularly useful for large databases where only partial data migration is needed.

Compressed Export

For large databases, pipe the output directly to compression:

mysqldump -u [username] -p [database_name] | gzip > db_backup.sql.gz

This approach significantly reduces storage space usage and network transfer time.
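Restoring from a compressed dump can be piped the same way, with no intermediate uncompressed file on disk. The mysql invocation below uses the same placeholder credentials as above; the pipe itself is easy to verify locally with a stand-in dump file:

```shell
# Restore directly from the compressed dump (placeholders, needs a live server):
#   gunzip < db_backup.sql.gz | mysql -u [username] -p [database_name]
# Verify the compress/decompress round trip with a stand-in dump file:
printf 'CREATE TABLE demo (id INT);\n' > demo.sql
gzip -c demo.sql > demo.sql.gz
gunzip -c demo.sql.gz | cmp - demo.sql && echo "round trip OK"
```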

Remote Database Export

When exporting data from a remote MySQL server, specify the host address and port:

mysqldump -P 3306 -h [IP_address] -u [username] -p [database_name] > db_backup.sql

Ensure the remote server allows connections and the user has appropriate access privileges.

Database Import Operations

Before importing a database, create the target database first:

mysql -u [username] -p -e "CREATE DATABASE [new_database_name]"

Then use the mysql client to import the data:

mysql -u [username] -p [new_database_name] < db_backup.sql

Advanced Export Options

For large databases using the InnoDB storage engine, the following option combination is recommended:

mysqldump -u [username] -p --single-transaction --quick [database_name] > backup.sql

The --single-transaction option creates a consistent snapshot within a transaction, avoiding table locks; the --quick option outputs data row by row, reducing memory usage.

User Privilege Migration

Standard database exports do not include user accounts and privilege information. Export user-related tables separately:

mysqldump -u root -p mysql user db tables_priv columns_priv procs_priv > users_privileges.sql

Import the file into the target server's mysql database, then execute FLUSH PRIVILEGES; so the reloaded grant tables take effect:

mysql -u root -p mysql < users_privileges.sql
mysql -u root -p -e "FLUSH PRIVILEGES;"

Performance Optimization Techniques

For large database imports, disable constraint checks before importing:

SET foreign_key_checks = 0;
SET unique_checks = 0;
SET autocommit = 0;

Re-enable after import completion:

SET foreign_key_checks = 1;
SET unique_checks = 1;
COMMIT;
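The session settings above must run in the same connection that performs the import. One convenient way to guarantee this is to wrap an existing dump file with them before loading it; the file names below are hypothetical, and the final mysql invocation is shown commented out because it needs a live server:

```shell
# Stand-in dump file (in practice this is the mysqldump output):
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > db_backup.sql
# Prepend and append the session settings so constraint checks are
# disabled only for the duration of this one import:
{
  printf 'SET foreign_key_checks = 0;\nSET unique_checks = 0;\nSET autocommit = 0;\n'
  cat db_backup.sql
  printf 'SET foreign_key_checks = 1;\nSET unique_checks = 1;\nCOMMIT;\n'
} > fast_import.sql
# mysql -u [username] -p [database_name] < fast_import.sql
```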

Common Issue Resolution

When encountering "Access denied; you need (at least one of) the PROCESS privilege(s)" errors, either add the --no-tablespaces option:

mysqldump -u [username] -p --no-tablespaces [database_name] > db_backup.sql

or grant the PROCESS privilege to the user:

GRANT PROCESS ON *.* TO '[username]'@'[host]';

Best Practice Recommendations

In production environments, regularly test backup files for integrity and recoverability. For critical business databases, establish an automated backup strategy combining regular full backups with incremental backups. Additionally, store backup files in a physical location separate from the production environment to avoid a single point of failure.
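A regular full backup can be as simple as a scheduled script that dumps, compresses, and rotates old files. The sketch below uses hypothetical paths, a hypothetical database name, and a retention count of 7; the mysqldump line is shown commented out because it needs a live server, so dummy files stand in for real dumps:

```shell
BACKUP_DIR=./backups   # assumption: local backup directory
DB=app_db              # assumption: database name
KEEP=7                 # assumption: number of backups to retain
mkdir -p "$BACKUP_DIR"
# mysqldump --single-transaction --quick "$DB" | gzip > "$BACKUP_DIR/${DB}_$(date +%Y%m%d_%H%M%S).sql.gz"
for i in 01 02 03 04 05 06 07 08 09 10; do   # simulate ten old dumps
  : > "$BACKUP_DIR/${DB}_202501${i}_000000.sql.gz"
done
# Delete everything beyond the KEEP newest (timestamped names sort lexicographically):
ls -1 "$BACKUP_DIR"/${DB}_*.sql.gz | sort -r | tail -n +$((KEEP + 1)) | xargs -r rm --
ls "$BACKUP_DIR" | wc -l   # 7 files remain
```

Running this from cron once per day gives a rolling seven-day window of compressed full backups.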

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.