Keywords: PHP | MySQL | SQL Import | Database Migration | File Processing
Abstract: This paper surveys technical solutions for importing SQL files into MySQL databases from PHP. Starting from common error cases, it explains in detail the implementation principles and applicable scenarios of several methods: executing system commands with the exec() function, line-by-line SQL file parsing, and mysqli_multi_query(). For SQL files of different sizes, corresponding optimization strategies and security recommendations are provided to help developers choose the most suitable import solution.
Problem Background and Error Analysis
In PHP development, database migration and backup recovery are common operational requirements. Users encounter errors when using the exec() function to call system commands for importing SQL files, specifically manifested as import process failures. Through error message analysis, several potential issues can be identified: system command execution permissions, file path resolution, database connection configuration, etc., all require careful examination.
Problems and Improvements in Traditional Methods
The original code uses the exec() function to directly call MySQL client commands:
$command = 'mysql -h' . $mysqlHostName . ' -u' . $mysqlUserName . ' -p' . $mysqlPassword . ' ' . $mysqlDatabaseName . ' < ' . $mysqlImportFilename;
$output = array(); // exec() requires a plain variable here, since $output is passed by reference
exec($command, $output, $worked);

Although this method is straightforward, it carries multiple security and compatibility risks. First, the plain-text password on the command line can be read by anyone able to inspect the system process list. Second, command parsing differs between operating systems, which can lead to path or argument recognition errors. Most importantly, this method depends entirely on an external MySQL client binary being available, which is often not the case in shared hosting environments.
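If the external client route must be used, a somewhat safer variant passes the password through the MYSQL_PWD environment variable and escapes every argument. This is a sketch, assuming the `mysql` binary is on PATH; the credentials and file names below are placeholders:

```php
<?php
// Sketch: safer construction of the import command.
// Credentials and file names are illustrative placeholders.
$mysqlHostName = 'localhost';
$mysqlUserName = 'root';
$mysqlPassword = 'secret';
$mysqlDatabaseName = 'dump';
$mysqlImportFilename = 'churc.sql';

// Pass the password via the environment so it never appears in the
// process list visible to other users.
putenv('MYSQL_PWD=' . $mysqlPassword);

// Escape every argument so special characters cannot break the command.
$command = sprintf(
    'mysql -h %s -u %s %s < %s',
    escapeshellarg($mysqlHostName),
    escapeshellarg($mysqlUserName),
    escapeshellarg($mysqlDatabaseName),
    escapeshellarg($mysqlImportFilename)
);

// To actually run it (requires the mysql client to be installed):
// exec($command, $output, $exitCode);
```

Checking the exit code after exec() (zero means success) remains essential, since errors in the shell redirection are otherwise silent.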
Implementation of Line-by-Line SQL File Parsing
To address the above issues, a pure PHP implementation of SQL file parsing can be adopted:
$filename = 'churc.sql';
$mysql_host = 'localhost';
$mysql_username = 'root';
$mysql_password = '';
$mysql_database = 'dump';
$connection = mysqli_connect($mysql_host, $mysql_username, $mysql_password);
if (!$connection) {
die('Database connection failed: ' . mysqli_connect_error());
}
mysqli_select_db($connection, $mysql_database) or die('Database selection failed: ' . mysqli_error($connection));
$templine = '';
$lines = file($filename);
foreach ($lines as $line) {
if (substr($line, 0, 2) == '--' || trim($line) == '') {
continue;
}
$templine .= $line;
if (substr(trim($line), -1, 1) == ';') {
if (!mysqli_query($connection, $templine)) {
echo 'SQL execution error: ' . mysqli_error($connection) . '<br>';
}
$templine = '';
}
}
echo 'Database import completed';
mysqli_close($connection);

The core idea of this method is to read the SQL file line by line, accumulate lines into a buffer until a complete statement (one ending with a semicolon) is found, and execute statements one at a time. Skipping comment lines (starting with --) and empty lines avoids wasted work. Note that file() reads the entire SQL file into memory at once, so extremely large files may run into PHP's memory limit.
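Before committing to the file()-based approach, a quick comparison of the file size against PHP's memory_limit can decide whether loading the whole file is safe. The helper below is a rough sketch; the 5x safety factor is an assumption, not a measured figure:

```php
<?php
// Sketch: decide whether a file can safely be loaded whole with file().
function canLoadWholeFile($filename, $safetyFactor = 5)
{
    $limit = ini_get('memory_limit');
    $bytes = (int) $limit;
    // Expand shorthand suffixes (e.g. "128M") via deliberate fall-through.
    switch (strtoupper(substr($limit, -1))) {
        case 'G': $bytes *= 1024;
        case 'M': $bytes *= 1024;
        case 'K': $bytes *= 1024;
    }
    if ($bytes <= 0) {
        return true; // memory_limit of -1 means "no limit"
    }
    // file() holds the raw bytes plus per-line array overhead, hence the factor.
    return filesize($filename) * $safetyFactor < $bytes;
}
```

If the check fails, the streaming strategy described in the next section is the safer choice.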
Large File Processing and Chunked Execution Strategy
For large SQL files that exceed memory limits, a streaming processing and chunked execution strategy is required:
// Assumes $connection is an open mysqli connection, established as in the previous example
$filename = 'large_dump.sql';
$maxRuntime = 30; // Maximum execution time (seconds)
$progressFile = $filename . '_progress';
$deadline = time() + $maxRuntime;
$filePosition = file_exists($progressFile) ? (int) file_get_contents($progressFile) : 0;
$fp = fopen($filename, 'r');
fseek($fp, $filePosition);
$queryCount = 0;
$currentQuery = '';
while ($deadline > time() && ($line = fgets($fp, 1024000))) {
if (substr($line, 0, 2) == '--' || trim($line) == '') {
continue;
}
$currentQuery .= $line;
if (substr(trim($line), -1) == ';') {
if (!mysqli_query($connection, $currentQuery)) {
$error = 'Query execution failed: ' . mysqli_error($connection);
file_put_contents($filename . '_error', $error);
break;
}
$currentQuery = '';
$queryCount++;
file_put_contents($progressFile, ftell($fp));
}
}
if (feof($fp)) {
echo 'Import completed, total ' . $queryCount . ' queries executed';
unlink($progressFile);
} else {
echo 'Progress: ' . round(ftell($fp) / filesize($filename) * 100, 2) . '%, ' . $queryCount . ' queries executed';
}
fclose($fp);

This solution achieves resumable imports by saving the file pointer position after each executed statement. Even if the script hits its execution time limit, the next run continues from the last interruption point. Combined with a browser auto-refresh mechanism, it can gradually complete the import of extremely large files.
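The browser auto-refresh mechanism mentioned above can be sketched as follows. The progress-file naming convention matches the snippet, but the helper itself is an illustrative assumption:

```php
<?php
// Sketch: keep reloading the page until the import loop deletes the
// progress file, which signals completion.
function refreshMarkup($progressFile)
{
    if (file_exists($progressFile)) {
        // Import still running: reload the same script after one second
        // so the next request resumes from the saved offset.
        return '<meta http-equiv="refresh" content="1">Import in progress...';
    }
    // The import loop unlinks the progress file when it reaches EOF.
    return 'Import complete';
}

echo refreshMarkup('large_dump.sql_progress');
```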
Optimized Solution Using mysqli_multi_query
For files containing multiple SQL statements, mysqli_multi_query() provides a more efficient batch execution approach:
$sqlContent = file_get_contents('mysqldump.sql');
$mysqli = new mysqli('localhost', 'root', 'password', 'testdb');
if ($mysqli->connect_error) {
die('Connection failed: ' . $mysqli->connect_error);
}
if ($mysqli->multi_query($sqlContent)) {
do {
if ($result = $mysqli->store_result()) {
$result->free();
}
} while ($mysqli->more_results() && $mysqli->next_result());
}
if ($mysqli->errno) {
echo 'Execution error: ' . $mysqli->error;
} else {
echo 'All queries executed successfully';
}
$mysqli->close();

This method submits all SQL statements in the file in a single call, and the loop over the multiple result sets ensures every query is fully processed. Compared with executing statements one by one, it reduces client-server round trips and improves overall performance.
Security Considerations and Best Practices
In practical applications, several key points need attention: file upload security verification, SQL injection protection, error logging, transaction processing mechanisms, etc. It is recommended to add comprehensive exception handling and data validation logic in production environments to ensure the reliability and security of the import process.
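As a concrete illustration of the upload-validation point, a minimal check of a $_FILES entry might look like the sketch below. The size limit and the .sql extension whitelist are assumptions to be adapted per project:

```php
<?php
// Sketch: basic validation of an uploaded SQL file before importing it.
function validateSqlUpload(array $upload, $maxBytes = 52428800) // 50 MB default
{
    if (!isset($upload['error']) || $upload['error'] !== UPLOAD_ERR_OK) {
        return false; // the upload itself failed
    }
    if ($upload['size'] <= 0 || $upload['size'] > $maxBytes) {
        return false; // empty or oversized file
    }
    // Accept only a .sql extension; a content sniff could be layered on top.
    $ext = strtolower(pathinfo($upload['name'], PATHINFO_EXTENSION));
    return $ext === 'sql';
}
```

On the database side, wrapping the statement loop in $mysqli->begin_transaction() and $mysqli->commit(), with $mysqli->rollback() on failure, keeps a partially failed import from leaving the schema half-migrated (provided the statements are transactional; MySQL DDL statements commit implicitly).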