Efficient Data Import from Text Files to MySQL Database Using LOAD DATA INFILE

Nov 21, 2025 · Programming

Keywords: MySQL | Data Import | LOAD DATA INFILE | Text Files | Batch Processing

Abstract: This article provides a comprehensive guide on using MySQL's LOAD DATA INFILE command to import large text file data into database tables. Focusing on a 350MB tab-delimited text file, the article offers complete import solutions including basic command syntax, field separator configuration, line terminator settings, and common issue resolution. Through practical examples, it demonstrates how to import data from text_file.txt into the PerformanceReport table of the Xml_Date database, while comparing performance differences between LOAD DATA and INSERT statements to provide best practices for large-scale data import.

Overview of MySQL Data Import

In database management systems, efficiently importing external data into database tables is a common and crucial task. MySQL provides the specialized LOAD DATA INFILE command to handle large-scale data imports, offering significant performance advantages over traditional INSERT statements, particularly suitable for processing large data files.

Basic LOAD DATA INFILE Syntax

For importing the 350MB text_file.txt file into the PerformanceReport table of the Xml_Date database, the most basic command format is:

LOAD DATA INFILE '/tmp/text_file.txt' INTO TABLE PerformanceReport;

This command defaults to tab-separated fields and newline-terminated lines. Note that without the LOCAL keyword, the file is read by the server itself: it must reside on the database server host, be readable by the MySQL server process, and be located in a directory permitted by the secure_file_priv setting.
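Before a server-side import, it can be useful to check the secure_file_priv setting, which restricts where the server may read files from:

SHOW VARIABLES LIKE 'secure_file_priv';

An empty value means no restriction, a directory path means import files must be placed in that directory, and NULL means server-side LOAD DATA INFILE is disabled entirely.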

Field Separator and Line Terminator Configuration

While default settings work for most scenarios, explicit specification of separators may be necessary in specific environments:

LOAD DATA LOCAL INFILE '/tmp/text_file.txt' 
INTO TABLE PerformanceReport 
COLUMNS TERMINATED BY '\t' 
LINES TERMINATED BY '\n';

For files generated by different operating systems, line terminators may require corresponding adjustments. Windows systems typically use \r\n, while classic Mac OS used \r as the line terminator (modern macOS uses \n, like Linux).
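For example, a file exported from Windows would be imported with the terminator adjusted accordingly (path and table name as in the examples above):

LOAD DATA LOCAL INFILE '/tmp/text_file.txt' 
INTO TABLE PerformanceReport 
COLUMNS TERMINATED BY '\t' 
LINES TERMINATED BY '\r\n';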

Local File Access Permission Configuration

If permission errors occur when executing the LOAD DATA LOCAL INFILE command, local file access must be enabled on both sides of the connection: on the server, set local_infile=ON in the MySQL configuration file (or via SET GLOBAL local_infile = 1); on the client, start the mysql client with the --local-infile=1 option. In MySQL 8.0 and later, this capability is disabled by default for security reasons.
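A minimal sketch of enabling both sides (the username here is a placeholder):

SET GLOBAL local_infile = 1;

and, when connecting from the command line:

mysql --local-infile=1 -u username -p Xml_Date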

Selective Field Mapping

When the column order in the text file doesn't exactly match the database table structure, variable mapping can be used to specify field correspondences:

LOAD DATA LOCAL INFILE '/tmp/text_file.txt' INTO TABLE PerformanceReport
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4, @col5, @col6) 
SET field1 = @col1, field2 = @col2, field3 = @col3;

This approach allows flexible handling of column mapping relationships between source files and target tables, ignoring unnecessary columns or rearranging column order.
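If the source file begins with a header row, it can be skipped with the IGNORE ... LINES clause, shown here combined with the column-mapping syntax above:

LOAD DATA LOCAL INFILE '/tmp/text_file.txt' INTO TABLE PerformanceReport
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@col1, @col2, @col3, @col4, @col5, @col6) 
SET field1 = @col1, field2 = @col2, field3 = @col3;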

NULL Value Handling

Representing NULL values in text files requires using \N (backslash followed by capital N), which differs from directly using the NULL keyword in INSERT statements. For example, if a field value is empty, it should be represented as \N in the text file.
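Files exported by other tools often contain empty strings rather than \N. In that case, user variables combined with NULLIF can convert empty fields to NULL during the import (field and column names follow the earlier mapping example):

LOAD DATA LOCAL INFILE '/tmp/text_file.txt' INTO TABLE PerformanceReport
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1, @col2, @col3)
SET field1 = NULLIF(@col1, ''), field2 = NULLIF(@col2, ''), field3 = NULLIF(@col3, '');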

Performance Optimization Recommendations

For a large 350MB data file, import speed can be improved significantly by deferring index and constraint maintenance. For MyISAM tables, run ALTER TABLE ... DISABLE KEYS before the import and ALTER TABLE ... ENABLE KEYS afterward; for InnoDB tables, the unique_checks and foreign_key_checks session variables can be temporarily set to 0. Additionally, ensure the MySQL server has sufficient memory and disk I/O capacity to handle large-scale data import operations.
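A sketch of this pattern, assuming PerformanceReport is an InnoDB table (re-enable the checks immediately after the import completes):

SET unique_checks = 0;
SET foreign_key_checks = 0;

LOAD DATA LOCAL INFILE '/tmp/text_file.txt' INTO TABLE PerformanceReport
COLUMNS TERMINATED BY '\t' LINES TERMINATED BY '\n';

SET unique_checks = 1;
SET foreign_key_checks = 1;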

Error Handling and Verification

After import completion, warning messages in the command execution results should be checked. The SHOW WARNINGS command can display detailed warning information, helping identify data format mismatches or other import issues. Use SELECT COUNT(*) to verify whether the number of imported records matches the source file.
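A typical post-import check might look like this (the expected row count depends on the source file):

SHOW WARNINGS;
SELECT COUNT(*) FROM PerformanceReport;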

Comparison with INSERT Statements

Compared to sequentially executed INSERT statements, LOAD DATA INFILE has clear advantages in batch data processing. The former requires parsing each SQL statement, while the latter directly reads file content, reducing SQL parsing overhead, making it particularly suitable for large-scale data import scenarios involving hundreds of thousands or even millions of records.
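For contrast, the INSERT-based alternative issues one parsed statement per batch; even the multi-row form below (with placeholder values and the field names from the mapping example) still pays the SQL parsing and value-conversion costs that LOAD DATA avoids:

INSERT INTO PerformanceReport (field1, field2, field3) VALUES
('value1a', 'value2a', 'value3a'),
('value1b', 'value2b', 'value3b');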

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.