Optimizing Bulk Data Insertion into SQL Server with C# and SqlBulkCopy

Nov 27, 2025 · Programming

Keywords: SqlBulkCopy | Bulk Insert | SQL Server | C# | Performance Optimization

Abstract: This article explores efficient methods for inserting large datasets, such as 2 million rows, into SQL Server using C#. It focuses on the SqlBulkCopy class, providing code examples and performance optimization techniques including minimal logging and index management to enhance insertion speed and reduce resource consumption.

Introduction

Inserting large datasets into a database is a common challenge in data-intensive applications. For instance, loading 2 million rows from a text file into SQL Server requires efficient methods to avoid performance bottlenecks and resource exhaustion. This article discusses best practices, with a focus on the SqlBulkCopy class in C#.

Leveraging SqlBulkCopy for High-Performance Insertion

The SqlBulkCopy class in the System.Data.SqlClient namespace (Microsoft.Data.SqlClient in newer projects) is designed for bulk loading data into SQL Server tables. It bypasses much of the overhead associated with individual INSERT statements, making it ideal for large-scale data operations. Key options include TableLock, which takes a bulk update lock on the table for the duration of the operation instead of row locks; FireTriggers, which causes insert triggers to execute (by default they do not fire during a bulk copy); and UseInternalTransaction, which wraps each batch in its own transaction so a failed batch is rolled back.

Here is a basic example of using SqlBulkCopy:

using System.Data;
using System.Data.SqlClient;

string connectionString = "Your_Connection_String";
DataTable dataTable = new DataTable(); // assume this has been populated with the rows to insert

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
        connection,
        SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.FireTriggers | SqlBulkCopyOptions.UseInternalTransaction,
        null))
    {
        bulkCopy.DestinationTableName = "YourDestinationTable";
        bulkCopy.WriteToServer(dataTable);
    }
}

This code efficiently transfers data from a DataTable to the specified SQL Server table. For very large datasets, consider batching the data to manage memory usage, as described in the batching section below.
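When the source DataTable's column names or order differ from the destination table, explicit column mappings prevent silent misalignment. A minimal sketch, assuming illustrative source and destination column names:

```csharp
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "YourDestinationTable";

        // Map source columns to destination columns by name;
        // without mappings, columns are matched by ordinal position.
        bulkCopy.ColumnMappings.Add("SourceId", "Id");
        bulkCopy.ColumnMappings.Add("SourceName", "Name");

        bulkCopy.WriteToServer(dataTable);
    }
}
```

Once any mapping is added, every column you want copied must be mapped explicitly.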

Performance Optimization Techniques

To further enhance insertion speed, several optimizations can be applied. Switching the database to the BULK_LOGGED recovery model for the duration of the bulk operation enables minimal logging, which significantly reduces transaction log growth and improves throughput. Additionally, temporarily dropping non-clustered indexes before insertion and recreating them afterward can speed up the process, since maintaining indexes row by row during inserts is costly.
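These steps can be driven from the application around the bulk load. A hedged sketch, where the database, table, and index names are placeholders and the original recovery model should be restored when the load completes:

```csharp
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlCommand cmd = connection.CreateCommand())
    {
        // Enable minimal logging for the duration of the bulk load
        cmd.CommandText = "ALTER DATABASE YourDatabase SET RECOVERY BULK_LOGGED";
        cmd.ExecuteNonQuery();

        // Drop a non-clustered index that would slow the insert
        cmd.CommandText = "DROP INDEX IX_YourTable_Name ON dbo.YourDestinationTable";
        cmd.ExecuteNonQuery();

        // ... run SqlBulkCopy here ...

        // Recreate the index and restore full logging
        cmd.CommandText = "CREATE NONCLUSTERED INDEX IX_YourTable_Name ON dbo.YourDestinationTable (Name)";
        cmd.ExecuteNonQuery();
        cmd.CommandText = "ALTER DATABASE YourDatabase SET RECOVERY FULL";
        cmd.ExecuteNonQuery();
    }
}
```

After returning to the FULL recovery model, take a log backup so the log chain remains usable for point-in-time recovery.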

Other tips include:

- Set an explicit BatchSize on SqlBulkCopy so rows are committed in chunks rather than as one large batch.
- Stream data with the WriteToServer(IDataReader) overload instead of materializing everything in a DataTable.
- Increase BulkCopyTimeout (or set it to 0 for no limit) for long-running loads.
- Insert rows in clustered-index order to reduce page splits.

As a point of reference, for a load of roughly 40 million rows, combining these strategies reportedly reduced insertion time from 30-40 minutes to under 2 minutes in some cases.

Handling Batches for Memory Efficiency

When dealing with extremely large datasets, such as 2 million rows, loading all data into memory at once might not be feasible. A practical approach is to split the data into chunks and invoke SqlBulkCopy once per batch. Here's a simplified version:

// Process the DataTable in fixed-size chunks to bound memory per bulk copy call
int batchSize = 1000;
for (int i = 0; i < dataTable.Rows.Count; i += batchSize)
{
    DataTable batch = dataTable.Clone(); // copies the schema only, no rows
    for (int j = i; j < Math.Min(i + batchSize, dataTable.Rows.Count); j++)
    {
        batch.ImportRow(dataTable.Rows[j]);
    }

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
    {
        bulkCopy.DestinationTableName = "YourDestinationTable";
        bulkCopy.WriteToServer(batch);
    }
}

This method prevents memory overflow and allows for incremental processing.
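Note that when the full dataset already fits in memory, SqlBulkCopy's built-in BatchSize property achieves a similar commit-in-chunks effect without manual splitting; a short sketch, with the destination table name as a placeholder:

```csharp
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "YourDestinationTable";
    bulkCopy.BatchSize = 5000;     // send and commit every 5,000 rows
    bulkCopy.BulkCopyTimeout = 0;  // 0 means no timeout for long-running loads
    bulkCopy.WriteToServer(dataTable);
}
```

Manual chunking remains useful when the source itself is too large to materialize, for example when reading the text file line by line.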

Conclusion

In summary, for fast insertion of large datasets into SQL Server, SqlBulkCopy is the recommended tool in C#. Coupled with performance optimizations like minimal logging, index management, and batching, it can handle millions of rows efficiently. Always test with your specific environment to fine-tune parameters such as batch size and recovery models.

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.