Resolving "New transaction is not allowed because there are other threads running in the session" Error in Entity Framework

Nov 23, 2025 · Programming

Keywords: Entity Framework | Transaction Conflict | Data Synchronization | SqlException | Performance Optimization

Abstract: This article provides an in-depth analysis of the common SqlException error "New transaction is not allowed because there are other threads running in the session" in Entity Framework. Through detailed code examples and principle analysis, it explains the conflict that arises when save operations are performed while a query is still being read inside a foreach loop, and offers effective solutions including pre-loading data into memory with ToList() and chunked query processing. The article also discusses performance differences and applicable scenarios for the solutions, helping developers fundamentally understand Entity Framework's data access mechanisms.

Problem Background and Error Analysis

When using Entity Framework for database operations, developers frequently encounter a typical SqlException error: System.Data.SqlClient.SqlException: New transaction is not allowed because there are other threads running in the session. This error typically occurs when attempting to perform save operations while data reading is still in progress.

Root Cause of the Error

The fundamental cause of this error lies in Entity Framework's data access mechanism. A LINQ query returns an IQueryable<T>, and iterating it in a foreach loop holds an active database reader open on the connection for the duration of the enumeration. If SaveChanges() is called while that reader is still open, Entity Framework must start a new transaction on the same connection, which SQL Server rejects (unless Multiple Active Result Sets are enabled), producing the error above.

Consider the following typical problematic code pattern:

// Deferred query: nothing is executed yet
var clientList = from a in _dbFeed.Client.Include("Auto") select a;
foreach (var client in clientList)   // enumeration holds a data reader open on _dbFeed
{
    // ... modify tracked entities ...
    _dbFeed.SaveChanges();  // Exception thrown here: the reader above is still active
}

In this example, clientList is a deferred-execution query: the foreach loop streams its results through a single data reader that stays open for the entire enumeration. When SaveChanges() is called inside the loop, Entity Framework attempts to start a new transaction on the same connection, but since the data reader is still active, the transaction conflict above occurs.

Core Solution: Pre-loading Data into Memory

The most direct and effective solution is to pre-load query results into memory before operating on the collection. This can be achieved by calling ToList() or ToArray() methods:

IList<Client> clientList = (from a in _dbFeed.Client.Include("Auto") select a).ToList();
foreach (var client in clientList)
{
    var companyFeedDetailList = (from a in _dbRiv.AutoNegotiationDetails 
                                where a.ClientID == client.ClientID 
                                select a).ToList();
    
    foreach (var companyFeedDetail in companyFeedDetailList)
    {
        // Execute business logic
        if (companyFeedDetail.FeedSourceTable.ToUpper() == "AUTO")
        {
            var company = (from a in _dbRiv.Company.Include("Product") 
                          where a.CompanyId == companyFeedDetail.CompanyId 
                          select a).First();
            
            // Process product synchronization logic
            foreach (var sourceProduct in client.Auto)
            {
                // In-memory search: company.Product was already loaded via Include("Product")
                bool found = company.Product.Any(
                    targetProduct => targetProduct.alternateProductID == sourceProduct.AutoID);
                
                if (!found)
                {
                    var newProduct = new Product
                    {
                        alternateProductID = sourceProduct.AutoID,
                        isFromFeed = true,
                        isDeleted = false,
                        SKU = sourceProduct.StockNumber
                    };
                    company.Product.Add(newProduct);
                }
            }
            _dbRiv.SaveChanges();  // Now executes normally
        }
    }
}

By converting query results to IList<T>, we effectively complete all database reading operations before the loop starts, thus avoiding conflicts between active readers and save operations.
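Applied back to the minimal problematic pattern from the beginning of the article (using a single context for clarity), the fix amounts to one ToList() call before the loop:

```csharp
// The reader opens, streams every row, and closes inside ToList()
var clientList = (from a in _dbFeed.Client.Include("Auto") select a).ToList();

foreach (var client in clientList)   // iterates an in-memory List<Client>
{
    // ... modify tracked entities ...
    _dbFeed.SaveChanges();           // safe: no data reader is open on the connection
}
```

The trade-off is memory: the entire result set is materialized up front, which motivates the chunked approach described next for large tables.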

Performance Optimization: Chunked Processing for Large Datasets

For scenarios involving large amounts of data, loading all data into memory at once may cause performance issues. In such cases, chunked processing can be employed:

public static class EntityFrameworkExtensions
{
    // Materializes the query one page at a time. The source query must have a
    // stable ordering (OrderBy), otherwise Skip/Take paging is nondeterministic.
    public static IEnumerable<T[]> QueryInChunks<T>(this IQueryable<T> query, int chunkSize)
    {
        int skipCount = 0;
        T[] chunk;

        do
        {
            // ToArray() fully materializes this page, so no reader stays open
            // between chunks and SaveChanges() can run safely in the caller.
            chunk = query.Skip(skipCount).Take(chunkSize).ToArray();
            if (chunk.Length > 0)
            {
                yield return chunk;
                skipCount += chunkSize;
            }
        } while (chunk.Length == chunkSize);
    }
}

// Using chunked processing (the OrderBy below gives Skip/Take a deterministic paging order)
var orderedClients = _dbFeed.Client.Include("Auto").OrderBy(c => c.ClientID);
foreach (var chunk in orderedClients.QueryInChunks(100))
{
    foreach (var client in chunk)
    {
        // Process each client
        var details = _dbRiv.AutoNegotiationDetails
            .Where(d => d.ClientID == client.ClientID)
            .ToList();
        
        foreach (var detail in details)
        {
            // Execute business logic
        }
    }
    _dbRiv.SaveChanges();  // Save after processing 100 records
}

This approach avoids transaction conflicts while reducing memory usage, making it particularly suitable for handling large-scale datasets.

Best Practice Recommendations

In actual development, it's recommended to follow these best practices:

  1. Define Clear Data Operation Boundaries: Clearly separate data reading and data modification operations, avoiding mixing them within the same data context.
  2. Use Transactions Appropriately: For scenarios requiring atomic operations, explicitly use transaction scopes:
    using (var transaction = _dbRiv.Database.BeginTransaction())
    {
        try
        {
            // Perform data operations
            _dbRiv.SaveChanges();
            transaction.Commit();
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
    }
  3. Monitor Performance: For large-scale data operations, use SQL Profiler to monitor generated SQL statements and ensure query efficiency.
  4. Error Handling: Add appropriate exception handling and retry mechanisms to data synchronization operations.
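The retry recommendation above can be sketched as follows. SaveWithRetry is a hypothetical helper (not part of the article's code), assuming EF6 and C# 6 exception filters; note that EF6 also ships built-in execution strategies (e.g. SqlAzureExecutionStrategy) for transient-fault retries:

```csharp
using System;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Threading;

// Hypothetical helper: retries SaveChanges() a few times when the failure
// is ultimately caused by a SqlException (e.g. a deadlock or timeout).
static void SaveWithRetry(DbContext context, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            context.SaveChanges();
            return;
        }
        catch (DbUpdateException ex)
            when (attempt < maxAttempts && ex.GetBaseException() is SqlException)
        {
            // Simple linear back-off before the next attempt; deadlocks and
            // timeouts are often transient and succeed on retry.
            Thread.Sleep(TimeSpan.FromMilliseconds(200 * attempt));
        }
    }
}
```

Only retry operations that are safe to repeat; if the save is part of a larger unit of work, combine this with the explicit transaction scope shown in point 2.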

Conclusion

Transaction conflict errors in Entity Framework typically stem from insufficient understanding of data access mechanisms. By pre-loading data into memory or adopting chunked processing strategies, these issues can be effectively resolved. Understanding Entity Framework's deferred execution characteristics and data context working mechanisms is crucial for writing robust data access code. In practical projects, appropriate solutions should be selected based on data volume and performance requirements.

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.