Comprehensive Analysis of Converting DataReader to List<T> Using Reflection and Attribute Mapping

Nov 21, 2025 · Programming

Keywords: DataReader | Reflection Mapping | Attribute Mapping | C# Data Access | ORM Comparison

Abstract: This paper provides an in-depth exploration of various methods for efficiently converting DataReader to List<T> in C#, with particular focus on automated solutions based on reflection and attribute mapping. The article systematically compares different approaches including extension methods, reflection-based mapping, and ORM tools, analyzing their performance, maintainability, and applicable scenarios. Complete code implementations and best practice recommendations are provided to help developers select the most appropriate DataReader conversion strategy based on specific requirements.

Introduction

In .NET data access layer development, the conversion between IDataReader and strongly-typed collections List<T> represents a common requirement. While traditional manual mapping approaches are intuitive, they suffer from code redundancy, high maintenance costs, and susceptibility to errors. Building upon technical discussions from highly-rated Stack Overflow answers, this paper systematically analyzes several mainstream conversion solutions, with particular emphasis on automated approaches based on reflection and attribute mapping.

Core Challenges in DataReader Conversion

IDataReader, as a forward-only, read-only data stream reader, provides efficient access to database query results. However, converting it to a strongly-typed List&lt;T&gt; presents several key challenges. First, type-safe mapping requires proper handling of conversions between database DBNull values and .NET nullable types. Second, performance must be considered, particularly the memory and CPU overhead when processing large datasets. Finally, code maintainability necessitates avoiding repetition of similar mapping logic across multiple locations.
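
To make the DBNull issue concrete, here is a minimal, database-free sketch using an in-memory DataTable as a stand-in for a query result (the column name `id` is illustrative):

```csharp
using System;
using System.Data;

class DbNullDemo
{
    static void Main()
    {
        // Simulate a query result without a real database.
        var table = new DataTable();
        table.Columns.Add("id", typeof(int));
        table.Rows.Add(DBNull.Value); // a SQL NULL arrives as DBNull.Value

        using IDataReader reader = table.CreateDataReader();
        reader.Read();

        // reader["id"] holds DBNull.Value, not a CLR null, so a direct
        // cast to int? would throw; it must be translated explicitly.
        int? id = reader["id"] is DBNull ? null : (int?)reader["id"];
        Console.WriteLine(id.HasValue ? id.ToString() : "NULL");
    }
}
```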

Projection-Based Conversion Using Extension Methods

The first approach employs LINQ-style extension methods, achieving type-safe conversion through projection functions. Core implementation as follows:

public static IEnumerable<T> Select<T>(this IDataReader reader, Func<IDataReader, T> projection)
{
    while (reader.Read())
    {
        yield return projection(reader);
    }
}

Usage example:

using (IDataReader reader = command.ExecuteReader())
{
    List<Customer> customers = reader.Select(r => new Customer 
    {
        CustomerId = r["id"] is DBNull ? null : (int?)r["id"],
        CustomerName = r["name"] is DBNull ? null : r["name"].ToString()
    }).ToList();
}

This approach benefits from type safety and compile-time checking, but requires explicit DBNull handling for each property mapping, resulting in substantial code volume.
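
The per-property DBNull checks can be factored into small helpers. The following is a sketch; the `GetNullable` and `GetStringOrNull` names are my own, not part of the framework:

```csharp
using System;
using System.Data;

public static class ReaderValueExtensions
{
    // Returns null instead of throwing when the column holds DBNull.
    public static T? GetNullable<T>(this IDataRecord record, string column) where T : struct
    {
        object value = record[column];
        return value is DBNull ? null : (T?)value;
    }

    // Same idea for reference-typed string columns.
    public static string GetStringOrNull(this IDataRecord record, string column)
    {
        object value = record[column];
        return value is DBNull ? null : (string)value;
    }
}
```

With these helpers the projection from the previous example shrinks to `CustomerId = r.GetNullable<int>("id"), CustomerName = r.GetStringOrNull("name")`.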

Automated Mapping Using Reflection

The second approach utilizes reflection to achieve automatic matching between property names and database column names:

public static List&lt;T&gt; DataReaderMapToList&lt;T&gt;(IDataReader dr) where T : new()
{
    List&lt;T&gt; list = new List&lt;T&gt;();
    // Fetch the property set once; per-row reflection lookups are wasteful.
    PropertyInfo[] properties = typeof(T).GetProperties();
    while (dr.Read())
    {
        T obj = new T();
        foreach (PropertyInfo prop in properties)
        {
            // Note: dr[prop.Name] throws if no column with that exact name exists.
            if (prop.CanWrite &amp;&amp; dr[prop.Name] != DBNull.Value)
            {
                prop.SetValue(obj, dr[prop.Name]);
            }
        }
        list.Add(obj);
    }
    return list;
}

Usage pattern:

List<Customer> customers = DataReaderMapToList<Customer>(dataReader);

This method significantly reduces code volume but introduces performance overhead and naming constraints. Reflection operations exhibit slow initial execution and require exact matching between entity property names and database column names.
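
The exact-name requirement can be relaxed by first enumerating the columns the reader actually returned, so a missing column is skipped rather than throwing. A sketch of such a tolerant variant (with case-insensitive matching as an assumed policy):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

public static class SafeReaderMapper
{
    public static List<T> MapToList<T>(IDataReader dr) where T : new()
    {
        // Capture the columns this result set actually contains.
        var columns = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        for (int i = 0; i < dr.FieldCount; i++)
            columns.Add(dr.GetName(i));

        var props = typeof(T).GetProperties();
        var list = new List<T>();
        while (dr.Read())
        {
            var obj = new T();
            foreach (var prop in props)
            {
                // Skip read-only properties and columns absent from the result set.
                if (!prop.CanWrite || !columns.Contains(prop.Name))
                    continue;
                object value = dr[prop.Name];
                if (value != DBNull.Value)
                    prop.SetValue(obj, value);
            }
            list.Add(obj);
        }
        return list;
    }
}
```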

Advanced Attribute-Based Mapping Solution

The third approach combines reflection with custom attributes, providing more flexible column name mapping:

[AttributeUsage(AttributeTargets.Property)]
public class FieldAttribute : Attribute
{
    public string ColumnName { get; }
    public FieldAttribute(string columnName)
    {
        ColumnName = columnName;
    }
}

public static class DataReaderExtensions
{
    public static IEnumerable<T> AutoMap<T>(this IDataReader reader) where T : new()
    {
        var propertyMap = BuildPropertyMap<T>();
        while (reader.Read())
        {
            T obj = new T();
            foreach (var mapping in propertyMap)
            {
                if (reader[mapping.ColumnName] != DBNull.Value)
                {
                    mapping.Property.SetValue(obj, reader[mapping.ColumnName]);
                }
            }
            yield return obj;
        }
    }
    
    private static List&lt;PropertyMapping&gt; BuildPropertyMap&lt;T&gt;()
    {
        // Resolve each writable property to a column name;
        // a [Field] attribute overrides the property name.
        var mappings = new List&lt;PropertyMapping&gt;();
        foreach (var prop in typeof(T).GetProperties())
        {
            if (!prop.CanWrite) continue;
            var fieldAttr = prop.GetCustomAttribute&lt;FieldAttribute&gt;();
            mappings.Add(new PropertyMapping
            {
                Property = prop,
                ColumnName = fieldAttr?.ColumnName ?? prop.Name
            });
        }
        return mappings;
    }
}

public class PropertyMapping
{
    public PropertyInfo Property { get; set; }
    public string ColumnName { get; set; }
}

Entity class definition:

public class CustomerDTO
{
    [Field("id")]
    public int? CustomerId { get; set; }
    
    [Field("name")]
    public string CustomerName { get; set; }
}

Usage pattern:

using (IDataReader reader = command.ExecuteReader())
{
    List<CustomerDTO> customers = reader.AutoMap<CustomerDTO>().ToList();
}

This solution decouples database column names from entity property names through attribute mapping, offering superior flexibility and maintainability. By caching mapping relationships, significant performance improvements can be achieved for repeated executions.
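
For readers who want to try the attribute-based approach without a database, the pieces above can be assembled into one runnable sketch against an in-memory DataTable (names as in the article; the property map is built inline with LINQ for brevity):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Reflection;

[AttributeUsage(AttributeTargets.Property)]
public class FieldAttribute : Attribute
{
    public string ColumnName { get; }
    public FieldAttribute(string columnName) => ColumnName = columnName;
}

public class CustomerDTO
{
    [Field("id")] public int? CustomerId { get; set; }
    [Field("name")] public string CustomerName { get; set; }
}

public static class DataReaderExtensions
{
    public static IEnumerable<T> AutoMap<T>(this IDataReader reader) where T : new()
    {
        // Property-to-column map: [Field] wins, otherwise the property name.
        var map = typeof(T).GetProperties()
            .Where(p => p.CanWrite)
            .Select(p => (Property: p,
                          Column: p.GetCustomAttribute<FieldAttribute>()?.ColumnName ?? p.Name))
            .ToList();
        while (reader.Read())
        {
            var obj = new T();
            foreach (var (property, column) in map)
            {
                object value = reader[column];
                if (value != DBNull.Value)
                    property.SetValue(obj, value);
            }
            yield return obj;
        }
    }
}

class Demo
{
    static void Main()
    {
        // In-memory stand-in for a query result.
        var table = new DataTable();
        table.Columns.Add("id", typeof(int));
        table.Columns.Add("name", typeof(string));
        table.Rows.Add(1, "Alice");
        table.Rows.Add(DBNull.Value, "Bob"); // NULL id stays a CLR null

        using IDataReader reader = table.CreateDataReader();
        List<CustomerDTO> customers = reader.AutoMap<CustomerDTO>().ToList();
        Console.WriteLine(customers.Count);                 // 2
        Console.WriteLine(customers[0].CustomerName);       // Alice
        Console.WriteLine(customers[1].CustomerId == null); // True
    }
}
```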

Performance Optimization and Caching Strategies

The performance bottleneck in reflection operations primarily stems from repeated acquisition of type information. Implementing mapping caching can substantially enhance performance:

private static readonly ConcurrentDictionary<Type, List<PropertyMapping>> 
    PropertyCache = new ConcurrentDictionary<Type, List<PropertyMapping>>();

private static List<PropertyMapping> GetPropertyMappings<T>()
{
    return PropertyCache.GetOrAdd(typeof(T), type =>
    {
        var mappings = new List<PropertyMapping>();
        foreach (var prop in type.GetProperties())
        {
            var fieldAttr = prop.GetCustomAttribute<FieldAttribute>();
            string columnName = fieldAttr?.ColumnName ?? prop.Name;
            mappings.Add(new PropertyMapping 
            { 
                Property = prop, 
                ColumnName = columnName 
            });
        }
        return mappings;
    });
}

Comparison with Existing ORM Tools

While custom mapping solutions provide flexibility, mature ORM tools like Entity Framework, NHibernate, or Dapper may be more appropriate for complex projects. Dapper, as a lightweight ORM, achieves an excellent balance between performance and usability:

using var connection = new SqlConnection(connectionString);
var customers = connection.Query<Customer>("SELECT id, name FROM Customers").ToList();

Dapper achieves performance approaching that of raw ADO.NET through expression trees and pre-compilation, while providing strong-typing support and automatic mapping capabilities.

Best Practice Recommendations

Select a solution based on project requirements: extension-method projections suffice for simple, one-off conversions; attribute-based reflection mapping offers a good balance for medium-complexity projects; mature ORM frameworks are recommended for large enterprise applications. Whichever approach is chosen, attention should be paid to exception handling, resource disposal (closing readers and connections), and performance monitoring.

Conclusion

The conversion from DataReader to List<T> represents a fundamental operation in .NET data access layers. Automated solutions based on reflection and attribute mapping demonstrate clear advantages in reducing code redundancy and improving development efficiency. Through appropriate caching strategies and performance optimization, these solutions can meet the requirements of most application scenarios. However, when making architectural choices, comprehensive consideration should be given to project scale, team expertise, and long-term maintenance costs to select the most suitable technical solution.

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.