
Dapper Plus Options Explained: Everything You Can Customize
Handling large amounts of data efficiently is a common challenge in modern .NET applications. While Dapper is known for its simplicity and speed, it intentionally avoids higher-level features like bulk operations. Dapper Plus fills that gap by adding powerful bulk insert, update, delete, and merge capabilities — with a strong focus on performance and control.
What truly sets Dapper Plus apart is the depth of customization options it provides. These options allow fine-grained control over mappings, bulk behaviors, batching, logging, auditing, and more. This article explains those options clearly, with practical examples, and without diving into undocumented or incorrect behavior.
What Is Dapper Plus and Why Options Matter
Dapper Plus is a high-performance bulk operations library designed to work seamlessly with Dapper. It focuses on doing one thing extremely well: moving large volumes of data efficiently.
Dapper Plus options define how bulk operations behave. Instead of relying on rigid defaults, options allow control over:
- How entities map to tables and columns
- How rows are inserted, updated, deleted, or merged
- How batching and performance are handled
- How logging, auditing, and validation are applied
Options can be applied either:
- At the entity mapping level, or
- At the connection / transaction level
This flexibility ensures bulk operations remain predictable and safe even in complex systems.
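As a rough sketch of the two levels (the `Customer` entity, the open `connection`, and the `customers` list here are illustrative assumptions), the same kind of option can be registered once per entity or applied to a single operation:

```csharp
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;

// Entity mapping level: registered once, reused by every bulk
// operation that involves the Customer entity.
DapperPlusManager.Entity<Customer>()
    .Table("Customers")
    .UseBulkOptions(opt => opt.BatchSize = 5000);

// Connection level: applies only to the operation it is chained to.
connection.UseBulkOptions(opt => opt.BatchSize = 10000)
    .BulkInsert(customers);
```

When both levels are configured, the connection-level option applies to that call only, which makes it a good fit for one-off jobs that need different tuning than the application default.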
Entity Mapping and Table Configuration
Before any bulk operation can run, Dapper Plus must understand how an entity maps to a database table. This mapping step is fundamental — it defines where data goes, how columns are matched, and how rows are identified during insert, update, delete, or merge operations.
Dapper Plus uses explicit entity mapping to keep bulk operations fast, predictable, and safe.
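A minimal end-to-end sketch ties mapping and a bulk operation together (the `Customer` class, the `Customers` table, `connectionString`, and the `customers` list are assumptions for illustration):

```csharp
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
using Microsoft.Data.SqlClient;

// One-time mapping configuration, typically at application startup.
DapperPlusManager.Entity<Customer>()
    .Table("Customers")
    .Identity(c => c.Id);

// Afterward, any connection can run bulk operations using that mapping.
using var connection = new SqlConnection(connectionString);
connection.BulkInsert(customers);
```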
Defining the Destination Table
By default, Dapper Plus assumes the entity name matches the database table name. When this is not the case, the table must be explicitly configured.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Table("Customers");
This tells Dapper Plus exactly which table should be used for all bulk operations involving the Customer entity.
Mapping Entity Properties to Columns
Dapper Plus automatically maps properties to columns when names match. However, real-world databases often use different naming conventions. In those cases, explicit column mapping is required.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Map(c => c.Email, "EmailAddress");
This ensures the Email property is written to the EmailAddress column in the database.
Explicit mapping is especially useful when:
- Working with legacy schemas
- Following snake_case or prefixed column naming
- Only a subset of properties should be persisted
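For example, a legacy snake_case schema might be mapped with a chain of explicit column mappings (the table and column names here are hypothetical):

```csharp
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;

// Hypothetical legacy table using snake_case column names.
DapperPlusManager.Entity<Customer>()
    .Table("customer_accounts")
    .Map(c => c.FirstName, "first_name")
    .Map(c => c.LastName, "last_name")
    .Map(c => c.Email, "email_address");
```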
Ignoring Properties
Not every property on an entity needs to be mapped. Computed values, UI-only fields, or derived properties should be excluded from bulk operations.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Ignore(c => c.FullName);
Ignoring properties prevents accidental inserts or updates and keeps bulk operations focused only on relevant data.
Defining Keys
Keys play a critical role in Dapper Plus. They define how rows are matched during update, delete, and merge operations.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Key(c => c.Id);
Identity Columns
Identity columns indicate values that are generated by the database, typically auto-incrementing primary keys.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Identity(c => c.Id);
This tells Dapper Plus not to send values for that column during inserts unless explicitly configured otherwise.
Insert Behavior Customization
InsertIfNotExists (Natural Keys)
For scenarios where uniqueness is enforced by a natural key rather than an identity column, Dapper Plus supports conditional inserts.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Key(c => c.Email)
.UseBulkOptions(opt => opt.InsertIfNotExists = true);
This ensures rows are only inserted when they don’t already exist.
InsertKeepIdentity
By default, identity values are generated by the database. In migration or import scenarios, existing identity values may need to be preserved.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.Identity(c => c.Id)
.UseBulkOptions(opt => opt.InsertKeepIdentity = true);
This option allows identity values from the source to be respected.
Update and Merge Customization
Coalescing Values on Update
Sometimes updates should only apply when the source value is not null.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
using Microsoft.Data.SqlClient;
DapperPlusManager.Entity<Product>()
.Table("Product")
.Identity(x => x.Id)
.UseBulkOptions(opt => opt.CoalesceOnUpdateExpression = x => new { x.Name });
var connection = new SqlConnection(GetConnectionStringSqlServer());
connection.BulkUpdate(products);
This approach is ideal for partial updates and incremental data enrichment.
Fine-Grained Merge Control
BulkMerge combines insert and update logic. Dapper Plus allows precise control over what happens in each phase.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
using Microsoft.Data.SqlClient;
DapperPlusManager.Entity<Product>()
.Table("Product")
.Identity(x => x.Id)
.UseBulkOptions(opt => {
opt.IgnoreOnMergeInsertExpression = y => new { y.LastUpdatedDate };
opt.IgnoreOnMergeUpdateExpression = y => new { y.CreatedDate };
});
var connection = new SqlConnection(GetConnectionStringSqlServer());
connection.BulkMerge(products);
This ensures merge operations follow domain rules and avoid corrupting historical data.
Batch and Performance Options
Batching controls how many records are processed at once. Choosing the right batch size can significantly impact performance and memory usage.
Batch Size
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
connection.UseBulkOptions(opt =>
{
opt.BatchSize = 5000;
})
.BulkInsert(customers);
Larger batches improve throughput, while smaller batches reduce memory pressure.
Batch Timeout
Long-running bulk jobs may require custom timeout settings.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
connection.UseBulkOptions(opt =>
{
opt.BatchTimeout = 120;
})
.BulkMerge(products);
This helps prevent premature failures during large data operations.
Logging and Diagnostics
Visibility into bulk operations is critical, especially in production systems.
Enabling Logging
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
connection.UseBulkOptions(opt =>
{
opt.Log = message => Console.WriteLine(message);
})
.BulkInsert(customers);
This logs execution details, making troubleshooting far easier.
Log Dump for Post-Execution Analysis
Log dumps are particularly useful in background jobs or scheduled tasks.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
using System.Text;
var logDump = new StringBuilder();
connection.UseBulkOptions(opt =>
{
opt.UseLogDump = true;
opt.LogDump = logDump;
})
.BulkUpdate(orders);
var logs = logDump.ToString();
Temporary Table Configuration
Dapper Plus uses temporary tables internally for some operations. These behaviors can be customized.
In-Memory Temporary Tables
In-memory temporary tables can improve performance for medium-sized datasets.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
connection.UseBulkOptions(opt =>
{
opt.TemporaryTableIsMemory = true;
})
.BulkMerge(products);
Custom Temporary Table Schema
The schema used for temporary tables can also be customized, which is useful in environments with strict database permissions.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
connection.UseBulkOptions(opt =>
{
opt.TemporaryTableSchemaName = "zzz"; // example schema name
})
.BulkMerge(products);
Auditing and Result Tracking
Audit Tracking
Dapper Plus can record detailed audit information about bulk operations.
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
List<AuditEntry> auditEntries = [];
connection.UseBulkOptions(opt =>
{
opt.UseAudit = true;
opt.AuditEntries = auditEntries;
})
.BulkDelete(customers);
Console.WriteLine($"Total Audit Entries: {auditEntries.Count}");
Audit data is invaluable for compliance, debugging, and monitoring.
Rows Affected
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
ResultInfo resultInfo = new();
connection.UseBulkOptions(opt =>
{
opt.ResultInfo = resultInfo;
opt.UseRowsAffected = true;
})
.BulkDelete(customers);
Console.WriteLine($"Rows Affected: {resultInfo.RowsAffected}");
Console.WriteLine($"Rows Affected Deleted: {resultInfo.RowsAffectedDeleted}");
This confirms exactly how many rows were impacted.
Validation and Safety Options
Validation options help catch configuration issues early.
Validate All Source Mapped
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.UseBulkOptions(opt => opt.ValidateAllSourceMapped = true);
This ensures every source property is properly mapped.
Validate No Duplicate Keys
// @nuget: Z.Dapper.Plus
using Z.Dapper.Plus;
DapperPlusManager.Entity<Customer>()
.UseBulkOptions(opt => opt.ValidateNoDuplicateKey = true);
This protects against accidental duplicate key scenarios, especially during merges.
Summary
Dapper Plus options give developers full control over bulk data operations. By configuring mappings, keys, batching, logging, auditing, and validation correctly, it becomes possible to handle large datasets efficiently without sacrificing correctness or maintainability.
Takeaways
- Options define how Dapper Plus behaves
- Mapping and keys are critical for correctness
- Batch tuning improves performance
- Logging and auditing provide visibility
- Validation prevents silent data issues
