Centeva.Auditing
4.1.0
dotnet add package Centeva.Auditing --version 4.1.0
Centeva.Auditing
This library provides audit entity models and Entity Framework Core configurations for SQL Server auditing. It targets EF Core on .NET 8 (C# 12) and is intended to be consumed via NuGet by other applications.
Features
- Audit & AuditDetail Models: Standardized entities for audit logging.
- EntityTypeConfigurations: Fluent API configurations for the `Audit` and `AuditDetail` tables.
- AuditIgnore Helpers: Utilities for ignoring tables, schemas, or columns during audit operations.
- AuditScriptCreator: Generates SQL scripts for audit triggers, supporting both single-table and dual-table audit logging.
- AuditMigrationHelper: Automated migration utility to convert existing single-table audit data to dual-table architecture.
- Convention Support: Includes conventions for automatic trigger creation in EF Core.
Audit Trigger Script Generation
Single-table vs. Dual-table Audit Logging
The AuditScriptCreator supports two modes for audit logging:
- Single-table mode (`UseSingleTable = true`): All audit data, including field-level changes, is logged directly to the `Audit` table. This mode is intended for legacy applications and is not recommended for new development.
- Dual-table mode (`UseSingleTable = false`): Summary audit data is logged to the `Audit` table, and field-level changes are logged to the `AuditDetail` table. This is the recommended approach for new applications.
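To make the difference concrete, here is an illustrative sketch of how a single UPDATE that changed two columns might be recorded in each mode (column names other than `FieldName`, `OldValue`, and `NewValue` are hypothetical):

```sql
-- Single-table mode: one Audit row per changed field
-- AuditId | TableName | FieldName | OldValue      | NewValue
--       1 | Customer  | Name      | 'Ann'         | 'Anne'
--       2 | Customer  | Email     | 'ann@example' | 'anne@example'

-- Dual-table mode: one summary row in Audit, field changes in AuditDetail
-- Audit:        AuditId | TableName
--                     1 | Customer
-- AuditDetail:  AuditId | FieldName | OldValue      | NewValue
--                     1 | Name      | 'Ann'         | 'Anne'
--                     1 | Email     | 'ann@example' | 'anne@example'
```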
Configuring Audit Mode
Set the UseSingleTable property in AuditConfig to select the desired mode:
using Centeva.Auditing.Helpers;
var config = new AuditConfig(
    schema: "dbo",
    auditTable: "Audit",
    auditDetailTable: "AuditDetail",
    connectionString: "your-connection-string"
)
{
UseSingleTable = false // Recommended for new applications
};
Note: Setting `UseSingleTable = true` is only supported for legacy applications. For new applications, set this to `false`.
Generating Trigger Scripts
Use the AuditScriptCreator class to generate SQL scripts for audit triggers.
using Centeva.Auditing.Helpers;
var config = new AuditConfig(
    schema: "dbo",
    auditTable: "Audit",
    auditDetailTable: "AuditDetail",
    connectionString: "your-connection-string"
)
{
UseSingleTable = false // Recommended for new applications
};
using var creator = new AuditScriptCreator(config);
foreach (var script in creator.GetTriggerScripts(/* optional AuditIgnore rules */))
{
// Execute script against your database
}
Note: The `AuditScriptCreator` class manages the connection lifecycle internally and implements `IDisposable`. Always use a `using` statement or call `Dispose()` when finished.
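As a purely illustrative sketch (the actual scripts are produced by `AuditScriptCreator` and will differ), a generated dual-table trigger follows roughly this shape: a summary row into `Audit`, then one `AuditDetail` row per modified column. The table name and all column names here are hypothetical:

```sql
-- Hypothetical sketch only; real triggers are generated by AuditScriptCreator.
CREATE OR ALTER TRIGGER dbo.TR_Customer_Audit
ON dbo.Customer
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- 1) Summary row in Audit (column names are illustrative)
    INSERT INTO dbo.Audit (TableName, AuditDate)
    SELECT 'Customer', SYSUTCDATETIME();

    -- 2) One AuditDetail row per changed column, linked to the summary row:
    -- INSERT INTO dbo.AuditDetail (AuditId, FieldName, OldValue, NewValue)
    -- SELECT SCOPE_IDENTITY(), 'Name', d.Name, i.Name
    -- FROM deleted d JOIN inserted i ON i.Id = d.Id
    -- WHERE ISNULL(d.Name, '') <> ISNULL(i.Name, '');
END
```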
Migrating from Single-Table to Dual-Table Architecture
The AuditMigrationHelper provides automated migration from single-table to dual-table audit architecture for legacy applications. This utility handles schema changes, data migration, trigger updates, and cleanup in a single operation.
Migration Process Overview
The migration performs the following steps automatically:
1. Schema Creation: Creates the `AuditDetail` table with appropriate indexes and foreign key constraints.
2. Data Migration: Migrates field-level audit data (`FieldName`, `OldValue`, `NewValue`) from the `Audit` table to the new `AuditDetail` table in configurable batches.
3. Trigger Updates: Drops existing single-table triggers and creates new dual-table triggers.
4. Cleanup: Removes obsolete columns (`FieldName`, `OldValue`, `NewValue`) from the `Audit` table.
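Conceptually, each batch of the data-migration step copies field-level rows keyed above the last migrated `AuditId`. This is only a sketch of the idea, not the helper's actual SQL:

```sql
-- Sketch of one migration batch (the helper generates the real statements).
-- Unmigrated rows are those whose FieldName is still populated.
INSERT INTO dbo.AuditDetail (AuditId, FieldName, OldValue, NewValue)
SELECT TOP (100000) a.AuditId, a.FieldName, a.OldValue, a.NewValue
FROM dbo.Audit AS a WITH (READPAST)      -- READPAST minimizes blocking
WHERE a.AuditId > @LastMigratedAuditId   -- resume point tracked by the helper
  AND a.FieldName IS NOT NULL
ORDER BY a.AuditId;
```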
Using AuditMigrationHelper
using Centeva.Auditing.Helpers;
using Microsoft.Extensions.Logging;
var config = new AuditConfig(
schema: "dbo",
auditTable: "Audit",
auditDetailTable: "AuditDetail",
connectionString: "your-connection-string"
);
// Optional: Create a logger for progress tracking
var loggerFactory = LoggerFactory.Create(builder => builder.AddConsole());
var logger = loggerFactory.CreateLogger<Program>();
try
{
await AuditMigrationHelper.MigrateSingleToDualTable(
config: config,
logger: logger,
batchSize: 100_000, // Optional: default is 100,000 rows per batch
cancellationToken: CancellationToken.None
);
Console.WriteLine("Migration completed successfully!");
}
catch (Exception ex)
{
Console.WriteLine($"Migration failed: {ex.Message}");
}
Migration Features
- Resumable Migration: If the migration is interrupted, it can be safely re-run. The helper tracks the last migrated `AuditId` and resumes from that point.
- Batch Processing: Large datasets are processed in configurable batches to avoid memory issues and long-running transactions.
- Safe Cleanup: Obsolete columns are only dropped after confirming all data has been successfully migrated.
- Logging Support: An optional `ILogger` parameter provides detailed progress tracking and error reporting.
- Transaction Safety: Internal operations use transactions where appropriate to maintain data consistency.
Migration Considerations
Before Running Migration:
- Backup Your Database: Always create a full database backup before running the migration.
- Schedule Downtime: While the migration can run on a live database, consider scheduling maintenance windows for large datasets to avoid performance impact.
- Test on Non-Production: Test the migration on a development or staging environment first.
- Review Disk Space: Ensure adequate disk space for the new `AuditDetail` table, which will initially duplicate field-level data.
During Migration:
- The migration processes data in batches to minimize locking and memory usage.
- Existing audit data in the `Audit` table is marked as migrated by setting `FieldName`, `OldValue`, and `NewValue` to `NULL`.
- Progress is logged at configurable intervals (via the `ILogger` parameter).
After Migration:
- Verify that all audit data has been successfully migrated by querying both tables.
- Update your application's `AuditConfig` to set `UseSingleTable = false`.
- Monitor your application to ensure dual-table triggers are working correctly.
- The obsolete columns (`FieldName`, `OldValue`, `NewValue`) are automatically dropped from the `Audit` table.
Re-running Failed Migrations
If a migration fails or is interrupted:
- The `AuditDetail` table and any migrated data remain intact.
- Unmigrated records in the `Audit` table still have their `FieldName` values populated.
- Simply re-run `MigrateSingleToDualTable` with the same configuration; it will resume from where it left off.
- Obsolete columns will only be dropped once all data is successfully migrated.
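Progress can also be inspected directly in SQL before re-running. The state table's exact column layout is internal, so `SELECT *` is used here:

```sql
-- Rows not yet migrated still carry field-level data:
SELECT COUNT(*) AS RemainingRows
FROM dbo.Audit
WHERE FieldName IS NOT NULL;

-- The helper's resume point lives in its state table:
SELECT * FROM dbo.[__AuditMigrationState];
```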
Example: Migration with Progress Tracking
using Centeva.Auditing.Helpers;
using Microsoft.Extensions.Logging;
var config = new AuditConfig(
schema: "dbo",
auditTable: "Audit",
auditDetailTable: "AuditDetail",
connectionString: "Server=localhost;Database=MyApp;Integrated Security=true;TrustServerCertificate=true"
);
var loggerFactory = LoggerFactory.Create(builder =>
{
builder
.AddConsole()
.SetMinimumLevel(LogLevel.Information);
});
var logger = loggerFactory.CreateLogger("AuditMigration");
await AuditMigrationHelper.MigrateSingleToDualTable(
config: config,
logger: logger,
batchSize: 50_000, // Process 50,000 rows at a time
cancellationToken: CancellationToken.None
);
Getting Started
1. Install the Package
Add the NuGet package to your project:
dotnet add package Centeva.Auditing
2. Add Entity Configurations to Your DbContext
In your application's DbContext, register the provided configurations in OnModelCreating:
using Centeva.Auditing.Configurations;
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
// For single-table audit logging:
modelBuilder.ApplyConfiguration(new AuditSingleTableConfiguration(auditSchema, auditTable));
// For dual-table audit logging:
modelBuilder.ApplyConfiguration(new AuditDualTableConfiguration(auditSchema, auditTable));
modelBuilder.ApplyConfiguration(new AuditDetailConfiguration());
// Register other configurations as needed
}
3. Use the Audit Models
Reference the `AuditSingleTable` entity, or the `AuditDualTable` and `AuditDetail` entities, in your code as needed:
using Centeva.Auditing.Models;
// Example usage for single-table audit logging
public DbSet<AuditSingleTable> Audit { get; set; }
// OR ... for dual-table audit logging
public DbSet<AuditDualTable> Audit { get; set; }
public DbSet<AuditDetail> AuditDetail { get; set; }
Customizing Table and Schema Names
You can configure the schema and table names for the Audit and AuditDetail entities at runtime, allowing you to match your existing database without requiring a migration.
1. Add Settings to appsettings.json
{
"Auditing": {
"AuditSchema": "dbo",
"AuditTable": "Audit",
"AuditDetailTable": "AuditDetail"
}
}
2. Pass Settings to Configuration Classes
Update your DbContext to read these values and pass them to the configuration classes:
using Microsoft.Extensions.Configuration;
using Centeva.Auditing.Configurations;
public class YourDbContext(DbContextOptions<YourDbContext> options, IConfiguration configuration) : DbContext(options)
{
private readonly IConfiguration _configuration = configuration;
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
var auditSchema = _configuration["Auditing:AuditSchema"] ?? "dbo";
var auditTable = _configuration["Auditing:AuditTable"] ?? "Audit";
var auditDetailTable = _configuration["Auditing:AuditDetailTable"] ?? "AuditDetail";
// For single-table audit logging:
modelBuilder.ApplyConfiguration(new AuditSingleTableConfiguration(auditSchema, auditTable));
// For dual-table audit logging:
modelBuilder.ApplyConfiguration(new AuditDualTableConfiguration(auditSchema, auditTable));
modelBuilder.ApplyConfiguration(new AuditDetailConfiguration(auditSchema, auditDetailTable));
}
}
Note: The configuration classes (`AuditSingleTableConfiguration`, `AuditDualTableConfiguration`, `AuditDetailConfiguration`) must accept schema and table names as constructor parameters.
3. No Migration Needed
As long as the schema and table names you specify match your existing database, no migration is required.
Configuration Reference
AuditConfig
- Controls schema, table names, connection string, and audit mode.
- `UseSingleTable`: `true` — all audit data goes to the `Audit` table (legacy support); `false` — summary data goes to the `Audit` table and field-level changes to the `AuditDetail` table (recommended).
AuditSingleTableConfiguration
- Maps the `AuditSingleTable` entity to the `Audit` table.
- Configures columns, types, and relationships.
AuditDualTableConfiguration
- Maps the `AuditDualTable` entity to the `Audit` table.
- Configures columns, types, and relationships.
AuditDetailConfiguration
- Maps the `AuditDetail` entity to the `AuditDetail` table.
- Configures columns, types, and relationships.
AuditIgnore Helpers
- Use `AuditIgnore.Create(schema)`, `AuditIgnore.Create(schema, table)`, or `AuditIgnore.Create(schema, table, column)` to specify audit exclusions.
Example: Dependency Injection Setup
If you use dependency injection, register your DbContext and any audit-related services as usual:
services.AddDbContext<YourDbContext>(options => options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
Usage Notes
- Migrations: If you change table names in the configuration, EF Core will generate a migration to rename the table. Table renames are fast metadata operations in SQL Server.
- Column Types: The default configuration uses `nvarchar(max)` for `OldValue` and `NewValue` to support large audit values.
- Extensibility: You can extend the models or configurations as needed for your application.
- Legacy Support: Single-table audit logging is available for legacy applications. For new development, dual-table logging is recommended.
Compatibility
- .NET 8
- C# 12
- Microsoft.EntityFrameworkCore 8.x
Migrating from Single-Table to Dual-Table Audit Logging
If your application currently uses single-table audit logging, you can migrate to the recommended dual-table approach using the AuditMigrationHelper utility.
📚 For comprehensive migration documentation, see the Migration Documentation Index, which includes:
- Complete step-by-step migration guide
- Performance optimization details
Why Migrate?
The dual-table architecture provides:
- Better Performance: Separating audit metadata from field-level changes improves query performance
- Scalability: More efficient for databases with hundreds of millions of audit records
- Maintainability: Cleaner data model with proper normalization
- Flexibility: Easier to query audit events without field-level details
Migration Process
The migration utility handles:
- ✅ Schema Creation: Creates the `AuditDetail` table with proper indexes and foreign keys
- ✅ Data Migration: Migrates existing audit data in batches (default: 100,000 records)
- ✅ Trigger Updates: Regenerates all audit triggers to use the dual-table format
- ✅ Cleanup: Drops obsolete columns (`FieldName`, `OldValue`, `NewValue`) from the `Audit` table
- ✅ Resumability: Tracks progress and can resume after interruption
- ✅ Minimal Locking: Uses `READPAST` and bulk insert for optimal performance
Prerequisites
Before migrating, ensure:
- ⚠️ Backup your database - Always backup before running migration
- ✅ Database permissions - User needs `CREATE TABLE`, `ALTER TABLE`, `INSERT`, and `UPDATE` permissions
- ✅ Maintenance window - Plan for downtime during trigger updates (typically < 5 minutes)
- ✅ Disk space - Ensure sufficient space for the `AuditDetail` table (approximately the same size as the current `Audit` table)
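To gauge the disk-space requirement, the current `Audit` table's footprint can be checked with the standard `sp_spaceused` procedure:

```sql
-- Reports row count, reserved, data, and index sizes for the Audit table
EXEC sp_spaceused N'dbo.Audit';
```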
Basic Migration
using Centeva.Auditing.Helpers;
using Microsoft.Extensions.Logging;
// Configure audit settings
var config = new AuditConfig(
schema: "dbo",
auditTable: "Audit",
auditDetailTable: "AuditDetail",
connectionString: "your-connection-string",
alwaysUpdateTriggers: true
);
// Run migration (basic usage without logging)
await AuditMigrationHelper.MigrateSingleToDualTable(config);
Console.WriteLine("Migration completed successfully!");
Migration with Logging (Recommended)
For production environments, use an ILogger to track progress:
using Centeva.Auditing.Helpers;
using Microsoft.Extensions.Logging;
// Setup logging
var loggerFactory = LoggerFactory.Create(builder =>
{
builder.AddConsole();
builder.SetMinimumLevel(LogLevel.Information);
});
var logger = loggerFactory.CreateLogger("AuditMigration");
// Configure audit settings
var config = new AuditConfig(
schema: "dbo",
auditTable: "Audit",
auditDetailTable: "AuditDetail",
connectionString: Configuration.GetConnectionString("DefaultConnection"),
alwaysUpdateTriggers: true
);
try
{
// Run migration with progress logging
await AuditMigrationHelper.MigrateSingleToDualTable(
config,
logger: logger,
batchSize: 100_000, // Process 100K records per batch
cancellationToken: cancellationToken
);
logger.LogInformation("Migration completed successfully!");
}
catch (InvalidOperationException ex) when (ex.Message.Contains("already been completed"))
{
logger.LogWarning("Migration was already completed previously.");
}
catch (Exception ex)
{
logger.LogError(ex, "Migration failed: {Message}", ex.Message);
throw;
}
Customizing Batch Size and Performance
For databases with limited resources or high activity, adjust the batch size:
// Smaller batches = less memory, more frequent commits
await AuditMigrationHelper.MigrateSingleToDualTable(
config,
logger: logger,
batchSize: 50_000 // Smaller batches for resource-constrained environments
);
// Larger batches = better performance, more memory
await AuditMigrationHelper.MigrateSingleToDualTable(
config,
logger: logger,
batchSize: 250_000 // Larger batches for high-performance systems
);
Maximum Performance for Very Large Databases (100M+ records)
For databases with 100M+ records, you can explicitly defer foreign key creation:
// Explicit performance mode - defers FK creation until after bulk insert
await AuditMigrationHelper.MigrateSingleToDualTable(
config,
logger: logger,
batchSize: 100_000,
deferForeignKey: true, // Explicitly defer FK for maximum performance
cancellationToken: cancellationToken
);
Auto-Detection: The migration automatically detects databases with more than 50M records and defers FK creation for optimal performance. The `deferForeignKey` parameter allows explicit control when needed.
Performance Comparison (200M records):
- With immediate FK: ~40 hours
- With deferred FK: ~25 hours (37% faster)
Audit Logging Gap
Important: During data migration (Step 5), audit triggers are temporarily disabled to prevent data inconsistency. This creates an audit logging gap.
- Duration: Depends on database size (hours for 100M+ records)
- Recommendation: Run during off-peak hours or maintenance window
- Impact: No audit records captured during bulk insert phase
The migration will log clear warnings when audit logging is disabled and re-enabled.
Resumable Migration
The migration automatically tracks progress. If interrupted, it will resume from where it left off:
// First attempt - processes 500K records, then fails
await AuditMigrationHelper.MigrateSingleToDualTable(config, logger);
// Output: "Batch 1 completed: 100000/1000000 records migrated (10.00%)"
// Output: "Batch 2 completed: 200000/1000000 records migrated (20.00%)"
// ... (interrupted at 500K)
// Second attempt - resumes from 500K
await AuditMigrationHelper.MigrateSingleToDualTable(config, logger);
// Output: "Resuming migration from AuditId: 500000"
// Output: "Total records already migrated: 500000"
// Output: "Batch 6 completed: 100000/500000 records migrated (20.00%)"
// ... (continues from where it left off)
Migration Output
The migration provides detailed progress information:
[Information] Starting migration from single-table to dual-table audit logging...
[Information] Batch size: 100000
[Information] Step 1/6: Validating preconditions...
[Information] Preconditions validated successfully.
[Information] Step 2/6: Creating migration state table...
[Information] Migration state table created successfully.
[Information] Step 3/6: Creating AuditDetail table...
[Information] AuditDetail table created successfully with indexes and foreign keys.
[Information] Step 4/6: Migrating audit data in batches...
[Information] Resuming migration from AuditId: 0
[Information] Total records already migrated: 0
[Information] Total records remaining to migrate: 1543892
[Information] Batch 1 completed: 100000/1543892 records migrated (6.48%) in 12.34s
[Information] Batch 2 completed: 200000/1543892 records migrated (12.96%) in 11.87s
...
[Information] Data migration completed. Total records migrated: 1543892
[Information] Step 5/6: Updating audit triggers...
[Information] Updated 47 trigger scripts to dual-table mode.
[Information] Step 6/6: Dropping obsolete columns...
[Information] Dropping 3 obsolete columns: FieldName, OldValue, NewValue
[Information] Obsolete columns dropped successfully.
[Information] Migration state marked as complete.
[Information] Migration completed successfully!
Post-Migration Steps
After successful migration:
- Update DbContext: Switch from `AuditSingleTable` to `AuditDualTable`
// Before migration
public DbSet<AuditSingleTable> Audit { get; set; }
// After migration
public DbSet<AuditDualTable> Audit { get; set; }
public DbSet<AuditDetail> AuditDetail { get; set; }
- Update Entity Configurations: Use dual-table configurations
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
// Remove single-table configuration
// modelBuilder.ApplyConfiguration(new AuditSingleTableConfiguration(auditSchema, auditTable));
// Add dual-table configurations
modelBuilder.ApplyConfiguration(new AuditDualTableConfiguration(auditSchema, auditTable));
modelBuilder.ApplyConfiguration(new AuditDetailConfiguration(auditSchema, auditDetailTable));
}
- Verify Migration: Query both tables to ensure data integrity
// Verify audit event counts match
var auditCount = await dbContext.Audit.CountAsync();
var detailCount = await dbContext.AuditDetail.CountAsync();
Console.WriteLine($"Audit events: {auditCount}");
Console.WriteLine($"Audit details: {detailCount}");
- Clean Up: Optionally drop the migration state table
-- After verifying successful migration
DROP TABLE [dbo].[__AuditMigrationState];
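Before dropping the state table, a quick referential check can confirm that every `AuditDetail` row still resolves to an `Audit` row (assuming the key column is named `AuditId`, as in the migration logs):

```sql
-- Expect 0: AuditDetail rows whose parent Audit row is missing
SELECT COUNT(*) AS OrphanDetails
FROM dbo.AuditDetail AS d
LEFT JOIN dbo.Audit AS a ON a.AuditId = d.AuditId
WHERE a.AuditId IS NULL;
```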
Troubleshooting
Migration fails with "Audit table does not exist"
- Verify the schema and table name in `AuditConfig` match your database
Migration fails with "already been completed"
- This is expected if you re-run migration on an already-migrated database
- The migration is idempotent and safe to re-run
Migration is slow (> 1 hour for 10M records)
- Reduce batch size to minimize memory usage: `batchSize: 50_000`
- Ensure the database has adequate resources (CPU, memory, disk I/O)
- Run during off-peak hours
- Consider temporarily disabling non-essential indexes
Migration interrupted and won't resume
- Check the `__AuditMigrationState` table for the last migrated `AuditId`
- Manually delete the state table row to restart from the beginning if needed
Performance Expectations
Typical migration performance on SQL Server:
| Record Count | Batch Size | Estimated Time | Notes |
|---|---|---|---|
| 1M records | 100K | 5-10 minutes | Standard configuration |
| 10M records | 100K | 30-60 minutes | May vary by server specs |
| 100M records | 100K | 5-10 hours | Run during maintenance window |
| 500M+ records | 50K | 24+ hours | Consider batch size tuning |
Optimization Tips:
- Use SSDs for faster I/O
- Increase batch size on powerful servers
- Run during off-peak hours
- Monitor `tempdb` usage
TODO
- Refactor `AuditScriptCreator` to implement an interface and register it as a service for improved testability and dependency injection support.
Contributing
Contributions and feedback are welcome. Please open issues or submit pull requests for improvements.
License
Copyright © 2019 Centeva