Auditing with EF Core and SQL Server – Part 2: Triggers, session context and dependency injection

In part 1 I looked at using temporal tables to cover auditing requirements. Although a great feature, it did not quite cover the scenario I was after. I wanted a solution that did not impact my .NET code, or at least kept the necessary changes as small as possible.

So I decided to use a SQL feature I tend to avoid. Triggers. There, I’ll say it again: triggers.

They are very powerful, but can be misused. Certainly in legacy systems where people went all out with them, you can spend quite some time figuring out where your SQL statements are actually processed, or why your delete is doing an update. But for my current use case they seemed to be the right fit: it’s a feature of the datastore and it keeps all logic out of my code.

So first I created an AuditInfo class in my codebase so I could generate a migration with the create table statement and, should I need it, be able to query it in code.

public class AuditInfo
{
    public long Id { get; set; }
    public string SchemaName { get; set; }
    public string TableName { get; set; }
    public long RowId { get; set; }
    public DateTime ActionDate { get; set; }
    public string Action { get; set; }
    public string Values { get; set; }
    public string ModifiedBy { get; set; }
}

Short and simple. In my current project I gave every entity an Id, even where I could have used a composite key. It just makes things a lot easier and means I can get the history of a specific row in an easy way. I also created an extension method to add an audit trigger to every entity.

public static class EnableAuditTriggerExtensions
{
	public static void EnableAuditTrigger(this MigrationBuilder migrationBuilder, IEntityType entityType)
	{
		var tableName = entityType.Relational().TableName;
		var schemaName = entityType.Relational().Schema ?? "dbo";
		migrationBuilder.Sql($@"
IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[{schemaName}].[{tableName}_Audit]'))
DROP TRIGGER [{schemaName}].[{tableName}_Audit]");
		migrationBuilder.Sql($@"
CREATE TRIGGER [{schemaName}].[{tableName}_Audit] ON [{schemaName}].[{tableName}]
AFTER INSERT, DELETE, UPDATE
AS 
BEGIN
 
SET NOCOUNT ON;
DECLARE @Action AS CHAR(1)
DECLARE @Id AS BIGINT
DECLARE @Values AS NVARCHAR(MAX)
SET @Action = NULL
SET @Id = NULL
SET @Values = NULL
 
IF EXISTS (SELECT * FROM inserted)
BEGIN
	IF EXISTS(SELECT * FROM deleted)
	BEGIN
		SET @Action = 'U'
	END
	ELSE
	BEGIN
		SET @Action = 'I'
	END
END
ELSE IF EXISTS(SELECT * FROM deleted)
BEGIN
	SET @Action = 'D'
END
IF @Action = 'D'
BEGIN
	INSERT INTO dbo.AuditInfo (
		SchemaName,
		TableName,
		RowId,
                ActionDate,
		ModifiedBy,
		[Action],
		[Values])
		SELECT
		'{schemaName}',
		'{tableName}',
		d.Id,
                SYSUTCDATETIME(),
		(SELECT CONVERT(nvarchar(250), SESSION_CONTEXT(N'CurrentApplicationUser'))),
		@Action,
		(SELECT * FROM deleted vals where vals.Id = d.Id FOR JSON PATH)
		FROM deleted d
END
ELSE
BEGIN
	INSERT INTO dbo.AuditInfo (
		SchemaName,
		TableName,
		RowId,
                ActionDate,
		ModifiedBy,
		[Action],
		[Values])
		SELECT
		'{schemaName}',
		'{tableName}',
		i.Id,
                SYSUTCDATETIME(),
		(SELECT CONVERT(nvarchar(250), SESSION_CONTEXT(N'CurrentApplicationUser'))),
		@Action,
		(SELECT * FROM inserted vals where vals.Id = i.Id FOR JSON PATH)
		FROM inserted i
END
END");
	}
}

Since a trigger gives you access to the deleted and inserted rows, I can inspect them and insert the correct values into my audit table. It’s not that complicated.

You can also see that I’m using “SESSION_CONTEXT”. All users connect to the database with the same connection string, since I want to use connection pooling, but I want to log the correct application user along with the changes they made. There are several ways to solve this, but “SESSION_CONTEXT” gives me an easy way to share information between my application and its database. It is a new feature in SQL Server 2016 and can also be used with SQL Azure. SESSION_CONTEXT allows you to associate key-value pairs with your current SqlConnection.
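As a quick sketch of how SESSION_CONTEXT behaves on its own (the user name is just an example value):

```sql
-- Store a key-value pair for the current session.
-- With @read_only = 1 the key cannot be overwritten for the rest of the session.
EXEC sp_set_session_context
    @key = N'CurrentApplicationUser',
    @value = N'alice@example.com',
    @read_only = 1;

-- Read it back, e.g. from inside a trigger. SESSION_CONTEXT returns sql_variant,
-- hence the CONVERT.
SELECT CONVERT(nvarchar(250), SESSION_CONTEXT(N'CurrentApplicationUser')) AS CurrentUser;
```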

Won’t this fail because the same physical connection is reused for multiple users? No, because if you review a trace of what happens on your SQL Server you’ll notice that “sp_reset_connection” gets called when a pooled connection is handed out again. This does a couple of things, including clearing whatever is in the SESSION_CONTEXT. I tried hard to find this being explicitly mentioned on MSDN but could not find any reference; this article proved valuable in clearing up some of the underlying functionality. The stored procedure also clears “CONTEXT_INFO”, which I could have used as well, but it is a bit more cumbersome to work with.

The migration itself is then pretty straightforward. Again, the same caveat as before applies: with this approach, many-to-many associations need to be modelled as explicit entities, since I iterate over all entities to add the trigger to the underlying table.

public partial class LogWithTrigger : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        foreach (var entity in TargetModel.GetEntityTypes())
        {
            if (entity.Name != typeof(AuditInfo).FullName)
            {
                migrationBuilder.EnableAuditTrigger(entity);
            }
        }
    }
}

So what remained was to actually call the stored procedure to set the key (‘CurrentApplicationUser’) that’s being referenced in my trigger to the current username. I had to dig a bit into how Entity Framework works to get this running, as I needed control over the actual connection. I could just create a SqlConnection and give that to the DbContext; if you do this, Entity Framework will simply use the connection you supply. However, I wanted to use dependency injection, so I subclassed SqlServerConnection to add the additional behaviour.

This is the version for EFCore < 2.0:

public class PrincipalAwareConnection
    : SqlServerConnection
{
    private IPrincipal principal;
 
    public PrincipalAwareConnection(IDbContextOptions options, ILogger<SqlServerConnection> logger, IPrincipal principal)
        : base(options, logger)
    {
        this.principal = principal;
    }
 
    public override async Task OpenAsync(CancellationToken cancellationToken = default(CancellationToken))
    {
        bool setSessionState = this.DbConnection.State != ConnectionState.Open;
        await base.OpenAsync(cancellationToken);
        if (setSessionState)
        {
            SetSessionInfo();
        }
    }
 
    private void SetSessionInfo()
    {
        using (var cmd = base.DbConnection.CreateCommand())
        {
            cmd.Transaction = base.CurrentTransaction?.GetDbTransaction();
            cmd.CommandType = System.Data.CommandType.StoredProcedure;
            cmd.CommandText = "sp_set_session_context";
            var keyParameter = cmd.CreateParameter();
            cmd.Parameters.Add(keyParameter);
            keyParameter.ParameterName = "key";
            keyParameter.Value = "CurrentApplicationUser";
            var valueParameter = cmd.CreateParameter();
            cmd.Parameters.Add(valueParameter);
            valueParameter.ParameterName = "value";
            valueParameter.Value = principal.Identity.Name;
            var readOnlyParameter = cmd.CreateParameter();
            cmd.Parameters.Add(readOnlyParameter);
            readOnlyParameter.ParameterName = "read_only";
            readOnlyParameter.Value = 1;
            cmd.ExecuteNonQuery();
        }
    }
 
    public override void Open()
    {
        bool setSessionState = this.DbConnection.State != ConnectionState.Open;
        base.Open();
        if (setSessionState)
        {
            SetSessionInfo();
        }
    }
}

This is the version for EFCore >= 2.0:

public class PrincipalAwareConnection
    : SqlServerConnection
{
    private IPrincipal principal;
 
    public PrincipalAwareConnection(RelationalConnectionDependencies dependencies, IPrincipal principal)
        : base(dependencies)
    {
        this.principal = principal;
    }
 
    public override async Task<bool> OpenAsync(CancellationToken cancellationToken, bool errorsExpected = false)
    {
        bool setSessionState = this.DbConnection.State != ConnectionState.Open;
        bool opened = await base.OpenAsync(cancellationToken, errorsExpected);
        if (setSessionState)
        {
            SetSessionInfo();
        }
        return opened;
    }
 
    private void SetSessionInfo()
    {
        using (var cmd = base.DbConnection.CreateCommand())
        {
            cmd.Transaction = base.CurrentTransaction?.GetDbTransaction();
            cmd.CommandType = System.Data.CommandType.StoredProcedure;
            cmd.CommandText = "sp_set_session_context";
            var keyParameter = cmd.CreateParameter();
            cmd.Parameters.Add(keyParameter);
            keyParameter.ParameterName = "key";
            keyParameter.Value = "CurrentApplicationUser";
            var valueParameter = cmd.CreateParameter();
            cmd.Parameters.Add(valueParameter);
            valueParameter.ParameterName = "value";
            valueParameter.Value = principal.Identity.Name;
            var readOnlyParameter = cmd.CreateParameter();
            cmd.Parameters.Add(readOnlyParameter);
            readOnlyParameter.ParameterName = "read_only";
            readOnlyParameter.Value = 1;
            cmd.ExecuteNonQuery();
        }
    }
 
    public override bool Open(bool errorsExpected = false)
    {
        bool setSessionState = this.DbConnection.State != ConnectionState.Open;
        var opened = base.Open(errorsExpected);
        if (setSessionState)
        {
            SetSessionInfo();
        }
        return opened;
    }
}

In order to override the default SqlServerConnection and allow for the dependency injection of “IPrincipal” I also had to modify the “ConfigureServices” method in my “Startup.cs” class.

services.AddEntityFrameworkSqlServer();
services.AddTransient<ISqlServerConnection, PrincipalAwareConnection>();
services.AddDbContext<MyContext>((sp, options) =>
    options.UseSqlServer(Configuration.GetConnectionString("MyConnectionString"))
           .UseInternalServiceProvider(sp));

This snippet hooks up Entity Framework with the container you use in your application. Now, whenever a row is added, updated or deleted, the audit table is populated with the correct values. Perhaps I’ll look into building a diff screen, but for now I consider this the best fit for my goals. What do you think?
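Since AuditInfo is part of the model, getting the history of a specific row is a plain LINQ query. A minimal sketch, assuming the context exposes a DbSet&lt;AuditInfo&gt; called AuditInfos (the set name, schema and table names are illustrative):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class AuditQueries
{
    // Fetch the audit trail of a single row, newest change first.
    // "AuditInfos", "dbo" and "Products" are example names for this sketch.
    public static Task<List<AuditInfo>> GetHistoryAsync(MyContext context, long rowId)
    {
        return context.AuditInfos
            .Where(a => a.SchemaName == "dbo"
                     && a.TableName == "Products"
                     && a.RowId == rowId)
            .OrderByDescending(a => a.ActionDate)
            .ToListAsync();
    }
}
```

The Values column contains the JSON snapshot produced by FOR JSON PATH in the trigger, so a diff screen could be built on top of this query later.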

Auditing with EF Core and SQL Server – Part 1: Temporal tables

It’s a requirement in a lot of the projects in which I’m involved: keeping track of which data changed, when the change occurred and, the most important part, who made the change. Apart from tracking down unauthorised changes, it also helps when users claim they didn’t make a change; they tend to forget changes they made. It’s a recurring requirement, but most of the time it stops there, without the need for any further features. I’ve never had to create a diff screen or the like. We just need to keep track of all the changes to be on the safe side.

Most of the time you will see this implemented with an interface your entities have to implement: an IAuditable interface with ModifiedByUser and ModifiedDate. It’s not that hard to implement and there are plenty of examples out there. Entity Framework has all the info you need; you can even get access to the old and new values of the properties and log them as well.

A quick Google search shows you how others are doing it; there are even libraries that keep track of it for you. It’s not that hard to roll your own implementation, and you’ll learn more about EF in the process.

I always consider it to be noise. It is a cross-cutting concern showing up in all your entities, and then there is the additional code in the DbContext which needs to look at the changed entities and their properties. For my current project I wanted to look at it from a different angle and keep everything out of my entities and, if possible, out of the DbContext.

I’m using SQL Azure (or SQL Server 2016) and had read about temporal tables last year; they seemed a perfect fit. SQL Server can automatically record the period during which a row was valid and keep the values from that period in a history table.

When enabling it on a table you need to add two columns for the validity period and the rest is taken care of by SQL Server. After inserting one record in the Products table, it will contain one row and its history table will be empty.

If I then issue an update statement, I get one row in the products table and one in its history table.

So far so good: we get an automatic history log. I was concerned that adding a column to the source table would force me to update the corresponding history table as well, but that too is handled by SQL Server, so migrations just work. There are some restrictions, however: cascading deletes are no longer possible and would need to be handled by the application. You can find more limitations in the documentation.
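To make the history log concrete, here is a sketch of querying a temporal table with the FOR SYSTEM_TIME clause (the Products table and its Name column are illustrative; note the ValidFrom/ValidTo columns are declared HIDDEN, so they must be selected explicitly):

```sql
-- Current rows only; the history table is invisible to normal queries.
SELECT * FROM dbo.Products WHERE Id = 1;

-- The current row plus every historical version, with each version's validity period.
SELECT Id, Name, ValidFrom, ValidTo
FROM dbo.Products
FOR SYSTEM_TIME ALL
WHERE Id = 1
ORDER BY ValidFrom;

-- Or the state of the row as it was at a specific point in time.
SELECT Id, Name
FROM dbo.Products
FOR SYSTEM_TIME AS OF '2017-06-01T12:00:00'
WHERE Id = 1;
```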

To enable temporal tables on all my entities I created a couple of extension methods and added a migration to my code base.

public partial class EnableTemporalTable : Migration
{
    private string schemaName = "Logging";
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.Sql($"CREATE SCHEMA {schemaName}");
        var entityTypes = base.TargetModel.GetEntityTypes();
        foreach (var entityType in entityTypes)
        {
            migrationBuilder.AddAsTemporalTable(entityType, schemaName);
        }
    }
 
    protected override void Down(MigrationBuilder migrationBuilder)
    {
        var entityTypes = base.TargetModel.GetEntityTypes();
        foreach (var entityType in entityTypes)
        {
            migrationBuilder.RemoveAsTemporalTable(entityType, schemaName);
        }
        migrationBuilder.Sql($"DROP SCHEMA {schemaName}");
    }
}

Note that I’m iterating over all entities. If you have a many-to-many relationship, you will have to model it as an association entity for it to gain the temporal features as well.
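As a sketch of that caveat: instead of letting EF create an implicit join table, the link is modelled as an entity with its own Id, so the iteration over GetEntityTypes() picks it up (all type and property names here are illustrative):

```csharp
// Explicit association entity for a many-to-many link between Product and Tag.
// Giving it its own Id also matches the single-key convention used for auditing.
public class ProductTag
{
    public long Id { get; set; }

    public long ProductId { get; set; }
    public Product Product { get; set; }

    public long TagId { get; set; }
    public Tag Tag { get; set; }
}
```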

using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata;
using Microsoft.EntityFrameworkCore.Migrations;
 
public static class EnableTemporalDataOnTable
{
    public static void AddAsTemporalTable(this MigrationBuilder migrationBuilder, IEntityType entityType, string temporalScheme, string temporalTableName)
    {
        var tableName = entityType.Relational().TableName;
        var schemaName = entityType.Relational().Schema ?? "dbo";
        migrationBuilder.Sql($@"
                    IF NOT EXISTS (SELECT * FROM sys.[tables] t INNER JOIN sys.schemas s ON s.schema_id = t.schema_id WHERE t.name = '{tableName}' AND temporal_type = 2 and s.name = '{schemaName}')
                    BEGIN
                        ALTER TABLE {schemaName}.{tableName}   
                        ADD  ValidFrom datetime2 (2) GENERATED ALWAYS AS ROW START HIDDEN    
                                constraint DF_{tableName}_ValidFrom DEFAULT DATEADD(second, -1, SYSUTCDATETIME())  
                            , ValidTo datetime2 (2)  GENERATED ALWAYS AS ROW END HIDDEN     
                                constraint DF_{tableName}_ValidTo DEFAULT '9999-12-31 23:59:59.99'  
                            , PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo);   
 
                        ALTER TABLE {schemaName}.{tableName}    
                        SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = {temporalScheme}.{temporalTableName})); 
                    END
                ");
 
    }
 
    public static void AddAsTemporalTable(this MigrationBuilder migrationBuilder, IEntityType entityType, string temporalScheme)
    {
        var tableName = entityType.Relational().TableName;
        AddAsTemporalTable(migrationBuilder, entityType, temporalScheme, tableName);
    }
 
    public static void RemoveAsTemporalTable(this MigrationBuilder migrationBuilder, IEntityType entityType, string temporalScheme, string temporalTableName)
    {
        var tableName = entityType.Relational().TableName;
        var schemaName = entityType.Relational().Schema ?? "dbo"; 
        // Drop the default constraints and period columns created by AddAsTemporalTable.
        string alterStatement = $@"ALTER TABLE {schemaName}.{tableName} SET (SYSTEM_VERSIONING = OFF);";
        migrationBuilder.Sql(alterStatement);
        alterStatement = $@"ALTER TABLE {schemaName}.{tableName} DROP PERIOD FOR SYSTEM_TIME";
        migrationBuilder.Sql(alterStatement);
        alterStatement = $@"ALTER TABLE {schemaName}.{tableName} DROP CONSTRAINT DF_{tableName}_ValidFrom, DF_{tableName}_ValidTo";
        migrationBuilder.Sql(alterStatement);
        alterStatement = $@"ALTER TABLE {schemaName}.{tableName} DROP COLUMN ValidFrom, ValidTo";
        migrationBuilder.Sql(alterStatement);
        alterStatement = $@"DROP TABLE {temporalScheme}.{temporalTableName}";
        migrationBuilder.Sql(alterStatement);
    }
 
    public static void RemoveAsTemporalTable(this MigrationBuilder migrationBuilder, IEntityType entityType, string temporalScheme)
    {
        var tableName = entityType.Relational().TableName;
        RemoveAsTemporalTable(migrationBuilder, entityType, temporalScheme, tableName);
    }
}

This is largely based on this blog post; I just adapted it to be a bit more generic. If I create a migration for a new table, I can just call the extension method and have it temporal in one go.

But there was still one missing piece: who made the change? I have no property on my entities to store this information, and thus no column in the table or history table to hold it. One of the first versions of the MSDN article introducing temporal tables mentioned that you could track the user, but that reference has since been deleted. So custom code is necessary. I didn’t want to add the property to each of my entities, so I used a new feature in EF Core called shadow properties. Funnily enough, the example on the MSDN site is about auditing.

So I created overrides for the OnModelCreating, SaveChanges and SaveChangesAsync methods of the DbContext, so not completely honouring my initial goal of keeping everything out of my application code.

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    foreach (var entityType in modelBuilder.Model.GetEntityTypes())
    {
        modelBuilder.Entity(entityType.ClrType).Property<string>("ModifiedBy");
    }
}

The SaveChanges and SaveChangesAsync would call into this method and the principal was injected in the constructor of the DbContext.

public void SetModifiedInformation()
{
    var modifiedByEntries = ChangeTracker.Entries()
        .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified || e.State == EntityState.Deleted);
 
    foreach (EntityEntry entry in modifiedByEntries)
    {
        entry.Property("ModifiedBy").CurrentValue = this.principal?.Identity?.Name;
    }
}
 
public override int SaveChanges()
{
    SetModifiedInformation();
    return base.SaveChanges();
}
 
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = new CancellationToken())
{
    SetModifiedInformation();
    return await base.SaveChangesAsync(cancellationToken);
}
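Because ModifiedBy only exists as a shadow property, there is no CLR member to read; access goes through EF.Property in queries, or through the change tracker for entities that are already tracked. A minimal sketch (the Products set and the user name are illustrative):

```csharp
// In a query, shadow properties are addressed with EF.Property<T>.
var changedByAlice = context.Products
    .Where(p => EF.Property<string>(p, "ModifiedBy") == "alice")
    .ToList();

// For a tracked entity, read the current value via the change tracker.
var modifiedBy = context.Entry(product).Property("ModifiedBy").CurrentValue;
```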

This worked, but one scenario is still missing from this solution. We know who created or updated a row, but with a delete we lose the value in the ModifiedBy property: the deleted row is simply copied from the source table to the history table. I considered the ultimate hack, an INSTEAD OF trigger on the table that issues an update and then the delete, but luckily SQL Server prevented me from even trying it: INSTEAD OF triggers are not allowed on a temporal table. I considered adding a separate table to keep track of who deleted a certain row, but this would again introduce more code than I wanted and clutter the codebase. I could introduce soft deletes, so that a delete becomes an update, but that just opens another can of worms.

So I went back to the drawing board to find a better solution, which turned out to be a blast from the past combined with a new feature in SQL Server 2016. But that’s for next time.

Migrating databases with EF Core 2.0 during a VSTS release

A while ago I wrote down how to migrate a database using the default tooling with EF Core 1.1. With 2.0 released last August, it was finally time to upgrade the projects I’m working on to the latest bits. A reader had already reported that the approach used with 1.1 no longer works with EF Core 2.0.

Again I ran the update database command with the verbose option and saw that the command itself has not changed. What has changed, however, is the location of EF.dll. Even when forcing the package restore not to use any caching and explicitly stating where the packages had to be installed, EF.dll just would not show up. It is, however, located (at least on my Mac) at /usr/local/share/dotnet/sdk/NuGetFallbackFolder/microsoft.entityframeworkcore.tools.dotnet/2.0.0/tools/netcoreapp2.0/ef.dll.

I tried for an hour or two to get this file in a zip archive on Windows and on macOS. The easy way out would be to just find the dll and add it to source control. Of course I went for another solution: download the EF.dll during a release and use it to update the databases.

I edited the release definition in VSTS, and in my first attempt I tried to use the package management task which you get when you install this extension. Unfortunately it is not meant to be used to download public packages from the NuGet servers.

My second attempt was to use the Download file task. It was not able to download the NuGet package I needed, giving me a 404 error.

So when everything else fails, PowerShell will fix it. With just a single line I was able to download the NuGet package containing EF.dll.

I used an inline script, but you could just create a script and add it to source control so you can use variables for the package name.

wget "https://www.nuget.org/api/v2/package/Microsoft.EntityFrameworkCore.Tools.DotNet/2.0.0" -outfile "EF.nupkg"

Then I extract all the files: the build output contains one or more zip files, and a nupkg file is also just a zip archive. My sample project contained just one zip.
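For reference, the extraction can be done in the same inline PowerShell script; Expand-Archive treats the .nupkg as a regular zip once it has a .zip extension (the build output file name is illustrative):

```powershell
# A .nupkg is just a zip archive, but Expand-Archive insists on a .zip extension,
# so copy it under a new name first.
Copy-Item "EF.nupkg" "EF.zip"
Expand-Archive -Path "EF.zip" -DestinationPath "."

# Unpack the build output zip next to it as well.
Expand-Archive -Path "migratetest20.zip" -DestinationPath "."
```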

Then I move ef.dll to the location where the assembly containing the DbContext and the migrations resides. The final step is to execute the update command.

mv ./tools/netcoreapp2.0/ef.dll ./
dotnet exec --depsfile migratetest20.deps.json --runtimeconfig migratetest20.runtimeconfig.json ef.dll database update --assembly migratetest20.dll --startup-assembly migratetest20.dll

Migratetest20 was my test project.

Migrating databases with EF Core during a VSTS release

I wanted to migrate databases during a release. The default way of doing this is to run code when the application starts: great for little demo projects, but not when you are releasing a real app to production. When you push a new version to production and the migration fails, the release should fail, alarms should go off, and so on. With the “old” Entity Framework I usually wrapped the migration code in a console application, packaged it alongside the other code and then executed an additional release step to run the migrations. Since I’ve been using .NET Core on my latest projects, I wanted to see if I could use the default tooling.

If you run the update command with the verbose option, you will see that “dotnet exec” is called with “EF.dll” together with a bunch of other options. This is the behaviour I want to replicate during a release.

dotnet ef database update -v

The first thing to figure out is where to find “EF.dll”. It is of course in a NuGet package. By default packages are downloaded into your user’s home folder, and I was not able to figure out a way to just grab it from there on a hosted build agent. However, by modifying the restore command I can specify where the packages need to be downloaded.

dotnet restore --packages /my/location

I then added an additional “Archive files” build step to my build definition. This step creates a zip file and adds it to the build artifacts staging directory. This means it will automatically be picked up with all the other deliverables.

Those are all the necessary changes for the build definition. Now the extra bits in the release definition. I need four additional steps, but it’s also important to select the correct agent. As I’ll be running .NET Core commands I need a VS2017 agent.

The first step is to unarchive the archive that contains the appsettings.json with the connection string; in my case that’s the output of a project which houses the Web API and an Angular app. The other archive we need contains the EF Core bits.

I then need the correct connection string, which is of course a variable in my release environment. The easiest way to replace the connection string in the appsettings.json of the unarchived web application files is to use a VSTS extension. It’s a two-step process: we first tokenize the file so the second step can find the placeholders.

The final step is to run the same command we execute on our development machine. The hard part is figuring out all the parameters. In my case the migrations (and all of the logic) are in a DLL named App.dll, while the web application and the connection string come from another DLL: App.Web.dll.

dotnet exec --depsfile App.Web.deps.json --runtimeconfig App.Web.runtimeconfig.json ef.dll database update --assembly App.dll --startup-assembly App.Web.dll --verbose --root-namespace MyAppNamespace

And that’s it, now the database is upgraded during a release.