
.NET Core MsSQL integration-testing, one wrong way and one right way

Morten Nørgaard, Senior Software Developer, Solita

Published 29 Dec 2023

Reading time 6 min

I’ve encountered many approaches to integration testing across the various MsSQL projects I’ve worked on, for various organisations. It’s a topic that comfortably lends itself to debate: when can an integration test truly be deemed worthy of that moniker?

For some, the baseline is a full-fledged image of the production database; others prefer the ease of a simple LocalDB instance or an SQLite in-memory database. Microsoft has some pointers on several different approaches, worth checking out.

For the purpose of this article, however, let’s define what constitutes an integration test: a test that is run against the same database executable as runs in the production environment. And let’s implement such a test, without activating cloud or on-premise resources, and without installing any local MsSQL Developer (or other) edition.

All of the demonstrated code is available from the GitHub repository above.

We’ll be integration-testing this repository…

… where ‘ShopContext’ is a standard Entity Framework Core DbContext implementation.
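In outline, the pair might look something like this – a sketch only, with the full listing in the GitHub repository; names like CustomerRepository and InsertCustomerAsync are illustrative:

```csharp
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class ShopContext : DbContext
{
    public ShopContext(DbContextOptions<ShopContext> options) : base(options) { }

    public DbSet<Customer> Customers => Set<Customer>();
}

public class CustomerRepository
{
    private readonly ShopContext _context;

    public CustomerRepository(ShopContext context) => _context = context;

    public async Task InsertCustomerAsync(Customer customer)
    {
        // Straightforward add-and-save; the interesting part is not the
        // repository itself but what it is tested against.
        _context.Customers.Add(customer);
        await _context.SaveChangesAsync();
    }
}
```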

First things first, let’s make the case for integration-testing against the same database executable, i.e. implement an integration test that only seemingly works but in reality hides an issue. We’ll use SQLite, which is not uncommonly used for integration-testing purposes. Here’s a test implementation:
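A minimal sketch, assuming MSTest and the repository surface from above:

```csharp
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerRepositorySqliteTests
{
    private SqliteConnection _connection = default!;
    private ShopContext _context = default!;

    [TestInitialize]
    public void SetupTest()
    {
        // SQLite drops an in-memory database once its last connection closes,
        // so the connection is held open for the duration of the test.
        _connection = new SqliteConnection("DataSource=:memory:");
        _connection.Open();

        var options = new DbContextOptionsBuilder<ShopContext>()
            .UseSqlite(_connection)
            .Options;

        _context = new ShopContext(options);
        _context.Database.EnsureCreated();
    }

    [TestCleanup]
    public void CleanupTest()
    {
        _context.Dispose();
        _connection.Close();
        _connection.Dispose();
    }

    [TestMethod]
    public async Task TestInsertCustomer()
    {
        var repository = new CustomerRepository(_context);

        await repository.InsertCustomerAsync(
            new Customer { Name = "This name is decidedly longer than thirty characters" });

        Assert.AreEqual(1, _context.Customers.Count());
    }
}
```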

In the above test, we construct the Entity Framework context as required for the CustomerRepository, via the ‘UseSqlite’ extension method and a suitable ‘in-memory’ connection string. When used in this fashion, SQLite drops the database entirely when the connection is no longer available, wherefore we open that connection in SetupTest() and close and dispose it in CleanupTest(). TestInsertCustomer() will then happily execute:

However, all is not as rosy as it would appear. The customer model appears thus:
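A sketch consistent with the behaviour described below – the attribute is the important part:

```csharp
using System.ComponentModel.DataAnnotations;

public class Customer
{
    public int Id { get; set; }

    // Translates to nvarchar(30) when the schema is created on SQL Server.
    [MaxLength(30)]
    public string Name { get; set; } = string.Empty;
}
```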

I.e. the MaxLength attribute imposes a limit of 30 characters on the Name property, but in our SQLite-based test we venture to add a bit more than that:
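This is the offending call from the sketch above:

```csharp
// 52 characters – well past the 30-character MaxLength:
await repository.InsertCustomerAsync(
    new Customer { Name = "This name is decidedly longer than thirty characters" });
```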

The test runs just fine – but it should not. It should fail. SQLite simply doesn’t enforce declared length limits on text columns, so the over-long name sails straight through. If we relied solely on SQLite for integration tests, we’d potentially, on the basis of the above, release an unreliable repository into production. So here’s certainly one argument for always integration-testing against the same database executable as runs on the environment we’re testing; an MsSQL server would never allow the above, as we shall soon prove by running a similar test against a ‘real’ MsSQL server in the form of a containerised version. Two resources are needed: an official Microsoft SQL Server container image and the Testcontainers for .NET NuGet package, specifically the MsSQL variant:
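The image is pulled automatically at test time; the package is added as usual:

```shell
dotnet add package Testcontainers.MsSql
```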

For the unfamiliar, Testcontainers does away with a lot of the tedious setup otherwise involved in these matters. Given an available OCI-compatible (“Open Container Initiative”) container runtime on your system, be it Docker or Podman or similar, you will be able to just retrieve an image and run your integration test against a container off of it.

Podman offers an open-source and license-free convenience alternative to Docker that appeals to yours truly, so that’s what I’ll demonstrate here. You could certainly, and just as easily, use Docker.

Let’s implement a different integration test, one that utilises the above-mentioned NuGet package:
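Again a sketch rather than the verbatim listing; MsSqlBuilder and MsSqlContainer come from the package, while the rest mirrors the earlier test:

```csharp
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Testcontainers.MsSql;

[TestClass]
public class CustomerRepositoryMsSqlTests
{
    private MsSqlContainer _container = default!;
    private ShopContext _context = default!;

    [TestInitialize]
    public async Task SetupTest()
    {
        // Builds and starts a disposable SQL Server instance off the
        // official container image.
        _container = new MsSqlBuilder().Build();
        await _container.StartAsync();

        var connectionString = _container.GetConnectionString();
        System.Diagnostics.Debug.WriteLine(connectionString);

        var options = new DbContextOptionsBuilder<ShopContext>()
            .UseSqlServer(connectionString)
            .Options;

        _context = new ShopContext(options);
        await _context.Database.EnsureCreatedAsync();
    }

    [TestCleanup]
    public async Task CleanupTest()
    {
        await _context.DisposeAsync();
        await _container.DisposeAsync();
    }

    [TestMethod]
    public async Task TestInsertCustomer()
    {
        var repository = new CustomerRepository(_context);

        // The very same over-long name that sailed through SQLite:
        await repository.InsertCustomerAsync(
            new Customer { Name = "This name is decidedly longer than thirty characters" });
    }
}
```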

In the above, we initialise and build the MsSqlContainer, and put in a call to StartAsync() to render it useful to the test. That could take a while: effectively, the NuGet package endeavours to retrieve the latest official MsSQL server container image, as well as a Testcontainers-specific image that builds on the first-mentioned, as is evident from a glance at Podman Desktop:

You’ll only take the ‘bandwidth hit’ if those images are not already in your local image directory, i.e. subsequent tests will be much faster to crank up. Note the call we put into System.Diagnostics.Debug.WriteLine(connectionString), which offers up the connection string in the output window:

When your test runs you can access the database from any tool you’re used to, such as SQL Server Management Studio:

As we run the integration test, Testcontainers will spin up containers from the images and – blink and you’ll miss it – run the test against them, then dismantle them afterwards:

And thus we arrive at the crux of the matter. The integration test that worked so well with SQLite seems now sadly perturbed:

As is proper, the MsSQL server throws an exception to let us know we’re about to truncate valuable data. This, for me, is the coup de grâce to integration tests that aren’t run against the same executable as runs on the environment that’s meant to be tested; we risk getting away with antics we should never allow ourselves to get away with.

Is this, then, the proper way to do integration tests? Not exclusively, no. Different organisations harbour different requirements, and we can only say with some certainty that the above is, all things being equal, better than testing against arbitrary database executables. Complexities lurk in the shadows: can you say for certain that your integration tests work as intended if they aren’t run against a data source with the same volume or throughput of data as the production database? And how does an integration test adopt the same security measures? Be prepared to venture beyond the simplicity of the above.

I’d be remiss if I did not demonstrate how to run these kinds of integration tests against a build server. Much akin to local development, the build server need only have access to an OCI-capable runtime. For my DevOps environment of choice – Azure – it’s very easy indeed. Adding a dotnet test task to an otherwise utterly generic build pipeline is all there is to it:
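A sketch of the relevant step; the projects glob is a placeholder, and the surrounding pipeline is assumed to be the stock .NET template:

```yaml
steps:
  - task: DotNetCoreCLI@2
    displayName: 'Run tests'
    inputs:
      command: 'test'
      projects: '**/*Tests.csproj'
```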

This will, when executed on a publicly hosted Azure pipeline, automatically route container-based integration tests to an internal Docker runtime. At first glance, we can only tell as much from the extended duration of the MsSQL integration test – roughly half a minute, as opposed to the in-memory SQLite test, which clocks in at around half a second:

You could, for verification purposes, elect to download the full test report, and open it in Visual Studio or similar:

And from this report, we learn about the internal Docker use. It hardly gets any easier than that. 

We won’t always deal with public build agents; with internally hosted ones it should, however, suffice to add a Docker CLI runtime instead:
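A sketch using the Docker CLI installer task that Azure DevOps ships; the pinned version is arbitrary:

```yaml
steps:
  - task: DockerInstaller@0
    displayName: 'Install Docker CLI'
    inputs:
      dockerVersion: '17.09.0-ce'
```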

One caveat deserves a mention: running one test in a dedicated container instance is trivial, but running several hundred or even thousand will be highly taxing to the pipeline, as it will in all probability struggle to set up and tear down the required resources. A better approach would be to refresh the database within the container, as opposed to the entire container. Below is a representation of an abstract base class from which individual integration tests can inherit to accomplish just that:
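A sketch, again assuming MSTest; the Database=master replacement presumes the default Testcontainers connection string, and EnsureCreated()/EnsureDeleted() stand in for whatever schema strategy the project actually uses:

```csharp
using System;
using Microsoft.EntityFrameworkCore;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Testcontainers.MsSql;

public abstract class IntegrationTestBase
{
    // One container for the entire test run, started once from the static
    // constructor (blocking deliberately, as constructors cannot await).
    private static readonly MsSqlContainer Container;

    protected ShopContext Context = default!;

    static IntegrationTestBase()
    {
        Container = new MsSqlBuilder().Build();
        Container.StartAsync().GetAwaiter().GetResult();
    }

    [TestInitialize]
    public void TestSetup()
    {
        // A fresh, uniquely named database per test within the shared container.
        var databaseName = $"ShopDb_{Guid.NewGuid():N}";
        var connectionString = Container.GetConnectionString()
            .Replace("Database=master", $"Database={databaseName}");

        var options = new DbContextOptionsBuilder<ShopContext>()
            .UseSqlServer(connectionString)
            .Options;

        Context = new ShopContext(options);
        Context.Database.EnsureCreated();
    }

    [TestCleanup]
    public void TestCleanup()
    {
        Context.Database.EnsureDeleted();
        Context.Dispose();
    }
}
```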

The static constructor initialises one test container, and TestSetup() and TestCleanup() create ShopContext objects, instantiated with unique connection strings, for each individual test. Below is an example of usage:
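A usage sketch; the hundred customers are as arbitrary a number as any:

```csharp
using System.Linq;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerRepositoryBulkTests : IntegrationTestBase
{
    [TestMethod]
    public async Task TestInsertManyCustomers()
    {
        var repository = new CustomerRepository(Context);

        // Insert a batch of customers into this test's own database.
        foreach (var i in Enumerable.Range(1, 100))
        {
            await repository.InsertCustomerAsync(
                new Customer { Name = $"Customer {i}" });
        }

        Assert.AreEqual(100, Context.Customers.Count());
    }
}
```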

Here we create and insert an arbitrary number of customers. The test class inherits from IntegrationTestBase, wherefore each test runs against a separate database within the same container instance. From the test output of the build pipeline, we can tell that only the first of the bunch took ‘the hit’ of having to fabricate the MsSQL instance – the subsequent tests piggy-backed off the hard work of the first one:

In conclusion, we made an integration test run successfully against a different database executable than the one we should have used. We then made the exact same test run against a proper MsSQL database running in a container and thus – and only thus – came across a malfunction that our integration test had hitherto hidden from us. Let’s not dwell on the particulars; the sample code is solely meant to demonstrate potential pitfalls, nothing more, nothing less, and I’m certainly aware of the intricacies of validation.

The point is this: by running integration tests against the right kind of database, we learned up front about an issue we might otherwise have been mightily sorry to discover down the line.

These are low-hanging fruits; they used to be harder to pick, but with today’s tooling there are few excuses left.