
I was wondering what the best practice is regarding databases for integration tests on the build server. If there is a best practice.

Currently, our build will create a new database from scratch for every build. Integration tests are then performed against this.

However, as the project has a long history and continues to grow, the number of migration scripts (using EF Code First migrations) keeps increasing. This is starting to slow down the build.

One solution would be to no longer recreate the database for every build. A possible downside is that changes pushed to the (Git) repository but deemed wrong afterwards would have to be reverted in the database manually (or could even fail the build).

Another solution is to restore a backup of the database taken at a known point and run only the subsequent migrations each time. All we need to do then is update the reference point regularly. But that, again, is manual work.
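The restore-then-migrate idea boils down to "apply only what the backup doesn't already contain." A minimal sketch of that selection step (the migration names are hypothetical, and in practice EF's `__MigrationHistory` table would be the source of truth rather than a plain list):

```python
def pending_migrations(all_migrations, backup_reference_point):
    """Return the migrations that still need to run after restoring
    a backup taken at `backup_reference_point`.

    `all_migrations` is the full, ordered list of migration names.
    """
    try:
        index = all_migrations.index(backup_reference_point)
    except ValueError:
        # Unknown reference point: fall back to running everything,
        # i.e. the same as building the database from scratch.
        return list(all_migrations)
    # Only the migrations created after the backup still need to run.
    return all_migrations[index + 1:]


migrations = ["201601_Initial", "201605_AddOrders", "201610_AddIndex"]
pending_migrations(migrations, "201605_AddOrders")  # ["201610_AddIndex"]
```

Updating the reference point then just means taking a fresh backup and recording the latest migration name it contains.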

Actually, this is how we're currently doing it, but I wondered if there were any other or better strategies.

Peter
  • Can you carry an .mdf file (assuming SQL Server) as an asset with your integration test data and attach that? You'd have to maintain the .mdf database accordingly, but it would help with the build speed (not much different from what you're currently doing, but a bit more isolated) – jleach Oct 27 '16 at 10:02

1 Answer


Take the current version of the database as the new start point. When the schema changes later on, your test setup will run only the migration scripts created after that start point. That saves some time while not requiring too many manual steps.

Alternatively, time-consuming tests may run with nightly builds only. Feedback for those tests will be slow, but the "normal" build and test cycle will stay fast. Look into your test failure statistics to find out which is likely more important.
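Splitting fast and slow tests usually comes down to a category or guard that the nightly job enables. In .NET the idiomatic way is NUnit/MSTest categories; as a language-neutral sketch, here is a Python `unittest` version where the `NIGHTLY_BUILD` environment variable is an assumed convention set by the nightly job:

```python
import os
import unittest

# Assumed convention: the nightly CI job sets NIGHTLY_BUILD=1.
IS_NIGHTLY = os.environ.get("NIGHTLY_BUILD") == "1"

# Decorator that skips time-consuming tests outside the nightly build.
nightly_only = unittest.skipUnless(IS_NIGHTLY, "runs in the nightly build only")


class OrderImportTests(unittest.TestCase):
    def test_validation_is_fast(self):
        # Runs on every build.
        self.assertTrue(True)

    @nightly_only
    def test_full_database_migration(self):
        # Slow integration test: only exercised by the nightly build.
        self.assertTrue(True)
```

On a regular build the slow test is reported as skipped, so the suite stays green and fast; the nightly job runs everything.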

Bernhard Hiller