2

Had an interesting discussion with our architect about replacing a plain DLL reference with a NuGet package. His worry was: "Since a single NuGet package can add multiple DLL references, package authors can decide to add a new DLL to the package. So when we update the NuGet package, a new DLL is added, but our installer won't know about it, so it won't include it in the installation. And this problem will be revealed only when testers get to test the installed build of the software." In his view, this would be a reason not to use NuGet, since using plain binary references makes it obvious when a new DLL is added.

My stance on this problem is that the chance of this happening is too small to bother with: NuGet package authors would consider this a breaking change and would only make it in a major release. And mitigating this risk should not mean avoiding NuGet, but creating test automation that stresses our installed software.

The question is: what is the risk of the above happening? Of a NuGet package adding a new DLL as a project reference in a non-major release?

Euphoric
    Does this NuGet package come from one of your own NuGet servers, developed by parts of your own team or a different team within your organization, or are you thinking of NuGet references to some free software developed somewhere on the internet, where you don't have the development under control? – Doc Brown Jan 23 '20 at 06:50
  • Primarily public package. It was Microsoft package this time. But it is good point that who makes the package can have influence on this risk. But how big? – Euphoric Jan 23 '20 at 06:52
  • The risk depends on how you create your installer. Many automatically detect dependencies from NuGet packages. It’s actually _riskier_ to use raw copying of dlls IMO. – RubberDuck Jan 25 '20 at 22:26
  • @RubberDuck Oh? I've never seen such an installer. Can you point me to one that does it and isn't ClickOnce-tier of bad? – Euphoric Jan 25 '20 at 23:37
    We’re using Wax, which makes Wix easier to use @Euphoric – RubberDuck Jan 25 '20 at 23:38

2 Answers

2

Whenever you include 3rd party components in your own software system, updates for those components can break your system, whether those updates include a new DLL or not.

This happens even if the authors of those components act in good faith and try to be backwards compatible within one version line. It is not a hypothetical situation - software has bugs, and sometimes those bugs only reveal themselves in your specific environment.

So I would strongly recommend against configuring your build process to always automatically download the "latest and greatest" version of 3rd party components from somewhere on the internet, letting updates enter your system in an uncontrolled manner. If you want to use NuGet in a safe way, you have to make sure it always downloads a specific version of each 3rd party component.
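With the PackageReference format, for instance, NuGet's bracket notation pins a dependency to one exact version. A minimal sketch (the package name and version number here are placeholders, not from the question):

```xml
<!-- "[1.2.3]" means exactly version 1.2.3; NuGet will not silently
     resolve a newer version during restore. The package name and
     version are illustrative only. -->
<ItemGroup>
  <PackageReference Include="Some.Vendor.Library" Version="[1.2.3]" />
</ItemGroup>
```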

If you want to update to a newer version, someone on your team

  • explicitly sets which version it will be

  • initiates a suitable QA process (ideally with lots of test automation, of course, but also some reviews, and some information to everyone who might be affected by the update)

I am not an expert on NuGet, but AFAIK the safest way of doing this is to set up your own private NuGet server and provide exactly the versions of the 3rd party components you are going to use for your system, nothing else.
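A nuget.config restricted to such a curated feed might look like this (a sketch; the feed name and URL are placeholders, assuming a private v3 feed exists):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- <clear /> removes inherited sources (such as nuget.org),
         so only the internal feed below is consulted. The key and
         URL are illustrative. -->
    <clear />
    <add key="internal-feed" value="https://nuget.example.internal/v3/index.json" />
  </packageSources>
</configuration>
```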

To some degree you have to rely on 3rd party software and its updates, like the .NET Framework from Microsoft itself (especially since Microsoft started to deploy in-place updates to the 4.x version line directly on end users' systems). But I would not conclude that this is also a good approach for arbitrary 3rd party components.

Hence, whenever you are able to keep the updates of components under your own control, I would recommend doing so.

Doc Brown
  • Thanks for the insights. My problem was less about NuGet grabbing the latest version (it doesn't; updating always requires an explicit command). The problem is mainly about developers updating to a new version and not noticing a new DLL was added. My concern is primarily what the probability of such a risk is. I agree that the best mitigation is to have automated tests, but our current project doesn't have those (and probably never will). – Euphoric Jan 23 '20 at 08:12
  • "setting up your own, private NuGet server and providing exactly the versions of the 3rd party components you are going to use for your system, nothing else." +9000 – Ewan Jan 23 '20 at 10:54
  • @Euphoric: if any dev on your team can update any 3rd party component **without following a strict QA policy**, then you have a far bigger problem than just forgotten DLLs. And this has nothing to do with NuGet. – Doc Brown Jan 23 '20 at 20:11
0

What is risk of the above happening?

The probability certainly is not 0.

However, the issue also goes somewhat deeper: any dependency in the dependency hierarchy may add references/packages. Even when this does happen, it may not warrant a major release if the functionality in the added package is not a breaking change.

The best approach, as you have stated, is to mitigate the risk of missed assemblies by performing sanity checks on the installed/deployed software.
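Such a sanity check can be as simple as diffing the DLLs in the build output against the file list the installer packages. A minimal sketch (the directory layout and the one-file-name-per-line manifest format are assumptions, not any specific installer's API):

```python
from pathlib import Path

def find_unpackaged_dlls(build_dir, manifest_path):
    """Return names of DLLs present in build_dir but absent from the
    installer manifest (assumed to list one file name per line)."""
    built = {p.name for p in Path(build_dir).glob("*.dll")}
    packaged = {
        line.strip()
        for line in Path(manifest_path).read_text().splitlines()
        if line.strip()
    }
    return sorted(built - packaged)

# Example usage (paths are placeholders):
#   find_unpackaged_dlls("bin/Release", "installer-manifest.txt")
# A non-empty result means a package update pulled in a DLL the
# installer does not know about, and the build should fail loudly.
```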

Eben Roux
  • If a package requires another package, NuGet automatically adds this to your packages.config or (packages.json?) file for .NET Core projects. The dependency hierarchy should be obvious from those files. – Greg Burghardt Jan 23 '20 at 23:39
  • @GregBurghardt: Agreed. I'm making the point that an assembly may be added at any level. Nuget will resolve them but if a build/deployment process does not have a mechanism to deal with arbitrarily added assemblies it may still lead to problems. The hierarchy, when inspected, should be obvious. – Eben Roux Jan 24 '20 at 03:45