
I am looking for advice on how to manage shared code between several projects more effectively.

Currently, we have several applications that share a common backbone of infrastructure code, from simple utilities to wrappers around larger libraries, exposing our preferred, idealised interfaces internally.

We have tried a mixture of solutions for distributing this code among projects, from single large repositories to NuGet packages.

When I last visited this topic, late last year, the community warned me off using NuGet for this, and I would be interested in understanding what the recommended approaches are.

We are currently developing in C#, across a mixture of .NET Core 3.1, .NET Framework 4.6.2, etc. We use TFS 2017 for version control.

At the moment I'm considering Git subtrees and submodules, but I would be really keen to hear what people have found most effective in their experience.

Edits

Based on feedback: as @GregBurghardt has mentioned, I am looking for more conceptual solutions, or for how it has worked for others, rather than a specific tool for the job. I.e. what works for your organisation, and what are the trade-offs of your workflow?

I will admit that I was torn over whether the question sits better here or on Stack Overflow. As I felt it was more conceptual or opinion-based, I felt it would be more at home here (apologies if it's in the wrong place!).

As for NuGet, @DavidArno: on TFS 2017 we have set up a local NuGet repository, which is being used by other projects. I migrated my project into this environment, but after 'living' with the NuGet workflow, it raised numerous issues. The main one is agility: when you want to extend or expose shared functionality quickly, you are pushed into a round trip of rebuilding, pushing, and updating packages. At a low point I found myself updating packages 10-20 times in a day; at a high point, dreading the need to do it even once in the day.

When I discussed the issues I found with my workflow, the prevailing opinion was 'that's not what NuGet is for': don't use NuGet to distribute libraries internally.

It's an opinion I'm now inclined to agree with. Even after building a pipeline that would build, push, and update packages seamlessly, the time overhead was becoming exorbitant. Full rebuilds of my project to debug something were pushing over an hour, versus under 10 minutes in a single solution. Visual Studio and Rider would often throw a 'wobbly' at the frequent package churn and updates during that period.

I appreciate that this raises wider questions over coupling and architecture, but take the example of something like a reflection library that assists with common jobs: there are times when you are growing the library's functional base in tandem with the task you're doing, or discovering a more performant way of solving a problem in your library.

I don't believe the development workflow should get in the way of, or deter, that process.

Obviously, there will be onlookers who have a mature library X or process Y; in those situations, NuGet is probably applicable. But those who are facing fragmented codebases and duplication of common functionality between projects are likely to be in my situation, wanting to bring about better unification.

  • "When visiting this topic late last year, the community had warned me off using Nuget as a method". I'm unsure why that was. I'd expect the top recommendation for you to be to use a private NuGet repository within your organisation. One way of doing that would be to migrate away from your ancient TFS 2017 over to Azure DevOps, as it has NuGet repo support built in (along with many, many enhancements over TFS). – David Arno May 07 '21 at 13:53
  • To those voting to close this question because it is asking for tools or recommendations - I don't believe this is the case. The OP appears to be looking for more conceptual solutions to managing dependencies and libraries. This should be right up our alley. – Greg Burghardt May 07 '21 at 14:11
  • @GregBurghardt, thank you, and correct; I am looking for more conceptual solutions, or how it has worked for others. I will admit that I was torn over whether the question sits better here or on Stack Overflow. As in my reasoning above, I felt it was more conceptual or opinion-based, so I felt it would be more at home here. – Gregory William Bryant May 07 '21 at 14:23
  • @DavidArno, on TFS 2017 we have set up a local NuGet repository, which is being used by other projects. I migrated my project into this environment, but after 'living' with the NuGet workflow it raised numerous issues. The main one is agility: you want to quickly extend or expose shared functionality, and are pushed into a round trip of rebuilding, pushing and updating packages. At a low point, I found myself updating packages 10-20 times in a day; at a high point, dreading the need to do it once in the day. – Gregory William Bryant May 07 '21 at 14:27
  • @GregoryWilliamBryant Would it help if developers used a machine-local NuGet feed for debug/development purposes? I.e. so that you can pull down library code changes from git and rebuild to develop and test against locally, without needing to update via NuGet -- see: https://docs.microsoft.com/en-us/nuget/hosting-packages/local-feeds - the idea being that Visual Studio locally would pull packages from the local feed instead of your team's shared feed. (Local projects may need to point to a fixed version number though, so it's not entirely hassle-free. A sketch of such a configuration follows after these comments.) – Ben Cottrell May 07 '21 at 14:41
  • @BenCottrell This is effectively the pipeline I created, with the exception that the local feed was also dropping the packages on a remote location as well. The main issue for me was that, with the number of packages I was working with (around 20-30 local packages), the approach would get paralysed working up the updated dependency graph. – Gregory William Bryant May 07 '21 at 14:54
  • Further, the next issue would be the manual processes required to force the IDE to ensure it had the latest packages on the build. It resulted in a human-dependent process during development. At one point, it was suggested to hard-reference the NuGet projects 'during development' and then kill off the references to packages once they are stable. This, to me, undermined the development process I was looking for and added further human processes. – Gregory William Bryant May 07 '21 at 14:55
  • Why is this single library developed in tandem with several different applications? – Thorbjørn Ravn Andersen May 09 '21 at 12:10
  • This is starting to look like a blog post. If you have comments, please post them in the comments section, not your question. Users *do not get notified* when you @user them in a question. – Robert Harvey May 09 '21 at 17:58
  • @RobertHarvey Thank you for the meta steer. I felt that some of the responses to the comments also aided in re-contextualising the question, although I will keep that feedback in mind in the future. – Gregory William Bryant May 10 '21 at 08:16
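
For readers unfamiliar with the machine-local feeds Ben Cottrell mentions above, here is a minimal sketch of a NuGet.config that registers a local folder feed alongside a team feed. The feed names, the folder path, and the TFS feed URL are hypothetical placeholders, not anything prescribed by this question:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <!-- Hypothetical machine-local folder feed for packages under active development -->
        <add key="local-dev" value="C:\LocalNuGetFeed" />
        <!-- Hypothetical team feed hosted on the TFS 2017 server -->
        <add key="team-feed" value="http://tfs-server:8080/tfs/DefaultCollection/_packaging/TeamFeed/nuget/v3/index.json" />
      </packageSources>
    </configuration>

NuGet restores from all configured sources, so a freshly packed .nupkg dropped into the local folder can satisfy a restore without a round trip to the shared server.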

1 Answer


Not long ago, I wrote this answer, which sketches the typical process for library reuse among different projects. From that answer, it should be clear that what you are experiencing is

  • mainly a process issue, not a tool issue

  • "Nuget" in your question is just a synonym for "central-server based solution for distributing published releases of a library to other projects" (and using another tool, like some different artifacts repo, or a file server with a lot of home-grown scripts, will most probably bring you the same issues, when you don't change your process)

The root cause for your experience is that you have two opposing goals:

  1. On one hand, your projects want stable and ideally precompiled libs - when you add a dependency from "your" project to a lib, you don't want to get several breaking changes each week because of requirements that other projects push into that lib, and you don't want to compile the lib for your "own" project more often than necessary because of changes you don't need.

  2. On the other hand, your project may want to develop the lib further and push changes into it in an agile manner. You may also want to pick up bug fixes and improvements from other projects which may be useful for yours.

Both goals obviously have their justification. NuGet mainly aims at the first goal, while a "single repo" (maybe together with "Git submodules") aims more at the second.

So how do you bring these two goals together? IMHO the right approach is to differentiate, for each library, when you are in case 1 and when you are in case 2, and to make this explicit in the build process by either

  • publishing and reusing the library via NuGet,

  • or, as long as you want to develop a library in a more agile way, having the lib directly in your project's repo (or maybe as a submodule).

The latter could mean

  • you have a main project where the lib lives "primarily", and where it is under active, agile development. For other projects, stable releases of the lib are published from time to time to the NuGet server.

  • or, you have a second project where the lib shall also be actively developed: then you could use the submodule approach, and make sure you have a process which merges any changes made there back to the main line of the lib from time to time.

  • or, the only "main" project of the library is a project with tests for that lib. Whenever a "using" projects want something to be changed, they have to contribute a test case first, which will added to the test project, used to develop the feature or bug fix for the lib and leads to a new stable release.

You can also switch between the approaches when it makes sense, and use them in parallel depending on what you want for each of your projects. The same lib can be used directly in one project and via NuGet in another; it can be referenced directly first and via NuGet later, or vice versa.
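
To make that switching concrete in .NET terms, a consuming project can toggle between the two modes with a condition in its .csproj. This is only a sketch: the UseLocalLib property, the SharedLib name, the version, and the paths are assumed placeholders, not anything prescribed by NuGet or MSBuild:

    <!-- Sketch only: property name, package id, version and paths are placeholders -->
    <ItemGroup Condition="'$(UseLocalLib)' == 'true'">
      <!-- Agile mode: the lib's sources live in (or are submoduled into) this repo -->
      <ProjectReference Include="..\SharedLib\SharedLib.csproj" />
    </ItemGroup>
    <ItemGroup Condition="'$(UseLocalLib)' != 'true'">
      <!-- Stable mode: consume a published release from the internal NuGet feed -->
      <PackageReference Include="SharedLib" Version="1.4.0" />
    </ItemGroup>

A developer actively working on the lib would build with msbuild /p:UseLocalLib=true, and drop the flag again once the lib stabilises and a release has been published to the feed.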

Doc Brown
  • You have succinctly described my opposing goals and have correctly identified my problem as case (2). Effectively it's a process problem rather than a tooling one, with no magic answer! – Gregory William Bryant May 10 '21 at 08:26