69

When I think about the pros and cons of a static library folder and a package manager I feel like the library folder is a better approach.

Pros I see with a library folder:

  1. No need for an external tool to manage packages.
  2. No internet connection required to build.
  3. Faster build (no package checking).
  4. Simpler environment (less knowledge required).

Pros I see with a package manager:

  1. Helps with complex dependency trees (although that can also be managed by downloading a dependency together with all its dependencies).
  2. Helps checking if there is a new version available.

It seems the industry has decided to follow the package manager path for almost everything built today. So, what am I missing?

Ignacio Soler Garcia
  • 34
    I think you're really asking about the benefits of *bundling* vs. *requiring* libraries. You'll get more relevant answers if you use those terms. – Kilian Foth Jun 12 '18 at 09:50
  • @KilianFoth: I appreciate your comment. As I'm not a native English speaker I'm unsure about how to use these words (and their exact meaning in this context). Can you edit the question so it is more clear? I'm talking about using NuGet to get project dependencies vs having dependency dlls stored in a lib folder for example. – Ignacio Soler Garcia Jun 12 '18 at 09:53
  • Whether a build is faster without a package manager depends almost entirely on the implementation of that software. NPM is atrociously slow for no good reason, but Maven's resolution (if everything is already downloaded) is nearly instantaneous (10s of ms). – chrylis -cautiouslyoptimistic- Jun 12 '18 at 12:14
  • @chrylis: on cloud build agents (quite common today) nothing is already downloaded so there are scenarios where you will hit this penalty always. – Ignacio Soler Garcia Jun 12 '18 at 12:44
  • 4
    What's preventing you from creating a docker container for the build agent, containing everything needed and possibly not even allowing internet access at all (that shouldn't be needed for building)? I think you're more against learning a new tool than actually considering arguments. Have you ever worked on a large project that uses hand managed dependencies? It's not as great as you make it seem. – Kayaman Jun 12 '18 at 12:58
  • 3
    @Kayaman: learning a new tool costs the team's time (money) and I would like to verify that we are investing it in the right thing. My experience with large projects is that dependencies are quite stable [they almost never change] and that's maybe why a package manager seems expensive to me. Anyway, I was just listing pros and cons after working a while with NuGet and spending some time with it. – Ignacio Soler Garcia Jun 12 '18 at 13:14
  • One upside of the approach of using a 'lib' folder: it keeps your build working even in the scenario where the package repository that the package manager needs to resolve your dependencies gets shut down. Sure, that's a rare thing to happen, but in the long run, many internet services get shut down eventually (project hosting sites, etc.), so it could happen there, too. Does anyone have examples of that? – sebkur Jun 12 '18 at 14:33
  • @sebkur There was that time that half the internet got shut down because someone deleted a one-function utility package. https://qz.com/646467/how-one-programmer-broke-the-internet-by-deleting-a-tiny-piece-of-code/ – stannius Jun 12 '18 at 15:20
  • 2
    @sebkur You can keep local copies of all your packages. We keep the current versions of all our dependencies under local source control. The only time the package manager needs a connection is when we do upgrades. – 17 of 26 Jun 12 '18 at 15:38
  • 1
    @17of26: you don't learn how to configure a build agent to install NuGet and run it on request in 10 minutes. Nor do you if you have a multi-solution project where the same projects are used in different solutions. – Ignacio Soler Garcia Jun 12 '18 at 15:47
  • We're using Visual Studio 2017 and TFS 2017 for builds, which both have built-in support for NuGet – 17 of 26 Jun 12 '18 at 16:10
  • @17of26 we're using Visual Studio 2017 and Visual Studio Team Services and we are building our solutions in Microsoft hosted agents so all the requirements have to be installed on every build. – Ignacio Soler Garcia Jun 12 '18 at 16:12
  • @IgnacioSolerGarcia if you are responsible for managing the build agent configuration, yes it may be more than 10 minutes to set up, but do all of your developers need to learn to manage the build agents individually? Even if so, it seems learning to manage NuGet would be a small part of the overall work of managing the agents. With _almost_ every project I've been involved with, default configurations for nuget have been sufficient... – Mr.Mindor Jun 12 '18 at 22:21
  • Your comment about a multi-solution project with projects shared across solutions does bring back painful memories of the one outlier where NuGet's configuration had to be modified. The project itself was organized poorly, and different solutions and different projects had different relative depths in the folder structure. By default NuGet expects/places `packages/` in the same folder as the current solution, so when a nuget package is installed via solutionA at `/my/big/project/solutionA/` this didn't play well when solutionB at `/my/big/project/SolutionB/` included the same project. – Mr.Mindor Jun 12 '18 at 22:37
  • 1
    @17of26 Good luck with binding redirects in 10 minutes ;) – BartoszKP Jun 13 '18 at 08:07
  • @Mr.Mindor: that's exactly what happened to me so that experience triggered this post as I was feeling too much pain for such a small benefit (handling 15 dependencies more or less). – Ignacio Soler Garcia Jun 13 '18 at 08:23
  • Does your lib/ folder check in headers, source code or binaries (or some combination of the three)? – dcorking Jun 13 '18 at 09:49
  • 1
    @dcorking: only binaries – Ignacio Soler Garcia Jun 13 '18 at 09:54
  • 1
    @Mr.Mindor Centralize your packages into a single directory. It won't hurt anything and is much simpler overall. You could group all the sln files in the same folder, or you could set up NuGet.config files that override the default packages location. (Or both with a shared NuGet.config file.) If your solutions are sharing csproj files anyway, it may not make a ton of sense to separate the projects. I agree VS has poor defaults for how it organizes folders when you're creating stuff; I always change the locations and move things around when creating new solutions. – jpmc26 Jun 13 '18 at 11:10
  • @jpmc26 Yeah, resolved a long time ago now (initially by configuring NuGet to put packages up higher in the folder structure, `/my/packages/`, and ensuring the same level of depth for the solution files). In this case VS could not be blamed for the project organization; it was a misguided company 'standard' that was forced upon us from above, which that larger-than-average project helped give us justification to change. The whole point was that there are some scenarios where you do need to spend more than 10 minutes figuring out how the package manager works. – Mr.Mindor Jun 13 '18 at 14:13
  • I think this question might want to mention the language being used. For perhaps historical reasons, in some languages many packages have lots of dependencies of their own. Julia for example is like that: installing TensorFlow.jl brings in about 2 dozen downstream dependencies, and DifferentialEquations.jl brings in something between 50 and 100. That's because the development culture there is to make small packages that do one job simply and well, and then assemble them together for more complex tasks. Good luck trying to wrangle all those by hand, especially if you need an upstream bug fix. – Frames Catherine White Jun 14 '18 at 12:04
  • @LyndonWhite: yeah, for sure, in those scenarios (which are the way to go) it makes sense to have a package manager. Talking about .Net here. – Ignacio Soler Garcia Jun 14 '18 at 13:08
  • 1
    @Ignacio did you know package managers create a lib folder? It's just called "packages" (at least in nuget). – RandomUs1r Jun 14 '18 at 22:44
  • @RandomUs1r: did you know that best practices say that packages don't go into source code repositories while lib folders are always stored there? Did you know you don't need to be so condescending when talking on SO? Haters do not need a reason, boy. – Ignacio Soler Garcia Jun 15 '18 at 00:40
  • It is operating system specific. Some (weird) OSes don't have folders, and some don't even have *files*. – Basile Starynkevitch Jun 17 '18 at 07:19

9 Answers

124

An important point missing from the other answers:

Using a package manager means having a configuration that indicates which library versions you are using and makes sure that config information is actually correct.

Knowing which libraries you use, and which version, is very important if you:

  • need to update a library due to a critical bug / security hole;
  • or just need to check whether an announced security hole affects you.

In addition, when you actually do update, the package manager (usually) makes sure any transitive dependencies are updated as required.

Whereas with a lib folder, you just have a bunch of (possibly binary, and possibly modified) files, and you'll have to guess where they came from and what version they are (or trust some README, which may or may not be correct).
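
To make this concrete, a package manifest pins every dependency to an explicit version in one reviewable file. A minimal sketch, assuming a NuGet PackageReference-style .csproj (the question's context); the package names and versions are purely illustrative:

```xml
<!-- Minimal sketch of a .csproj fragment; packages/versions are illustrative only. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Each dependency and its exact version is declared here, so
         "which version of X are we on?" has a single, checkable answer. -->
    <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
    <PackageReference Include="NLog" Version="4.5.6" />
  </ItemGroup>
</Project>
```

Checking whether an announced vulnerability affects you then becomes a text search over files like this, rather than an inspection of unlabeled binaries in a lib folder.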


To address your other points:

No need for an external tool to manage packages.

True, but a) as a software developer you need to install loads of tools anyway, so one more does not usually matter, and b) usually there are only one or a few package managers in any given field (Maven/Gradle for Java, npm for JS/TypeScript, etc), so it's not like you need to install dozens of them.

No internet connection required to build.

All package managers I know work off-line, once they have downloaded the required dependencies (which can happen right after downloading the project itself).
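
For example, NuGet can be pointed at a plain folder (or network share) that already holds the downloaded .nupkg files, so a restore needs no internet access at all. A sketch of a NuGet.config doing that (the path is hypothetical):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Hypothetical offline feed: a local folder or network share
         pre-populated with the required .nupkg files. -->
    <add key="offline-cache" value="C:\dev\nuget-offline-cache" />
  </packageSources>
</configuration>
```

Other package managers offer the same idea (local repositories, caches, mirrors), so "no internet required" is not really an advantage unique to a lib folder.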

Faster build (no package checking).

Probably true, but it seems unlikely the offline package checking will take a significant amount of time (it's just comparing some version numbers). An online check may take a while, but that can be turned off if desired (if it is even on by default - Maven for example never checks for updates for release versions).

Simpler environments (less knowledge required).

True, but as explained above, a lib folder also requires knowledge. Also, as explained above, you'll probably only work with a handful of different package managers, which you'll already know.

sleske
  • 174
    “No need of external tool to manage packages” — yup. Now it’s your brain’s job. Good luck, brain! – Paul D. Waite Jun 12 '18 at 12:32
  • 58
    Anybody who seriously thinks that having a lib folder is an "easier environment" should simply go ahead and try and figure out the dependency tree for say the [Microsoft.AspNetCore.All nuget](https://www.nuget.org/packages/Microsoft.AspNetCore.All). Go on, don't wait for me I'll check back in about a day. – Voo Jun 12 '18 at 13:20
  • 15
    Also, the internet browser you use to manually hunt down libraries would count as an external tool to manage packages. Along with everything else you use (OS file manager, etc.) to prepare and position the libraries. So the real choice is one external tool (package manager) vs many. – NemesisX00 Jun 12 '18 at 16:47
  • 3
    Just FYI, I recently tried to work with gradle on an offline PC. No luck, Android Studio won't run my application and I get an obscure error message and that's after I did an initial run online for dependencies. It's only in these types of situations when you truly notice how dependent on package managers creating software has become... – FRob Jun 13 '18 at 14:36
  • I wonder why my objection/addition to @Voo was removed. Manually figuring out dependencies is not what is usually done in unpackaged environments. – phresnel Jun 13 '18 at 14:47
  • 9
    @PaulD.Waite While we're at it, we got rid of those pesky "languages" everyone is going on about. It all boils down to machine code eventually anyway, so at my firm we cut out the middle man. Now that's efficiency! – corsiKa Jun 13 '18 at 14:54
  • 1
    "Faster build" Yep... Until you break something and have to spend a few days figuring out which version of which library goes with which bit of code... – Perkins Jun 13 '18 at 22:33
  • +1 but there's also Ivy for Java. – David Conrad Jun 14 '18 at 05:46
  • @DavidConrad: Yes, but in my book "three" still counts as "a few" :-). – sleske Jun 14 '18 at 07:17
  • No argument there, just mentioning another one. – David Conrad Jun 14 '18 at 07:36
  • 2
    @FRob - FYI gradle has no problems with off-line mode, of course, but being off-line meant you had no way to check StackOverflow to figure out what you were doing wrong! And yes, I have actually used Gradle in off-line mode before, that's how I know it works. – industry7 Jun 14 '18 at 18:01
  • As a Scala developer who was recently thrown (back) into the Ant world for a legacy Java project... yes, please don't use a lib bundle. – erip Jun 15 '18 at 10:45
40

The pros of a lib folder disappear quickly after you move from small-scale development to bigger work.

For example the "benefit" of not requiring an external tool is trumped by the work required to manually manage your dependencies, so the tool will be you (in more than one sense of the word).

You don't need an internet connection for a package manager. You can use local repositories.

Faster build might be true, but it's hardly something that should determine whether to use a package manager or not. We're not talking about magnitudes of difference after all, and this too depends on your configuration. You can easily make a slow build using a package manager, but that's basically sabotage.

Simpler environments (less knowledge required)? Again, in small scale development definitely a possibility. You might be able to hold the project in your head completely, down to each of the few libraries being used. Add a simple makefile / other build script and you've got yourself a complete package.

But it doesn't make environments simpler, it only works in simple environments. In larger scale development you'll be happy that you're using standard tooling instead of custom solutions. After all, you only have to learn it once (and when the package manager du jour is replaced by the new cool thing, you have to learn that too).

Kayaman
  • Manually managing dependencies means downloading a file and adding a reference to it in your project, which is easier than learning any package manager configuration tool unless you're talking about updating packages. Regarding an already-built build process, then for sure, it's already done :) The point is to avoid having to do this work. – Ignacio Soler Garcia Jun 12 '18 at 10:05
  • 22
    @IgnacioSolerGarcia: It's only easier if nothing goes wrong. What if the new version of library A needs an updated B and C, too? What if it still runs, but introduces subtle bugs? That's all part of "managing dependencies". – sleske Jun 12 '18 at 10:39
  • 18
    @IgnacioSolerGarcia saying "download a file" doesn't quite paint the correct picture. Let's say "download 20 files and make sure their versions are compatible". No work avoided there. Updating packages isn't a theoretical situation either, unless you've decided to freeze the dependency versions and are ready to accept the possible issues (bugs, security flaws) that result from that. – Kayaman Jun 12 '18 at 10:51
  • 13
    @IgnacioSolerGarcia "download a file" - did you mean (1) find correct project website (some are hosted on github, some on sourceforge, some on their own websites), (2) go to download page, (3) find correct version, (4) download, (5) unzip and drop somewhere. That seems much more work than `blah install libfoo`. And then, multiply that by, say, 5 dependencies. – el.pescado - нет войне Jun 12 '18 at 12:19
  • 4
    @IgnacioSolerGarcia Only if your dependencies do not have dependencies that also have dependencies, which have dependencies... Have you ever seen dependency tree of the typical big project from any package manager? – user11153 Jun 12 '18 at 12:25
  • 5
    @IgnacioSolerGarcia Ok which files do I "simply" have to download to get [this nuget](https://www.nuget.org/packages/Microsoft.AspNetCore.All) correctly working? And that's basically the most trivial setup for a ASP.NET Core project. In practice you'll have many more dependencies. – Voo Jun 12 '18 at 13:23
  • @Voo: I guess in this case using a package manager makes sense. I never worked before with projects with 150 dependencies. They normally have 10 - 15 – Ignacio Soler Garcia Jun 12 '18 at 13:31
  • 6
    @Ignacio That's the basic meta nuget to get the simplest ASP.Net Core application up and running. True in the good old days of the full framework the whole thing was "easier", because you just got large monolithic libraries that were all versioned at the same time and releasing an update took you years. That model got basically deprecated for very good reasons not just in the .NET world. You'll have to get used to the whole model of lots of small libraries doing one particular thing and honestly using a package manager is the easiest part of the transition. – Voo Jun 12 '18 at 13:57
  • 5
    Same goes for npm. Pretty much every web project I work with uses webpack. Sure, I may not directly use all 319 packages installed as a result of installing webpack, but I can't imagine the alternative to `npm i webpack`. – aaaaaa Jun 12 '18 at 21:45
  • 1
    @Voo: In practice, such big projects would come with helper scripts and BUILD.txt's. – phresnel Jun 13 '18 at 14:48
  • 2
    @phresnel And then what? I still have to manually install 100s of assemblies in all my projects and go through all references whenever I want to update them. That sounds like a lot of effort and very error prone, so I guess it's better to create a script that updates all references. But if it can already update all references it should probably also be able to simply install them. So if we're at that, why not make it so that it also automatically downloads the version from some website and then installs them? – Voo Jun 13 '18 at 15:14
  • 5
    Sounds like a pretty swell idea admittedly. But also lots of effort, so why should everyone write their own script with slightly different options and host it themselves? What if we, say, created a standard tool that would do those tasks for us? So yeah, we just reinvented package managers. I mean, those old days of weird, half-working build scripts and BUILD.txt's are exactly the reason why we moved on to standardized package managers. – Voo Jun 13 '18 at 15:15
  • 1
    @Voo: I only wanted to point out that there is a huge difference between "figuring out" and "following the manual". I.e., there's not only "white" (Nuget, NPM, APT, RPM, etc.) and "black" (undocumented tarballs, zips, source repos). – phresnel Jun 13 '18 at 16:02
  • 1
    @phresnel The linked NuGet package documents its dependencies perfectly (otherwise it wouldn't work), but my point is that following all those transitive connections and building the graph is not exactly trivial and takes lots of time. In the best case your manual solution would be as good as the "do manually using NuGet package" and even that would take ages. – Voo Jun 13 '18 at 16:15
  • @Kayaman I have one thing to add: if your project becomes "bigger work", you would simply have multiple small projects that form the big project. If you're doing it differently, you're doing it wrong. So, if you're not at Google scale, a package manager is not that important – Binar Web Jun 16 '18 at 06:54
  • 1
    @BinarWeb I strongly disagree. Dividing a bigger project into smaller modules is an architectural decision, whether you're manually handling the dependencies of a monolithic project or each of the sub projects, you'll still be doing the same amount of work, and now with plenty of overlap. The manager title may seem enticing, but you're a developer, not a package manager. – Kayaman Jun 18 '18 at 06:16
36

You're missing many of the advantages of package managers.

  1. Package managers allow you to avoid checking large (several megabyte or larger) binaries into source control. Doing so is anathema to many source control tools, the pervasive git being one of them. We had a repository hit Bitbucket's size limits a couple of months ago because developers were checking in CocoaPods. Another project was already halfway there when we migrated from SVN because we had been checking in all our lib binaries (and hadn't been using NuGet). Since package managers will download packages on the fly for each developer, they eliminate the need to check these binaries in.
  2. They prevent mixing incompatible files/libraries. Folders can contain mixed versions of the library's files if someone doesn't clean it out properly during an upgrade. I've seen a case where half the binaries in the folder were upgraded, resulting in very weird bugs. (It didn't even crash!) It took us literally months (not man hours, just overall time) to figure out the problem. By letting the package manager control the entire folder, you can't get mixed versions; you ensure consistency. They also make it much harder to use incompatible dependencies, by automatically updating everything together, installing different versions where needed, or even just throwing warnings or errors when attempting to use incompatible versions of libraries.
  3. They establish a shared convention for managing libraries. When new developers come onto your project, team, company, etc., they're likely to know the conventions of the package manager. This means they don't have to waste time figuring out the fine details of how libraries are managed in your code base.
  4. They give you a standard way of versioning and distributing your own dependencies and files that don't belong in your repository. I have even personally leveraged them for some large static data files my application required, so it works well for versioning things besides binary code.
  5. Some package managers provide additional features during installation. NuGet will add dependencies and content files to your csproj file and can even add configuration elements to the config file.
  6. Their package list files document the versions of your libraries in a single, centralized place (see the sketch just after this list). I don't have to right-click a DLL and look at the version number to figure out what version of the library I'm using. In Python, the library author may not have even included the version number in the .py files, so I might not even be able to tell the library version from them.
  7. They discourage machine-wide installation of dependencies. Package managers provide a conventional way of installing dependencies without a global installer. When your options are lib folder and global installation, many library developers will choose to offer their libraries primarily as global installations rather than as downloadable binaries that you have to set up yourself. (MS history demonstrates this. It's also the case for many libraries in Linux.) This actually makes managing multiple projects more difficult, since you may have projects with conflicting versions and some developers will certainly choose the seemingly simpler global install over having their own lib dir.
  8. They tend to centralize hosting and distribution. You no longer have to depend on that random library's web site. If they go out of business, a successful package manager's site still has every version ever uploaded. Developers also don't have to hunt down many web sites just to download new libraries; they have a go-to place to look first and even browse for different options. It's also easier to mirror packages organized in a standard way than to manually host copies of everything from ad-hoc websites.
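
As a sketch of point 6, this is roughly what such a package list file looks like with classic NuGet (a packages.config; the entries are illustrative only):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- One central, human-readable record of every dependency and its exact version. -->
  <package id="Newtonsoft.Json" version="11.0.2" targetFramework="net461" />
  <package id="log4net" version="2.0.8" targetFramework="net461" />
</packages>
```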

You're also overstating the value of your "benefits."

  1. No need for an external tool to manage packages.

    "External" to what? I check the NuGet executable into my repositories. It's the only binary I feel okay checking in, since it's small and means I don't need to check any other binaries in.

    pip doesn't pose problems on this front since it's bundled with Python by default now and breaking and backwards incompatible changes are exceedingly rare. You're not going to develop Python code without having Python installed externally to your project, anyway.

    By the time they reach widespread adoption, package managers tend to be very stable. You can't get away without some kind of globally installed tooling for most projects, and a single package manager is a pretty lightweight requirement. It's usually not much more cumbersome than having the language runtime installed.

  2. No internet connection required to build.

    I can't connect to my database without a network connection. If the database is Amazon hosted, I need a full internet connection anyway. I need an Internet connection to push and pull changes through source control; a build server can't check out code to build without some kind of network connection either. You can't send or receive e-mail without one. You can't download libraries to put them in your lib folder without one! Permanently developing without an internet connection is virtually unheard of. In some rare cases where it's necessary, you can deal with this by downloading the packages to a location that the package manager can consume. (I know NuGet and pip are quite happy to pull from a simple folder or network drive; I suspect most others can as well.)

  3. Faster build (no package checking).

    30 seconds during the automated build and 5 seconds during local dev builds are a good trade off for the benefits I outlined above. These are trivial time frames that are usually not even worth considering in comparison to the problems the benefits solve.

  4. Simpler environments (less knowledge required).

    One tool for package management against nothing for library management isn't really a fair comparison, anyway. Without the tool, you have to learn whatever custom process the project is using to manage its libraries. This means you're never sure whether your existing knowledge applies to any new project you approach. You'll have to deal with whatever hodgepodge approach someone came up with, or make up your own. That might be a directory containing all the libraries, or it might be something much weirder. Maybe to avoid checking the libraries in, someone put them all out on a network drive and the only version indicator is the folder name. How is that or a global install really any better? By comparison, a package manager gives you a clean convention that will apply across most projects you encounter.

The common theme is that they provide consistency, documentation, and features not only within projects, but even across them. This simplifies everyone's life.

jpmc26
  • Well, I would love to respond to many points but this is too short. Summarizing, regarding 1 & 4, they can easily be solved by convention, and regarding 2, that can be learned in 5 minutes with a document. I agree with some of your other points too, especially 5, 6 & 7. Regarding simpler environments, people in this post tend to oversimplify the knowledge required. Talking about NuGet: its UI, its command-line options, the structure between solutions and projects, how to deploy the tool on build agents, how to create local repositories ... that's a nice amount of time ... – Ignacio Soler Garcia Jun 13 '18 at 08:32
  • 10
    "Permanently developing without an internet connection is virtually unheard of" I wish I didn't know better. There's lots of development done in completely separated networks due to security reasons. Yes it is about as much fun as it sounds, but it is absolutely doable. You just have to set up your own infrastructure for package storage (i.e. your own nuget feed). – Voo Jun 13 '18 at 09:26
  • 1
    Although having your own infrastructure is actually one of the few things that makes sense in any case: You don't want to be reliant on external infrastructure. If that one is not available for one reason or another it is much better to have a fallback that guarantees that your developers can continue to develop. (And before anyone tells me how that nuget.org or npm or would never ever have such troubles, [maybe think again](https://qz.com/646467/how-one-programmer-broke-the-internet-by-deleting-a-tiny-piece-of-code/).) – Voo Jun 13 '18 at 09:29
  • 3
    @IgnacioSolerGarcia Establishing a per-project, per-department, or per-company convention is not better than just having a convention everyone knows without being told. Additionally, the package manager does a better job of *enforcing* the convention, since it makes following the convention less work than breaking it. Also, as I mentioned, I commit NuGet directly and invoke it in the build script, so I don't need to have it installed. I keep build server installations to a truly bare minimum. – jpmc26 Jun 13 '18 at 09:58
  • @IgnacioSolerGarcia Creating a local repository is literally as simple as dropping files in a folder and dropping that path in a config file. I've done it with a network drive. Sure, you can get fancy and have a full blown hosted server if you want, but you don't have to if you need something quick and dirty or can't convince someone to spend the time/money on the hardware. – jpmc26 Jun 13 '18 at 10:00
  • @Voo Keyword being "virtually." ;) I'm sure it does happen, but it's far from typical. Yes, I'm aware of left-pad. You're also aware that npm's maintainers stepped in, right? We could debate the ethics of their actions, but they clearly have a vested interest in keeping their system running and will take steps to mitigate bad actors. If you're Facebook or you're in an isolated network, sure, run a mirror. But for most dev shops, the risk is low enough, especially since most shops don't do daily releases or anything. It's a heck of a lot lower than someone screwing up when doing stuff manually. – jpmc26 Jun 13 '18 at 10:07
  • @jpmc26 Oh it was just one example. `nuget.org` was also down here for a few hours a while ago. You don't need a complete mirror, just a cache that keeps packages you already downloaded once locally. That way your builds will still work, which is nice if your whole dev setup relies on gated checkins. And you probably need a package repository anyhow for your own packages if they're not all public. As soon as you have that even the argument that you need the internet for a package manager goes away. – Voo Jun 13 '18 at 10:28
  • 2
    @jpmc26 Imho your first numbered list would benefit from some **emphasis**. – Søren D. Ptæus Jun 13 '18 at 13:56
  • Another case where your test environment's database server, version control server, and build server are all on a LAN is a shop in a rural area or developing country, where an Internet connection costs several dollars per GB. Often it's cheaper and faster to drive with a laptop to a hotspot in town to fetch updated packages for your local cache. – Damian Yerrick Jun 14 '18 at 17:31
  • I'm going to accept your answer if there is nothing new before Sunday. At least yours seems the best one to me. – Ignacio Soler Garcia Jun 15 '18 at 20:40
  • 1
    @SørenD.Ptæus Done. – jpmc26 Jun 23 '18 at 19:53
16

Having recently converted our product from using manually downloaded libraries to automatic package management with Nuget, I can say that using a package manager has massive benefits.

Our product is implemented across 27 C# projects, which is relatively small by today's standards. Some of our third party dependencies have dozens of assemblies.

Prior to Nuget, if I wanted to update all of our dependencies to the latest version I would have to:

  1. Track down where I could get all of the updated libraries
  2. Download them and unzip/install
  3. Add the new versions to source control
  4. Manually look through all references in our projects and update them to point to the new assemblies

With 27 projects and dozens of dependency assemblies, this process was very error prone and could take hours.

Now that we've updated to using Nuget, it's all done for me with a single command.

17 of 26
  • Agree, that's point 2 of the pros. Anyway, changing dependencies is something we rarely do (probably because we lack proper automated regression tests). – Ignacio Soler Garcia Jun 12 '18 at 12:45
  • 9
    Updating dependencies is something that's a lot less painful to do if you do it regularly. – 17 of 26 Jun 12 '18 at 12:47
  • We require complete regression tests (expensive) on every dependency update, otherwise we cannot warrant that we still work properly. – Ignacio Soler Garcia Jun 12 '18 at 12:50
  • 1
    Are these tests automated? Exactly how long do they take to run? Even if it takes 24 hours to run the full suite of tests, that still allows you to update dependencies every few days with little downside (even though you probably wouldn't do it quite that often in practice). Even if they're manual and unavoidable, using manual installation you could spend days running through tests only to find out they fail because you missed some dependency of a dependency, then you have to start over again after installing it, which wouldn't happen using package management... – Sean Burton Jun 12 '18 at 13:19
  • 3
    Do you not require regression tests on new software releases? Just update dependencies when you're already doing testing for a release. – 17 of 26 Jun 12 '18 at 13:38
  • And that's how it was in the good old days. I for one don't miss it. – Thorbjørn Ravn Andersen Jun 12 '18 at 23:10
  • @17of26: we do not do complete regressions on every release, we don't have them fully automated and the tool is too big to do it (it could take months testing it or automating it). With all that automated then maybe it's easier to upgrade dependencies but anyway I don't see why I would do that other than to solve a bug. When I select a third party it already does what I need, otherwise I select a different one. – Ignacio Soler Garcia Jun 13 '18 at 08:35
  • 4
    *"We don't have them fully automated and the tool is too big to do it (it could take months testing it or automating it)"* - there's your big problem. These tests should have been in place since the start. Your problem isn't that using package managers provides no benefits, your problem is that the context you're working in is too broken in other ways to allow you to enjoy them. – Ant P Jun 13 '18 at 13:56
  • *"With all that automated then maybe it's easier to upgrade dependencies but anyway I don't see why I would do that other than to solve a bug"* Your dependencies have security vulnerabilities. I don't even need to know what your dependencies are, but I guarantee they have security issues especially if you haven't been updating them. – Wesley Wiser Jun 14 '18 at 19:12
14

No need for an external tool to manage packages

That's kind of a non-point, isn't it? If I use a package manager, I don't need to have a lib folder. I also don't have to manage the packages myself.

No internet connection required to build

Aside from the fact that not having an internet connection while developing is somewhat rare today (maybe with the exception of being in transit), a decent package manager shouldn't require you to have the latest version to build your application. It might complain, but there's no reason not to build with the version that is already installed.

Faster build (no package checking)

That's a pretty marginal speed boost, but you can arguably make a point for it.

Simpler environments (less knowledge required)

Most package managers are so simple these days that it's hardly worth trying to get around them by doing this. There are even visual clients if you want them. They actually hide a lot of the cruft that is going on.

Package managers also allow you to share these packages between different projects. If 5 of my projects use the same version of Boost, there's no need to duplicate this for each project. This is especially true for the complex dependency trees you speak of.

With a lib folder you manage packages only for that project, while a package manager allows you to do this for your entire development environment with a single tool.
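
With NuGet, for instance, that sharing can be made explicit by pointing all projects at one shared packages location via a NuGet.config. A sketch, assuming hypothetical paths (repositoryPath applies to packages.config-style projects, globalPackagesFolder to PackageReference-style ones):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Shared packages folder for packages.config-style projects (hypothetical path). -->
    <add key="repositoryPath" value="C:\dev\shared-packages" />
    <!-- Shared global cache for PackageReference-style projects (hypothetical path). -->
    <add key="globalPackagesFolder" value="C:\dev\nuget-global-cache" />
  </config>
</configuration>
```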

Athos vk
  • It's not so easy to have a build agent configured to install a package manager during a build, to restore dependencies and so on. Nothing needed with a lib folder. – Ignacio Soler Garcia Jun 12 '18 at 10:03
  • 4
    I think that depends what language/s you are using. With languages like Ruby or Rust the package-management is so well integrated that using it is entirely trivial. – Sean Burton Jun 12 '18 at 11:00
  • Well, I omitted that on purpose to have broader comments, but I'm talking specifically about NuGet, C# and VSTS cloud. – Ignacio Soler Garcia Jun 12 '18 at 12:48
  • 4
    @Ignacio Whatever build system you're using that doesn't make it utterly trivial to restore NuGets should be immediately thrown out. Luckily VSTS makes this about as easy as it gets ([documentation](https://docs.microsoft.com/en-us/vsts/pipelines/packages/nuget-restore?view=vsts)): There's a NuGet restore task that you point to your solution file and tell it what NuGet sources to use - for a simple project simply using `nuget.org` will do just fine (the default template should already be set up in this way). – Voo Jun 12 '18 at 13:29
  • @SeanBurton Ruby, specifically RVM, was a nightmare in long term multi-project environments. Most things are, mind you, but in long term projects, even simple tools incur non-trivial costs over time. – Ben Jun 13 '18 at 02:26
  • 3
    @Ben RVM is not a package manager. The package manager for Ruby is RubyGems. RVM manages versions of Ruby itself, and for that rbenv is better... – Sean Burton Jun 13 '18 at 07:35
  • @SeanBurton I am aware of that, sorry if my sentence wasn't clear. I mean for package management Ruby was no fun, but I wanted to add that RVM made it even more painful for the package manager to do its job. – Ben Jun 14 '18 at 01:30
  • For freelancers who work on projects while riding the bus to and from their day job, this "exception of being in transit" is the normal case. – Damian Yerrick Jun 14 '18 at 17:33
  • @DamianYerrick Sure but once again, the point was that even without internet connection, any decent package manager will still allow you to build your project – Athos vk Jun 14 '18 at 18:18
5

It is the difference between just using libraries (lib directory) and using them while maintaining meta-information about them (package manager). Such meta-information concerns version numbers, (transitive) dependencies between libraries, and the like.

The discussions of DLL hell, library compatibility, the Java module system, OSGi and such should at least be sufficiently convincing of the worth of having some form of dependency management.

  • Library version and dependency problems can be a waste of time.

Also there is the benefit of a shared (local) repository, so several projects do not need to maintain copies of imported libs. Imagine a project with 20 submodules, some of those modules having 40-odd dependencies.

  • More structure
  • More trimming of libraries
  • No ad-hoc human decisions about libraries
Joop Eggen
3

There are some cases where a lib folder might be necessary, for example when dealing with obsolete libraries (a version that is no longer maintained/available), a locally modified version of a library, ...

But for everything else, it's like the developer assuming the role of the package manager:

  • The developer will have to download the libraries (internet required)
  • The developer will have to check for newer releases manually
  • ...

And IMHO, it's actually less knowledge required overall: you have to learn the usage of the external tool, but less about the libraries (i.e. the dependencies) themselves.

FranMowinckel
  • 4
    Even for obsolete or modified libraries, all package managers that I've seen so far offer the option to upload local dependencies to your local repository. But ok, that is where you lose some of the "it just works automatically" experience. – Hulk Jun 12 '18 at 12:19
  • @Hulk if it's an open source library then you can (and should, probably) just publish your version and thus make it visible to the package manager. Either by pushing the modifications to the maintainers, or by bringing out your own fork of the library. – leftaroundabout Jun 13 '18 at 11:36
  • If you have modified a library whose maintainer is unresponsive to patch mail, it then becomes an issue of figuring out how to configure the package manager such that other packages that depend on the library can also be satisfied with your modified library. – Damian Yerrick Jun 14 '18 at 17:36
1

There is another problem not covered by the other answers: sharing dependencies.

Let's say you have two packages bundling the same library. In the best case, there will not be any conflicts, but the same HDD/SSD space is used twice. In the worst case, there will be various conflicts, such as version conflicts.

If you use a package manager, it will install the library only once (per version) and provide the path to it.

P.S.: of course, you need dynamic linkage (or a similar mechanism in your language) to get this benefit.

-5

One major reason why shared libraries were considered an item of progress in 90's era Unix and Windows systems was how they could lower RAM usage when multiple programs using the same set of libraries were loaded. Code space only needs to be allocated ONCE per exact library and version of that library, the only per-instance memory usage remaining is for static variables.

Many operating systems implement shared libraries in a way that depends on mechanisms like the unix mmap() api - which implies that a library will not only need to be the exact same version but actually the very same FILE. This is simply impossible to take full advantage of for a program that ships its own set of libraries.

Given that memory is far cheaper, and library versions needed more diverse, than in the 90s, this argument doesn't carry as much weight today.