3

Currently my company is storing the installers (and in some cases a copy of the installed directory) for some open source third party tools that our build uses. We're storing these files in our GitHub repo, which makes it very large and slow. I want to clean this up, since the files themselves are not needed directly for the build.

I believe these files were originally checked into GitHub because the developers feared that the third party tools might become unavailable in the future, for example if the user who currently posts the releases simply deletes their GitHub account.

Now I realize the risk of this happening is infinitesimal, but at the same time some of the third party tools are pretty obscure. What would be a best practice to reasonably address this fear?

2 Answers

4

It makes perfect sense to store and keep the installers of any third party tools you use in your development process (not just your build process) in a location under your control. It does not matter whether those tools are open source or closed source: if your software system is going to be in use for a few years, expect some of your tools to vanish from the web during that time. Without a copy of the installers, you may be screwed when you try to set up a new developer's machine.

However, if these tools are large binary files, Git is not well suited for this task. Instead, you could use a shared network folder, or search older questions on this site or on Stack Overflow for the keywords "git binary files".
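
For example, a small mirroring script can copy each pinned installer from its upstream URL into a folder you control and verify a checksum along the way. This is only a sketch: the share path, tool name, URL, and digest below are made up, and the real list would be curated by hand.

```python
"""Mirror third-party installers to a location under our control (sketch)."""
import hashlib
import pathlib
import urllib.request

MIRROR = pathlib.Path("/mnt/build-mirror")  # hypothetical shared folder we control

# name -> (upstream URL, expected SHA-256), pinned and reviewed by hand
INSTALLERS = {
    "obscure-tool-1.4.2.msi": (
        "https://example.com/releases/obscure-tool-1.4.2.msi",
        "replace-with-the-digest-you-recorded-when-pinning",
    ),
}

def mirror(name: str, url: str, sha256: str) -> None:
    target = MIRROR / name
    if target.exists():
        return  # already mirrored, nothing to do
    data = urllib.request.urlopen(url).read()
    digest = hashlib.sha256(data).hexdigest()
    if digest != sha256:
        raise RuntimeError(f"{name}: checksum mismatch ({digest})")
    target.write_bytes(data)

if __name__ == "__main__":
    for name, (url, sha) in INSTALLERS.items():
        mirror(name, url, sha)
```

Run periodically (or once per new tool version), this keeps a verified copy on the share even if the upstream download disappears.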

Doc Brown
2

Consider storing the third party tool's source code, binaries, etc. in its own separate Git repo, still under your control. Then just download the built binary during builds. If both projects live on GitHub (or similar), the download should be fast.
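
A build step for this approach could look roughly like the sketch below: it downloads a release asset from a mirror repo you control, caches it locally, and checks a pinned SHA-256 before the build uses it. The organization, repo name, tag, and digest are hypothetical.

```python
"""Build-time fetch of a pinned binary from our own mirror repo's releases (sketch)."""
import hashlib
import pathlib
import urllib.request

# Release asset in a repo *we* control, so it cannot silently disappear.
URL = ("https://github.com/our-org/third-party-mirror/"
       "releases/download/obscure-tool-1.4.2/obscure-tool.exe")
SHA256 = "replace-with-the-digest-recorded-when-the-asset-was-uploaded"
CACHE = pathlib.Path(".build-cache/obscure-tool.exe")

def fetch() -> pathlib.Path:
    if not CACHE.exists():
        CACHE.parent.mkdir(parents=True, exist_ok=True)
        CACHE.write_bytes(urllib.request.urlopen(URL).read())
    digest = hashlib.sha256(CACHE.read_bytes()).hexdigest()
    if digest != SHA256:
        raise RuntimeError("downloaded binary does not match pinned checksum")
    return CACHE

if __name__ == "__main__":
    print("tool available at", fetch())
```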

Another option, if you're using Docker or Kubernetes, is to add the built binary to a container image. The build, storage, and dependency management systems around container images are optimized for binaries. I recall Amazon having its own storage system for versioned Docker image layers that played nicely with other build and deployment tools.
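
If you go the image route, a small script along these lines can bake the pinned binary into an image your builds then pull. It is a sketch only, using the `docker` Python SDK and assuming a local Docker daemon; the vendored binary path, base image, and tag are made up.

```python
"""Bake a pinned third-party binary into a container image (sketch)."""
import pathlib
import shutil
import docker  # the docker SDK for Python

context = pathlib.Path("image-context")
context.mkdir(exist_ok=True)

# The binary must live inside the build context for COPY to find it.
shutil.copy("vendor/obscure-tool-1.4.2", context / "obscure-tool")

(context / "Dockerfile").write_text(
    "FROM debian:bookworm-slim\n"
    "COPY obscure-tool /opt/tools/obscure-tool\n"
    "RUN chmod +x /opt/tools/obscure-tool\n"
)

client = docker.from_env()
image, _ = client.images.build(path=str(context), tag="internal/build-tools:1.4.2")
print("built", image.tags)  # push to your registry so builds can pull it
```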

devdanke