
Suppose I'm working on software to be installed on a Linux machine (not mine).

Say the software consists of: applications the user might start independently; an application which will likely be started on desktop-environment login; a service which should be started when the system boots; and libraries, some shared among multiple apps and some separate.

Now suppose that each of the above is maintained as its own git repository, and that I build everything with CMake, looking for the dependencies as already-built-and-installed software, not as additional source directories. I also don't use git submodules (at least, I don't right now).
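(By way of illustration, a dependent project's CMakeLists.txt consumes its siblings roughly like this; the package and target names here are hypothetical:)

```cmake
# Hypothetical fragment from one app's CMakeLists.txt: the library is
# found as an already-installed package, not via add_subdirectory().
find_package(MyCommonLib 1.2 REQUIRED)  # locates the installed MyCommonLibConfig.cmake

add_executable(my_app main.cpp)
target_link_libraries(my_app PRIVATE MyCommonLib::MyCommonLib)
```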

None of the projects has code that takes care of retrieving everything, building everything, and triggering an install of everything; plus there are tasks like creating an /etc/init.d service (or a systemd service, if you lean that way). So what I currently do is use a simple shell script for all of these things - a sort of installer, or installation orchestrator.
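To make this concrete, the script I have in mind looks roughly like the sketch below. The repository names and URLs are made up for illustration, and the sketch defaults to printing its plan (DRY_RUN=1) rather than actually cloning, building, and installing:

```shell
#!/bin/sh
# Illustrative orchestration sketch: fetch, build and install several
# CMake projects in dependency order. All repo names/URLs are placeholders.
# DRY_RUN=1 (the default here) prints each command instead of running it.
set -eu

: "${DRY_RUN:=1}"
run() {
    if [ "$DRY_RUN" = 1 ]; then echo "+ $*"; else "$@"; fi
}

PREFIX="${PREFIX:-/usr/local}"
SRC_ROOT="${SRC_ROOT:-${HOME:-/tmp}/build-staging}"
REPOS="common-libs app-gui boot-service"   # build/install order matters

run mkdir -p "$SRC_ROOT"
for repo in $REPOS; do
    dir="$SRC_ROOT/$repo"
    if [ -d "$dir/.git" ]; then
        run git -C "$dir" pull --ff-only
    else
        run git clone "https://example.com/me/$repo.git" "$dir"
    fi
    run cmake -S "$dir" -B "$dir/build" -DCMAKE_INSTALL_PREFIX="$PREFIX"
    run cmake --build "$dir/build"
    run sudo cmake --install "$dir/build"
done

# The non-CMake tasks the script also has to cover:
run sudo install -m 644 boot-service.service /etc/systemd/system/
run sudo systemctl daemon-reload
run sudo systemctl enable boot-service.service
```

Every new repository, dependency, or service file means editing a script like this by hand.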

I don't like this, because:

  1. I lied: the script doesn't really exist yet. There are partial scripts that do some of this, and the rest is done by hand.
  2. There is no uninstallation/cleanup script - unless I write that one too.
  3. I want to be able to check the versions of what I've installed, without checking the versions of all the individual files.
  4. I bet there are generic deployment mechanisms I could be using that I just don't know about, and I'll feel like a fool reinventing the wheel with my own scripts.
  5. I want this installation to be automagically updatable when something changes in the repositories' CMakeLists.txt, without having to manually propagate changes to my installation script.

My question: How can I better orchestrate my installation? Is there some better option than a custom shell script for this?

einpoklum
  • @DocBrown: Add an explanatory note. – einpoklum Jun 11 '20 at 16:03
  • Related question: https://softwareengineering.stackexchange.com/questions/161293/choosing-between-single-or-multiple-projects-in-a-git-repository - maybe the top answer applies to your case as well? It says that if the management overhead seems to become too huge, maybe a single repo is the better alternative. – Doc Brown Jun 11 '20 at 16:25
  • @DocBrown: How about now? – einpoklum Jun 11 '20 at 20:43
  • Looks way more focussed now! I cleaned up my comments, and upvoted. – Doc Brown Jun 11 '20 at 21:00
  • This is for a desktop, rather than something hosted on a remote machine, correct? Have you looked into package management for your platform? APT for Debian or RPM for CentOS-based platforms. Each of them handles dependencies to ensure everything gets installed properly. They also handle de-installation for you. Your build system would have to create the packages so you can test them out. – Berin Loritsch Jun 12 '20 at 01:46
  • @BerinLoritsch: I missed this comment when you made it. It could be for a desktop, but also possibly a server machine (that is accessed remotely by users and admins for the most part). Creating APT/RPM packages is an interesting approach. Please consider making that into an answer. – einpoklum Oct 26 '20 at 15:10
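(For reference, the package creation suggested in the last comment can be sketched with plain dpkg-deb; every package name, version, and path below is invented for illustration:)

```shell
#!/bin/sh
# Sketch: wrap an installed file tree into a minimal .deb, so that APT/dpkg
# handles installation, dependency checks, versioning and removal.
# Package name, version, and file contents are placeholders.
set -eu

PKG=myapp
VERSION=1.2.3
STAGE=$(mktemp -d)

# Stage the files exactly where they should land on the target system.
mkdir -p "$STAGE/usr/local/bin" "$STAGE/DEBIAN"
printf '#!/bin/sh\necho hello from %s\n' "$PKG" > "$STAGE/usr/local/bin/$PKG"
chmod 755 "$STAGE/usr/local/bin/$PKG"

# Control file: dpkg reads the package metadata and dependencies from here.
cat > "$STAGE/DEBIAN/control" <<EOF
Package: $PKG
Version: $VERSION
Architecture: all
Maintainer: me <me@example.com>
Description: Demo package staged outside the normal build system
EOF

# Build the .deb only if dpkg-deb is available on this machine.
if command -v dpkg-deb >/dev/null 2>&1; then
    dpkg-deb --build "$STAGE" "$STAGE.deb"
fi
```

CMake's CPack can generate such DEB (or RPM) packages directly from the existing install() rules, which would also address the wish for the packaging to track changes in the CMakeLists.txt files.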

0 Answers