33

As far as I know, so-called "fat binaries"--executable files that contain machine code for multiple systems--are only really used on Apple Macs, and even there it seems they were only used because of the need to transition from PowerPC to x86.

These days a lot of software is cross-platform, and it seems like making a single fat binary would be in many ways simpler than keeping track of a dozen or so different downloads, one for each combination of operating system and architecture, not to mention somehow conveying to the customer which one they need.

I can come up with plenty of guesses as to why this approach never caught on, for instance:

  • A lack of cross-compilation tools making multi-OS binaries infeasible
  • You need to test the code on each OS anyway, so you already have to have systems that can compile natively for each OS
  • Apparently 32-bit programs "just work" on 64-bit machines already
  • Dynamic linking works differently on each OS, so a "fat library" might not work even if a "fat application" would

But since I always work with a library or framework that hides all these OS-specific and architecture-specific details from me, I don't know how true any of that really is, or whether there are even more issues I don't know about. So, what are the actual reasons why fat binaries aren't generally used to create multi-architecture and/or multi-OS software (outside of Apple)?

Ixrec
  • 5
    Is there a serious problem this would solve? When I go to the download page for major cross-platform applications they automatically detect the OS I'm using and point me in the right direction. Linux users have package managers. Seems to me like all this would do is bloat download sizes. – Doval Mar 01 '15 at 14:02
  • 4
    I don't know a lot about fat binaries, but don't they need support from the OS (specifically the loader)? Then I don't see the appeal for "multi-OS software" specifically, as there's no way the OS vendors would agree on an interoperable format. –  Mar 01 '15 at 14:18
  • 7
    Because we have byte code and virtual machines that solve this problem better. – Robert Harvey Mar 01 '15 at 14:59
  • 1
    A few applications I've seen stuff the executables for each supported system into one "fat zip file"; this seems like it would be more useful. – Weaver Mar 01 '15 at 15:11
  • 2
    Makes me wonder, is it even possible to make a polyglot executable file that works both in Windows and some *NIX? – aaaaaaaaaaaa Mar 01 '15 at 19:02
  • @Robert Harvey - Make this an answer, because it is 'the answer' - the rest is just noise - if we did not have VM's and Byte code, those issues would be addressed. – mattnz Mar 01 '15 at 20:37
  • @RobertHarvey That was actually one of my guesses; somehow I forgot to include it in the question... – Ixrec Mar 01 '15 at 20:45
  • 2
    For the historic record, fat binaries date from the 68k -> PPC transition, not the PPC -> x86 transition. – Mark Mar 02 '15 at 08:36
  • And fat binaries are still in use, for apps running either in 32 or 64 bit mode. – gnasher729 Jul 17 '17 at 16:49
  • @Doval Yes: dependency management. Users can pick apps, but if there's a binary for text parsing that is depended on by a library which is needed for another library which is needed by a 3rd library which is being used in a programming language... and the dev is trying to develop the app for every possible architecture that their binaries support, it becomes too much manpower to manually track every architecture version of every binary. Some things slip through the cracks, stuff breaks, and devs quickly become familiar with the term "dependency hell" – Jeff Hykin Feb 06 '21 at 15:03

3 Answers

13

The logistics of Internet-age distribution disincentivize fat binaries in two ways:

  1. The point of sale does not involve physical goods, so there is no pressure toward fewer SKUs, as there is when products compete for retail shelf space and customers have limited opportunities to make a purchase.

  2. The cost of bandwidth favors delivering just the minimum bits necessary for a particular software package. Shipping a fat binary down the wire degrades both the customer experience and the efficiency of the seller's infrastructure.

Fat binaries made more sense when software shipped as shrink-wrapped physical media.

ben rudgers
  • 2
    For software deployments, the cost of bandwidth is almost inconsequential nowadays anyway. If it's still a problem, just wait 5 more minutes. – Robert Harvey Mar 02 '15 at 00:37
  • 5
    @RobertHarvey, but those are 5 minutes I'd rather not wait. – Arturo Torres Sánchez Mar 02 '15 at 15:46
    Depending on the software, a lot of it is architecture-independent (graphics, video, fonts, documents). These only need to be transferred once. Only the actual executable will take up more space. Having just downloaded an executable for the wrong architecture, I can say that I would much rather have a single binary that worked everywhere than have to find out exactly which file will run on my architecture - even if I had to wait 30% longer for the download. I am sure my mum would agree, too. – Ole Tange May 02 '20 at 00:16
    @ArturoTorresSánchez How much time are you willing to spend on figuring out which file is for your architecture? Let us assume the download site uses a naming scheme that you have never seen before, so you have no idea which file will run on your architecture. – Ole Tange May 02 '20 at 00:17
  • Arturo, how much time are you willing to spend next year when your brand new computer doesn’t run half the software on your old computer? – gnasher729 Jun 21 '22 at 21:55
11

A fat binary approach makes most sense if:

  1. Both architectures coexist on the same system
  2. Everything else is more or less the same for all architectures

That's why they are not used for cross-platform code (neither criterion applies), or to support different Linux distributions with one binary (criterion 1 doesn't apply; criterion 2 applies to a certain degree).

On Linux, both criteria would still apply if you want to support both 32 and 64 bit on a single Linux distribution. But why bother, if you already have to support multiple distributions?
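
As a small illustration of the 32/64-bit case: a minimal C sketch, assuming GCC on x86 Linux and building the same source once with -m32 and once with -m64. The two builds see different predefined macros and use different pointer sizes, so they are distinct binaries that a fat format would merely bundle together.

    /* Minimal sketch: the same source yields a different binary per target.
       Assumes GCC on x86 Linux; build with "gcc -m32" and with "gcc -m64". */
    #include <stdio.h>

    int main(void)
    {
    #if defined(__x86_64__)
        puts("compiled for x86-64");
    #elif defined(__i386__)
        puts("compiled for 32-bit x86");
    #else
        puts("compiled for some other architecture");
    #endif
        printf("pointer size: %zu bytes\n", sizeof(void *));
        return 0;
    }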

On Windows, the transition from 16 bit to 32 bit initially happened with the introduction of Windows NT, which was a major deviation from the 16-bit Windows world in many regards (virtual memory, multi-user access control, API changes...). With all these changes, it was better to keep the 32-bit and 16-bit worlds separate. NT already had the concept of "subsystems" supporting different OS "personae" (Win32, POSIX), so making Win16 a third subsystem was a straightforward choice.

The Win32 to Win64 transition didn't involve similarly major changes, but Microsoft used the same approach anyway, probably because it was tried and proven.
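
To make that concrete: on 64-bit Windows, 32-bit programs run inside the WOW64 compatibility layer rather than being merged with 64-bit code into one fat binary. A minimal sketch (assuming a Windows C toolchain such as MSVC or MinGW-w64) that asks which world it is running in:

    /* Minimal sketch, assuming Windows (MSVC or MinGW-w64): a 32-bit build of
       this program reports "yes" on 64-bit Windows because it runs inside
       the WOW64 emulation layer; a native 64-bit build reports "no". */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        BOOL isWow64 = FALSE;
        if (IsWow64Process(GetCurrentProcess(), &isWow64))
            printf("Running under WOW64: %s\n", isWow64 ? "yes" : "no");
        else
            fprintf(stderr, "IsWow64Process failed: %lu\n", GetLastError());
        return 0;
    }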

oefe
3

Part of the reason why fat binaries did not succeed is that there is more than the ABI & processor (actually, instruction set) specification that can invalidate a binary executable. A binary executable often depends a lot on other resources, in particular dynamic libraries (see DLL hell), external services (think of a DBMS like PostgreSQL...), system configuration (e.g. the location of configuration files under /etc/ on Linux), etc.
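
As a hedged illustration of that kind of coupling (the library name and configuration path below are hypothetical), here is the sort of host-specific assumption a binary typically bakes in, which no multi-architecture packaging can satisfy by itself:

    /* Sketch of runtime dependencies that travel with a binary but are not
       contained in it.  "libexample.so.1" and "/etc/example/app.conf" are
       hypothetical names; on Linux, older glibc needs linking with -ldl. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* Depends on a shared library installed on the host system... */
        void *handle = dlopen("libexample.so.1", RTLD_NOW);
        if (!handle) {
            fprintf(stderr, "missing dependency: %s\n", dlerror());
            return 1;
        }

        /* ...and on an OS-specific configuration location. */
        FILE *cfg = fopen("/etc/example/app.conf", "r");
        if (!cfg) {
            perror("configuration not found");
            dlclose(handle);
            return 1;
        }

        fclose(cfg);
        dlclose(handle);
        return 0;
    }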

Just for Linux/x86-64, it is in practice difficult to make a binary executable that runs on every Linux distribution (because it is often tied to specific versions of libc or libstdc++). FatELF exists but was never very successful.
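
For example, a binary's coupling to the C library is easy to observe at run time. The sketch below uses the glibc-specific gnu_get_libc_version() call; the fact that the header only exists on glibc systems (not, say, musl-based distributions) is itself part of the portability problem.

    /* Prints the glibc version this process is running against.
       glibc-specific: <gnu/libc-version.h> is not available with musl or
       other C libraries. */
    #include <gnu/libc-version.h>
    #include <stdio.h>

    int main(void)
    {
        printf("glibc runtime version: %s\n", gnu_get_libc_version());
        return 0;
    }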

Even with a well-defined ABI and instruction set, the optimal generated code differs across processor brands and models (see the -mtune=native x86 optimization flag of GCC).
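
A rough sketch of what that means in practice: the same trivial source compiles to different machine code depending on the tuning flags (the build commands in the comment are illustrative, not prescriptive), so even "Linux/x86-64" is not a single optimal target.

    /* The same source, tuned differently (example GCC invocations):
         gcc -O2 -march=x86-64 sum.c                # generic x86-64 baseline
         gcc -O2 -march=native -mtune=native sum.c  # tuned for the build CPU
       The natively tuned build may use instructions (e.g. AVX) that other
       x86-64 CPUs lack, making it faster here but less portable. */
    #include <stdio.h>
    #include <stddef.h>

    static double sum(const double *v, size_t n)
    {
        double total = 0.0;
        for (size_t i = 0; i < n; ++i)
            total += v[i];   /* GCC may vectorize this loop differently per target */
        return total;
    }

    int main(void)
    {
        double data[4] = {1.0, 2.0, 3.0, 4.0};
        printf("sum = %f\n", sum(data, 4));
        return 0;
    }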

Apple partly succeeded with fat binaries only because it provides a very closed ecosystem of computing resources.

Free software is another way to address your portability concern: if an application is free software (carefully coded for portability), it is fairly easy to port to similar systems. And even if the original source code does not work as intended on your system, you can usually adapt it (or pay someone to do the work) with reasonable effort (of course, free software tied to a particular OS, ABI, or processor is not easy to port, and you'll spend more effort on that). Standards like POSIX or the Linux Standard Base also help.

You could pay (or ask) someone to port some (free) software with available source code, but it is unrealistic to port a binary executable.

Also, several frameworks exist to help with porting to several operating systems (provided the source code is available), e.g. Qt & POCO.

Even using a well-specified bytecode like the JVM's is not always a guarantee of portability: some Java applications are well known not to be portable (e.g. because they expect a particular file hierarchy and naming).

BTW, computer systems are probably much less heterogeneous today than in the 1980s or early 1990s (or in the mainframe era).

Finally, fat binaries are fat: you will spend a lot of resources (build time, bandwidth, executable size) on a portability concern that might not matter to many people. Remember the aphorism: "there is no portable software, only software which has been ported" (to some particular systems).

Basile Starynkevitch
  • 4
    I dispute the notion that making software "free" also makes it easily portable. – Robert Harvey Mar 01 '15 at 15:22
  • 3
    It does not make software portable, but it does enable someone to spend effort on porting it to some other platform (which you cannot realistically do if you don't have access to, and the right to modify, the source code) – Basile Starynkevitch Mar 01 '15 at 15:22
  • 2
    @RobertHarvey there *may* be no fundamental link in principle, but there are several entirely pragmatic reasons why Free Software has spread very widely across so many platforms and created so many cross-platform build tools and tool chains. – itsbruce Mar 01 '15 at 15:31
  • A closed-source application that depends entirely on open-source, portable frameworks and libraries will be somewhat easier to port to other platforms by the closed-source application's proprietor. I say this from my first-hand experience of bliss after finding out that "coding to OpenCV style" is all you need (for image processing needs). Of course the GUI still requires other frameworks, such as Qt, etc (although I don't have first-hand experience of that). – rwong Mar 01 '15 at 15:57
  • 4
    @RobertHarvey It doesn't. Free software that is an absolute mess to port is still an absolute mess to port - you just increase the surface in which you _might_ catch someone who cares enough about using it on a particular architecture or OS to port it, or to suggest a path that makes porting it something that doesn't make people bleed. – Tim Post Mar 01 '15 at 16:36