Why are programs not distributed in compiled format?

Solution 1

Let's analyse the factors...

Analysis:

PLATFORM DEPENDENCIES: Several issues arise in an environment where developers create and maintain several architecture-specific variants of an application:

  • Different source code is required for different variants — Different UNIX-based operating systems may use different functions to implement the same task (for example, strchr(3) vs. index(3)). Likewise, it may be necessary to include different header files for different variants (for example, string.h vs. strings.h).

  • Different build procedures are required for different variants — The build procedures for different platforms vary. The differences might involve such particulars as compiler locations, compiler options, and libraries.

  • Builds for different variants must be kept separate — Since there is a single source tree, care must be taken to ensure that object modules and executables for one architecture do not become confused with those for other architectures. For example, the link editor must not try to create an IRIX–5 executable using an object module that was built for SunOS–4.

  • Every operating system has its own linking and loading conventions, so the ELF (Executable and Linkable Format) file must be prepared to match what the target system expects.

  • The compiler generates machine code, a sequence of instructions, and distinct architectures have different instruction sets (see any comparison of instruction set architectures). So the compiler's output is different for each architecture (e.g. x86, x86-64, ARM, ARM64, IBM Power ISA, PowerPC, Motorola 6800, MOS Technology 6502, and many others).
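
A minimal sketch of that last point (not part of the original answer): the same trivial C source file produces different, mutually incompatible ELF binaries depending on the target architecture. It assumes gcc and the arm-linux-gnueabihf-gcc cross-compiler are installed; the exact output of file varies by system:

    # create a trivial, platform-independent C source file
    cat > hello.c <<'EOF'
    int main(void) { return 0; }
    EOF

    gcc -o hello-x86_64 hello.c                    # native build (here: x86-64)
    arm-linux-gnueabihf-gcc -o hello-arm hello.c   # cross-build for 32-bit ARM

    # same source, two incompatible executables
    file hello-x86_64   # ELF 64-bit LSB executable, x86-64 ...
    file hello-arm      # ELF 32-bit LSB executable, ARM ...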

SECURITY:

  • If you download a binary, you cannot be sure that it does what it says it does, whereas you can try to audit the source code and use a self-compiled binary on your system. That said, the user Techmag made a good point in his comment: auditing the code requires knowledgeable and competent coders, and it is not a guarantee of safety.

MARKET: There are a lot of factors in this area, but I'll try to summarize them:

  • Not every company aims to reach all the platforms; it depends on the market, on how popular each platform is, and on what they want to sell.

  • Free software has the spirit of making software as widely available as possible, but that doesn't imply the software is designed for every platform; it depends on the community that supports it.

Conclusion:

Not every piece of software is designed for every platform. Providing binaries for all architectures and platforms means compiling, testing, and maintaining it on each of them. That is extra work that is sometimes simply too expensive, and it can be avoided if users compile the software on their own platform. It also means users know exactly what they are executing.

Solution 2

There is such a variety of platforms and software environments, both *nix and otherwise, on which the software might run, that letting you build the application (or a library to use with applications) is the only realistic way to support as many combinations of those components as a "good" software item should. Of course, licences such as the GPL require the source code to be available, so even if the software doesn't work properly it is usually possible (though it may be tricky to work out what is wrong and how to fix it) for the user or some third party to dive in and correct it, even if the creator won't, can't, or no longer exists to do so.

Distributing software as source code also allows independent verification that the software does what it claims to and doesn't do something nasty instead of, or in addition to, that. This means you have to place less blind trust in the creator, which in turn actually enhances the confidence you can have in the software!

Solution 3

First, your question is based on a flawed premise. Programs are distributed in compiled format!

The normal way to install software on Ubuntu, like on most other Linux distributions, and more generally on most Unix variants, is to install a package. On Ubuntu, you open the software center or some other package manager and browse the available software. When you select a package for installation, the binaries (if the package contains a program) are downloaded and installed on your machine.
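
As a quick illustration (not part of the original answer), the command-line equivalent of the software center is a single apt call; firefox below is just a stand-in for any packaged program:

    sudo apt update           # refresh the package lists
    sudo apt install firefox  # download and install the pre-built binaries and their dependencies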

By default, the package manager offers you packages made by the distribution maintainers. You can also find third-party sources of packages; Ubuntu offers PPA as a standardized way for third parties to offer packages.
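
Adding a PPA typically looks like the sketch below; the PPA and package names are purely illustrative, not real ones:

    sudo add-apt-repository ppa:some-team/some-app   # hypothetical PPA name
    sudo apt update
    sudo apt install some-app                        # hypothetical package name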

Downloading the software from the author in compiled form is a last resort. You only need to do that if the software isn’t popular enough to be packaged, or if you absolutely need the latest version which hasn’t been packaged. Most people never need to do this.

When software isn’t packaged for a distribution, it’s often distributed in source form rather than in binary form. There are two main reasons why this happens often in the Linux world, but rarely in the Windows world. One reason is that the proportion of open source programs on Linux is much higher. Obviously, if a program’s source code is not available, the only form of distribution is a binary.

The other reason is that the Linux world is much more diverse. Different binaries are needed for each set of incompatible library versions, which often means different binaries for each version of each distribution. Windows “solves” this by having each package author distribute the libraries they use along with the program (consequence: your computer stores many copies of each library, one per program that uses it; if a bug is fixed in a library, each program that uses it has to ship an update), and by releasing a new version of the operating system only every three years or so. Unix has much more diversity and much more timely bug fixing habits, and solves the library distribution issues by building different binaries for different distributions.
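
You can see this library coupling directly with ldd, which lists the shared libraries a binary was linked against. A small sketch; the exact libraries, paths, and version numbers vary between distributions and releases, which is precisely why one binary rarely fits all:

    ldd /bin/ls
    # => linux-vdso.so.1, libselinux.so.1, libc.so.6, ...
    #    (paths and version numbers differ from one distribution and release to the next)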

Solution 4

The original reason for distribution as source certainly was platform diversity; the Linux community has continued that methodology both for that and for new, partially political, reasons.

Unlike e.g. Windows, Linux has historically never bothered to keep any ABI (application binary interface) stable across long periods of time - keeping open the possibility of innovating on aspects like executable formats, library APIs, and support for new hardware platforms was/is considered more important.

Commercial operating systems achieve long-term application compatibility by being very disciplined about innovation; a new feature/software interface always needs to be added IN ADDITION to an old one - requiring two things to be maintained, and the price of changing anything after release needs to be considered very high. Alternatively, you can embrace the fact of planned application obsolescence together with anyone writing software for your OS (this is not hinting at MS but another OS vendor).

Achieving a long-term stable platform for software distributed in binary-only form (outside of a given Linux distribution) would even be considered undesirable by some elements of the Linux community. As an unapologetic user of both platforms, I am not saying that is good or bad; it is what it is.

Solution 5

Linux runs on more than just one particular CPU platform. If you distributed ELF files (or any other kind of raw executable), there'd be a chance that some versions of Linux couldn't run the software. In the spirit of making software as widely available as possible, using the source code is preferred. For example, Linux runs on SPARC, Intel, AMD, ARM, and other types of processors.

If the ELF file specifically targeted Intel processors, for example, then other types of hardware couldn't run the software. ELF itself is platform independent, but the code it carries must conform to a platform's machine code. You'll notice that many distributions offer similar packages (e.g. _386 and _586 packages where different processors are supported) - you have to install the correct ELF file to get the correct operation.
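
A quick way to check which instruction set your machine and a given ELF file target, assuming the standard GNU/Debian tools are installed (the binary inspected below is just an example):

    uname -m                            # machine architecture, e.g. x86_64 or armv7l
    dpkg --print-architecture           # architecture the package manager installs for (Debian/Ubuntu)
    readelf -h /bin/ls | grep Machine   # target ISA recorded in the ELF header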

Similarly, if I decide to build a custom Linux version that uses different interrupts, linkers, etc., then I still need the source code in order to compile it. Even if the source code doesn't have platform-specific build instructions, each platform is different and may not run an ELF from a different system.

Comments

  • Akanksha
    Akanksha almost 2 years

    But they give instructions like

    cd downloaded_program
    ./configure
    make install
    

    This creates the ELF that is needed, and probably some .so files.

    Why not put those inside a zip file for download, like with windows apps? Is there any reason why they need to be compiled by the user?

    • mikeserv
      mikeserv over 8 years
      that is how source code is distributed. you tagged this with ubuntu - have you tried the apt stuff?
    • Knud Larsen
      Knud Larsen over 8 years
      Ubuntu : The 40,000 most common programs : $ sudo apt-get install [name] . The more rare software : Some must be built from source with {cmake .. && make, ./configure && make, waf, scons, etc. ~10 build options}.
    • Knud Larsen
      Knud Larsen over 8 years
      You have three Windows© versions, and ~100 "Linux OS" versions. Impossible to maintain and store more than the (40,000) most common programs.
    • Alessio
      Alessio over 8 years
      this question is just wrong. most software IS distributed in binary format, typically in .rpm or .deb or .tgz packages. The source is also distributed for those who want to compile it themselves or examine it or modify it, or package it for one or more distros. Nobody uses .zip for distributing binaries on linux because .zip files do not support essential information such as user, group, and permissions for the files they contain.
    • user2338816
      user2338816 over 8 years
      I have yet to need to compile any Linux program that I've wanted to run. Executables have always been available... so far.
    • phyrfox
      phyrfox over 8 years
      @cas Because they've already compiled the source on your behalf for known platforms. But, say, you're running Linux on your Xbox One/PS4/Wii U/whatever. Those may be unexpected platforms for the distro, and as such, the deb/rpm/tgz files may not be enough. If the distro expects your hardware profile to be common, it's pre-compiled, and if not... time to compile it yourself.
    • Ludwig Schulze
      Ludwig Schulze over 8 years
      @phyrfox well, I don't think there's a platform that Linux can run on that doesn't have a distro to match... and after that it is up to each distro to make the most needed/demanded packages easily accessible.
    • don bright
      don bright over 8 years
      Kernels are not binary compatible with each other, despite Linus' recent efforts. You can get a segfault if you run an ELF built on a newer kernel on an older kernel. Also "make" typically uses a lot of shared libs. (run the ldd program on the generated executable to see all the shared object dependencies). Those shared libs are usually different versions on different linux machines, so the executable won't be compatible. Even if you statically link, the kernels (as noted) aren't compatible so there ya have it. Some game programmers have figured out solutions, but they aren't easy.
  • phyrfox
    phyrfox over 8 years
    Windows depends on the underlying architecture as well. ARM-enabled Windows apps don't run on regular laptops/desktops, and so on. That's the main reason why Linux has much better hardware support - because code is written to be compiled on any platform that has a sane Linux implementation, while Windows depends on known types of hardware being present.
  • mchid
    mchid over 8 years
    Which is exactly why you'd probably be running 32-bit Firefox, just like many other programs, on a "64-bit" Windows OS, whereas 64-bit Linux usually runs a 64-bit application.
  • slebetman
    slebetman over 8 years
    @Sobrique: Or even mention processor model differences per se - 10 years ago there used to be more than the 2 types we have today, and Linux ran on almost all of them (I myself ran Linux on PowerPC). It's still partly relevant today with x86, AMD64 (otherwise known as x86-64), and ARM. MIPS is still quite popular today among people who can manufacture their own chips because it's completely patent free by now.
  • Facundo Victor
    Facundo Victor over 8 years
    Sorry for the delay! Thank you both! I added some references to CPU architectures, and I also added a link to a comparison list. I don't want the answer to get too big. But yes, it's very relevant!
  • Techmag
    Techmag over 8 years
    The security comment implies that the reader/installer knows how to read and understand the code. Given that the Shellshock vulnerability survived decades without being noticed, I would respectfully suggest that is a somewhat false belief. It does allow knowledgeable and competent coders to assess code, yes, but it's not as much of a real security deterrent as advertised. It might actually have the opposite effect. State-funded and organized-crime-funded hackers are likely contributing code to all manner of open source libraries/projects these days in the hope of planting the next Shellshock opening...
  • Facundo Victor
    Facundo Victor over 8 years
    You are right! I modified the answer to reflect that. I tried not to lose the focus on the main objective of the answer. Thank you Techmag!
  • ctrl-alt-delor
    ctrl-alt-delor over 8 years
    @Sobrique Where did you get that 95% figure from? Last time I looked there were 4 ARM CPUs for every 1 x86. That was a few years back, before everyone started using smart-phones with ARM processors. So if we assume that there are no other processors, that is 20%.