20

The X Window System is 25 years old; it had its birthday yesterday (on the 15th).

As you are probably aware, one of its most important features is the separation of the server side and the client side, in a way that neither Microsoft's, Apple's, nor Wayland's windowing systems have.

Back in the day (sorry for the vague phrasing), many believed X would dominate over other ways of making windows because of this separation of server and client, which allows the application to be run on a server somewhere else while the user clicks and types on her own computer at home.

This use obviously still exists, but it is marginal at best. When we write and use programs that run on a server, we almost always use the web, with its HTML/CSS/JS.

Why did the web win, and X not? The technologies used for the web (the aforementioned HTML/CSS/JS) are a mess. Combined with all the back-end frameworks (Rails, Django, and the rest), it really is a jungle to navigate through. Still, the web thrives with creativity and progress, while remote X apps do not.

  • 7
    The two are not even remotely comparable. X-server connections allow me to run a remote application and view its GUI locally, which is a *completely* different use case from allowing me to load remote resources into a local client. – Martijn Pieters Sep 16 '12 at 11:27
  • 3
    I don't agree that there is a difference. When I connect my web client (browser) to a server (local or remote), I can view the GUI of my (web) app, just as I can view the GUI of my app with an X session. – Martin Josefsson Sep 16 '12 at 11:33
  • 4
    Try writing an X11 program and compare it with an HTML page - also compare the bandwidth needed. Additionally, the WWW did not replace X11; it replaced Gopher. –  Sep 16 '12 at 11:34
  • It's all resources loaded from the remote server. It runs these resources in a local client on *your* CPU. X-Windows clients run on a remote server. – Martijn Pieters Sep 16 '12 at 11:35
  • 2
    Pieters: Sure, the page is rendered on the client, and JS runs on the client, but those are merely technicalities. More often than not, code runs on the server side (PHP, Java, .NET, Python, Ruby, whatever). In practice they are both interfaces for apps that run on a server and are shown on a client. X and the web do it in different ways, but that is the gist of it. – Martin Josefsson Sep 16 '12 at 11:39
  • No, it's not. X does not give you the *option* of running code on the X server. Thus the scalability and performance picture is completely different. – Martijn Pieters Sep 16 '12 at 11:48
  • 1
    @Thorbjørn Ravn Andersen: Writing a web app is not only making an HTML document. You have to learn at least one of the back-end frameworks and languages, plus JS, HTML, and CSS. And then you have to dive into AJAX to make it feel quick, and learn all sorts of magic tricks. It's not super-easy, nice web vs. super-hard, impossible X. And they do/did compete in the same space: apps that run on the server but are viewed on the client. – Martin Josefsson Sep 16 '12 at 11:49
  • @Martijn Pieters: 'Thus the scalability and performance picture is completely different' there is an answer, thank you! – Martin Josefsson Sep 16 '12 at 11:51
  • 1
    @MartinJosefsson: No one considered the two as competing; they have different use cases (run a process on a remote server while allowing you to show its GUI on your own machine) vs. information sharing (the *original* use case for HTTP). The architectures for both technologies spring from those use cases. It's like asking why no one is using spreadsheets for word processing. – Martijn Pieters Sep 16 '12 at 11:56
  • X creates complexity and gives us decreased performance, even if deployed locally. – Badar Sep 16 '12 at 12:00
  • 15
    Because the technology did not pass validation by the adult entertainment industry, a required step in mainstream adoption of a technology (that's a fancy way of saying that pictures of naked women did not become available on X systems quickly enough). – Sergey Kalinichenko Sep 16 '12 at 12:02
  • 1
    @MartijnPieters Or in other words, why did RDP win the space of remotely viewable desktops? I know the two are different technologies that work in different ways, but that's effectively a much better comparison in terms of usage. – gbjbaanb Sep 16 '12 at 12:09
  • Badar: Well, that is true for the web. My HTML/CSS/JS app that I run on Apache on localhost runs slower than the same program written in C or Ruby. But performance is not the deal-breaker here, obviously. If it were, we would not use graphical interfaces at all. – Martin Josefsson Sep 16 '12 at 12:14
  • @MartijnPieters: I absolutely get your point, and it is no doubt valid. Information sharing may have been what we used the web for back in 1993, but X could do remote apps before that, and nowadays the web is more of an interface for remote apps than a way of merely publishing documents. So today they both compete in the same field, even if they originally did not. – Martin Josefsson Sep 16 '12 at 12:17
  • @Badar: "X creates complexity and gives us decreased performance" - decreased compared to what? Web pages? How do you measure that? – Bryan Oakley Sep 16 '12 at 12:19
  • @MartijnPieters Also, isn't using a markup language as an app interface kind of the same as doing word processing in a spreadsheet? It obviously works, but it is a tad strange. – Martin Josefsson Sep 16 '12 at 12:20
  • @MartinJosefsson Today, yes. Then, no. –  Sep 16 '12 at 12:25
  • @BryanOakley: "The X Window System, like the Aqua and Aero Glass GUI environments, tries to do too many things within one unified package. X.Org is still about thirteen million lines of code, though. This creates significant complexity, which in turn masks security issues." – Badar Sep 16 '12 at 12:31
  • @gbjbaanb because it works well with Windows? –  Sep 16 '12 at 12:39
  • @MartinJosefsson have you actually TRIED writing an X11 program? With the toolkits available in the mid-'90s? –  Sep 16 '12 at 13:29

6 Answers

23

It seems utterly obvious and fundamental now, but the killer innovation of the world wide web was the hyperlink. Even if X wasn't completely unusable over a modem link, its inability to launch a completely new process on a completely new server via a single click would hamper its adoption for that sort of use case.

Karl Bielefeldt
  • 146,727
  • 38
  • 279
  • 479
  • 1
    This may very well be the correct answer. Also, web clients ran on Apple and Microsoft OSes too. – Martin Josefsson Sep 16 '12 at 21:33
  • The hyperlink wasn't an innovation of the World Wide Web. It was implemented many times before, such as in Apple's HyperCard, a popular program in the 80s and 90s with many uncanny similarities to a web browser. The concept of hypertext and hyperlinks goes back to the 60s, with Project Xanadu, and it was implemented many times in many formats before Tim Berners-Lee finally created his own network-based implementation of hypertext at CERN in the early 90s. – Charles Salvia Sep 16 '12 at 23:04
  • 3
    @CharlesSalvia: The breakthrough of HTML hyperlinks was due to the URL. In particular its Universal aspect: global, with just enough central authority to work, and not tied to one specific media type or technology. Your previous technologies were far, far less universal. – MSalters Sep 17 '12 at 13:10
18

Because X requires you to have a CS degree to write an application, while the web requires only that you have the ability to type (not even that).

Especially in the early days, when the web was just HTML, you could open a terminal, build a working page in 10 minutes, and then interactively improve it with instant feedback. This low bar of entry spurred massive uptake. Building an X application, on the other hand, is a non-trivial task even for experienced programmers.
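
To make the contrast concrete, here is roughly what a bare-bones Xlib "hello world" looks like (a minimal sketch, assuming plain Xlib and compiling with cc hello.c -lX11; real programs, and the toolkits of the day, involved considerably more). The web equivalent was a one-line HTML file you could type from memory.

    /* Minimal sketch of a raw Xlib "hello world": connect to the display,
       create and map a window, then handle events by hand. */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);          /* connect to the X server */
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int screen = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 300, 100, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        XEvent ev;
        for (;;) {                                  /* hand-rolled event loop */
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)
                XDrawString(dpy, win, DefaultGC(dpy, screen),
                            20, 50, "Hello, world", 12);
            if (ev.type == KeyPress)
                break;                              /* any key press quits */
        }
        XCloseDisplay(dpy);
        return 0;
    }

Nothing here is hard for a working C programmer, but every line is a concept a beginner has to learn before anything appears on the screen.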

It has taken 10 years for the web to become a direct competitor to X applications in terms of functionality and to provide the same GUI-like abilities. This functionality was added to the language stack over time, allowing developers to master one set of features before the next was added; this gradual expansion of technological complexity kept the bar low (for people who were already in the field, and there are lots of them). Jumping into the field now is a lot harder than it was 10 years ago, but it is still possible, and the instant feedback of the web makes learning more gratifying (humans need quick gratification to reinforce their drives).

Cost is another driver. The actual cost of learning enough programming skills to develop an X application is significant, and on top of that, the need for servers to run your application on drove the cost up further. Learning enough HTML to get a "Hello World" page up and running cost practically nothing, and internet service providers offered free hosting to entice you into getting a web connection, so you could practice for free. When you eventually needed business hosting, the number of hosting companies had grown and the cost has always been relatively cheap.

Martin York
  • 11,150
  • 2
  • 42
  • 70
  • 1
    You assume that in order to write an app that is used over X you need to understand the X API. But just as you don't need to understand HTTP in order to write a web app, you don't need to understand X to write an app that runs under X. You could write it in the language you prefer and just have a GTK library on top. Way easier than learning HTML and CSS and JS and a server-side language. The gist of it: just as you don't need to write an HTTP server to publish a web site, you don't need to write an X server to serve an X app. – Martin Josefsson Sep 16 '12 at 17:59
  • I disagree with your analysis there, though you have a point in that writing a modern web application is now nearly as complex as writing an X application was 10 years ago. Writing an X application is still not a trivial process. It's just like writing a Windows application: well beyond the ability of anybody without significant programming experience. On the other hand, putting up an HTML page is trivial and can be done in 10 minutes (even by a beginner). That leads to quick reinforcement and the ability to experiment quickly, which makes for a much lower bar to entry. – Martin York Sep 16 '12 at 19:31
  • GTK wasn't available until well after the web was established. – user16764 Sep 16 '12 at 19:33
  • @user16764: That's not true. I was using GTK in 1997 (not sure when it first started, but before that). The web (as in HTML/HTTP) may have been around then, but it was hardly well established. I mean, web browsers were only just being brought into the mainstream in '92 (the first time I saw one). X had several window managers that were usable before that, though. I remember using twm (Tom's Window Manager, I believe) and one other slightly higher-level one (which I forget), but there were plenty to choose from (too many) in '90 (and they were available before that, I think). – Martin York Sep 16 '12 at 19:42
  • @LokiAstari: You're confusing Window Managers and GUI libs. While there's a definite overlap (GNOME/Gtk, KDE/Qt) they're certainly not identical. Even with window managers you still had worlds of pain. – MSalters Sep 17 '12 at 13:08
  • @MSalters: Yes, good point. With GTK it was easier, but still non-trivial; you still need to be an experienced programmer. So maybe only a single continent of pain. – Martin York Sep 17 '12 at 16:24
13

The answer is that "many technologies are adopted for arbitrary historical or socio-political reasons rather than technical reasons." The best solution for a given problem does not always become the dominant technology. (In fact, it rarely does.)

In 2012, when HTTP servers are being used to create interactive applications on par with Desktop applications, the comparison between HTTP and X is interesting. In hindsight, X is probably a better technology for developing rich, interactive, network-deployed applications. Interactive Desktop-like applications don't map well onto a stateless, document-oriented technology like HTTP, and this mismatch has historically resulted in all sorts of work-arounds (hacks) to create state, like cookies, sessions, etc.

But the original purpose of HTTP wasn't to develop stateful Desktop-like apps. It was to retrieve documents and display information - information which could link to other documents that could also be instantly displayed. The idea of a linked collection of documents goes way back to the 1960s with Theodore Nelson's "Project Xanadu". The Web was supposed to be an implementation of Nelson's concept of hypertext, which was an attempt to computerize the printed page - like the encyclopedia or the newspaper - allowing the user to instantly "jump" from one article to another with a single click.

Many iterations of this idea have come and gone, such as Apple's HyperCard, which implemented the concept of hypertext/hyperlinks but was never deployed over networks. The World Wide Web was CERN's network-based implementation of the concept of hypertext, and it likely took off because Tim Berners-Lee released his browser code library for free, allowing others to experiment with it. This eventually led to Marc Andreessen's Mosaic browser, the predecessor of Netscape. And the rest is history.


But... as with so many technologies, new possibilities began to emerge that the original designers of HTTP or hypertext didn't really think about too much. The web became commercialized and people started to develop websites that featured stateful interactivity, like shopping carts and logins. It became more and more apparent that the stateless and document-oriented nature of HTTP wasn't very well suited to Desktop-like applications. But at that point, it was just too late. Everyone was already using HTTP. So, here we are today, with various hacky AJAX applications trying their best to pretend they are Desktop apps.

Charles Salvia
  • 7,342
  • 1
  • 35
  • 33
4

The technologies might try to solve similar problems now, but they sure didn't in the past.

The current HTML stack evolved over time from really simple text-document transfer, through "visual" documents with a little scripting, into a full-featured application platform.

At the time HTML began, no one could dream of connecting to a remote computer and running graphical applications there. Only after the internet got better latency and throughput did this become possible. Yet by that time, HTML was already ever-present. Everyone knew it was the way to give customers and users access to graphical applications running on a remote machine.

And as with every "free" system, it became impossible to "reset" the whole thing and start anew to do it better this time. That's why we need to shut up and use HTML/CSS/JS, and can only wish that the people supporting it will finally wise up and bury it along with its years-long legacy.

This answers the question "Why did the web win?" There wasn't any competition; the web had won before everything even started.

Euphoric
  • 36,735
  • 6
  • 78
  • 110
  • 1
    At the time when HTML began, there was already server-side computing, with the NCSA HTTP server and its CGI. Most applications delivered text, but I remember one that was able to render B/W custom maps, an ancestor of Google Maps. – mouviciel Sep 16 '12 at 13:01
  • [Image maps](http://en.wikipedia.org/wiki/Image_map) indeed date back to the early years of the last decade of the previous century. – MSalters Sep 17 '12 at 13:14
1

I agree that, in principle, the two are similar. If you ask the question "how can we run code on a server but provide visualization on a remote client?", it's reasonable to think that independent teams could come up with either solution.

I suspect the reason one is more popular than the other in certain respects is that the two approach the same problem from completely different perspectives. X is a technical solution to a technical problem, but the web evolved out of the need to solve a social problem -- how can I take resources from a remote server and display them on my local machine, in a way that is easy and convenient?

The web "won" because it solved a problem experienced by more people. Think of a car analogy: both luxury sedans and trucks ostensibly solve the same problem: how to transport something from one place to another.

The truck solved the technical problem of literally how to haul something from point A to point B, and for that it works quite well. The passenger car evolved out of a need for people to be comfortable as they travel, and to carry more people and less manure. It became a necessity that required conveniences. Thus, over time, the number of passenger cars came to far, far outnumber the pickup trucks on the road (I'm guessing, based on observation of Chicago traffic; maybe it's different in Texas? :-)

So, like the car/truck analogy, the web and X11 both arguably solve the same technical problem, but they serve completely separate purposes.

Bryan Oakley
  • 25,192
  • 5
  • 64
  • 89
1

You are comparing apples to pears. The X Window System is about separating the rendering of screen content onto a local display (the X server), which could be connected by a thin wire to the source of the content. It's really an extension of the computational model of the "glass tty" era to the domain of high-quality graphics. X originated in an era when PCs were still pretty wimpy, and most of the real computation was done on Unix or mainframe boxes. The idea was to harness the power of relatively cheap "X terminals" and relatively slow networks to make these serious computational resources available graphically.
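
As a minimal sketch of that separation (the host name "xterm1" below is hypothetical): an application chooses where its output appears simply by which display it opens, and every drawing request then travels over the wire to the X server on that machine while the computation stays where the program runs.

    /* Minimal sketch: which machine shows the GUI is just a connection
       string; the computation itself stays on the box running this code. */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        /* NULL would mean "use $DISPLAY"; "xterm1:0" is a hypothetical X terminal. */
        Display *dpy = XOpenDisplay("xterm1:0");
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        /* From here on, every rendering request goes over the "thin wire"
           to the X server running on that X terminal. */
        XCloseDisplay(dpy);
        return 0;
    }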

The reason this didn't win on Macs and PCs is that their development was always driven by the desire to support high-end graphics in local applications, including games, editors, and business graphics. Supporting network-resident applications is a recent afterthought.

ddyer
  • 4,060
  • 15
  • 18