Is it that DirectX is easier or better than OpenGL, even if OpenGL is cross-platform? Why do we not see really powerful games for Linux like there are for Windows?
-
55False premise - where is evidence that game developers prefer windows? I think @user17674 had it more accurately - they want to develop for platforms so they can sell more and make more money:-) – Rory Alsop Mar 22 '11 at 00:04
-
258Virus developers also prefer Windows. It's all about the user base. – CesarGon Mar 22 '11 at 07:45
-
4Why Windows programmers prefer DirectX over OpenGL is probably a more interesting question historically. As the link to the Carmack interview below might indicate, DX used to be loathed. It would be interesting to know how it kept enough users for MS to support it until Xbox and modern rewrites made it more popular. Or maybe it was just always good enough so folks stuck with it. – CodexArcanum Mar 22 '11 at 21:14
-
100Why do people rob banks? *Because thats where the money is*. – GrandmasterB Jun 29 '11 at 19:27
-
7@CesarGon: It is not only because of the user base but also for the ease of development. – Giorgio Aug 02 '12 at 17:15
-
3This [blog article](http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX) discusses why they (a company) use OpenGL instead of DirectX, it's a must read. – Vicfred Mar 21 '11 at 23:12
-
2There was recently an [article](http://www.bit-tech.net/news/gaming/2011/03/11/carmack-directx-better-opengl/1) regarding this topic with [John Carmack](http://en.wikipedia.org/wiki/John_carmack). – yojimbo87 Mar 21 '11 at 22:54
-
@Giorgio: Granted; great development tools and reasonable APIs are important too. – CesarGon Aug 02 '12 at 20:33
-
As I understand it, there are two reasons: (1) Most game devs are primarily targeting Windows, and so rationalize there is no reason to use a cross-platform API. (2) OpenGL [may be faster](http://blogs.valvesoftware.com/linux/faster-zombies/) and [more powerful](http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX), but I recall hearing that the DirectX API is a lot cleaner and easier to use (no link). I, personally, prefer OpenGL. – fouric Jan 28 '13 at 18:01
-
@InkBlend the question is why you would limit yourself to Windows if there is a faster, more powerful and cross-platform API, which is OpenGL. Wouldn't it make more sense to use it and support all platforms? – M.Sameer Jan 28 '13 at 19:39
-
That OpenGL may be faster is in dispute - some recent benchmarks (see e.g. http://www.g-truc.net/post-0547.html) show that Valve's results may be an isolated case and the opposite may in fact be the case. I'd also consider that Wolfire blog post to be discredited as it contains many factual inaccuracies (I'd even go so far as to call them outright untruths as the author should have been aware of these things) some of which were subsequently retracted. – Maximus Minimus Apr 03 '13 at 00:34
14 Answers
Many of the answers here are really, really good. But the OpenGL and Direct3D (D3D) issue should probably be addressed. And that requires... a history lesson.
And before we begin, I know far more about OpenGL than I do about Direct3D. I've never written a line of D3D code in my life, and I've written tutorials on OpenGL. So what I'm about to say isn't a question of bias. It is simply a matter of history.
Birth of Conflict
One day, sometime in the early 90's, Microsoft looked around. They saw the SNES and Sega Genesis being awesome, running lots of action games and such. And they saw DOS. Developers coded DOS games like console games: direct to the metal. Unlike consoles however, where a developer who made an SNES game knew what hardware the user would have, DOS developers had to write for multiple possible configurations. And this is rather harder than it sounds.
And Microsoft had a bigger problem: Windows. See, Windows wanted to own the hardware, unlike DOS which pretty much let developers do whatever. Owning the hardware is necessary in order to have cooperation between applications. Cooperation is exactly what game developers hate because it takes up precious hardware resources they could be using to be awesome.
In order to promote game development on Windows, Microsoft needed a uniform API that was low-level, ran on Windows without being slowed down by it, and most of all cross-hardware. A single API for all graphics, sound, and input hardware.
Thus, DirectX was born.
3D accelerators were born a few months later. And Microsoft ran into a spot of trouble. See, DirectDraw, the graphics component of DirectX, only dealt with 2D graphics: allocating graphics memory and doing bit-blits between different allocated sections of memory.
So Microsoft purchased a bit of middleware and fashioned it into Direct3D Version 3. It was universally reviled. And with good reason; looking at D3D v3 code is like staring into the Ark of the Covenant.
Old John Carmack at Id Software took one look at that trash and said, "Screw that!" and decided to write towards another API: OpenGL.
See, another part of the many-headed-beast that is Microsoft had been busy working with SGI on an OpenGL implementation for Windows. The idea here was to court developers of typical GL applications: workstation apps. CAD tools, modelling, that sort of thing. Games were the farthest thing from their mind. This was primarily a Windows NT thing, but Microsoft decided to add it to Win95 too.
As a way to entice workstation developers to Windows, Microsoft decided to try to bribe them with access to these newfangled 3D graphics cards. Microsoft implemented the Installable Client Driver protocol: a graphics card maker could override Microsoft's software OpenGL implementation with a hardware-based one. Code could automatically just use a hardware OpenGL implementation if one was available.
In the early days, consumer-level videocards did not have support for OpenGL though. That didn't stop Carmack from just porting Quake to OpenGL (GLQuake) on his SGI workstation. As we can read from the GLQuake readme:
Theoretically, glquake will run on any compliant OpenGL that supports the texture objects extensions, but unless it is very powerfull hardware that accelerates everything needed, the game play will not be acceptable. If it has to go through any software emulation paths, the performance will likely by well under one frame per second.
At this time (march ’97), the only standard opengl hardware that can play glquake reasonably is an intergraph realizm, which is a VERY expensive card. 3dlabs has been improving their performance significantly, but with the available drivers it still isn’t good enough to play. Some of the current 3dlabs drivers for glint and permedia boards can also crash NT when exiting from a full screen run, so I don’t recommend running glquake on 3dlabs hardware.
3dfx has provided an opengl32.dll that implements everything glquake needs, but it is not a full opengl implementation. Other opengl applications are very unlikely to work with it, so consider it basically a “glquake driver”.
This was the birth of the miniGL drivers. These evolved into full OpenGL implementations eventually, as hardware became powerful enough to implement most OpenGL functionality in hardware. nVidia was the first to offer a full OpenGL implementation. Many other vendors struggled, which is one reason why developers preferred Direct3D: it was compatible with a wider range of hardware. Eventually only nVidia and ATI (now AMD) remained, and both had a good OpenGL implementation.
OpenGL Ascendant
Thus the stage is set: Direct3D vs. OpenGL. It's really an amazing story, considering how bad D3D v3 was.
The OpenGL Architectural Review Board (ARB) is the organization responsible for maintaining OpenGL. They issue a number of extensions, maintain the extension repository, and create new versions of the API. The ARB is a committee made of many of the graphics industry players, as well as some OS makers. Apple and Microsoft have at various times been a member of the ARB.
3Dfx comes out with the Voodoo2. This is the first hardware that can do multitexturing, which is something that OpenGL couldn't do before. While 3Dfx was strongly against OpenGL, NVIDIA, makers of the next multitexturing graphics chip (the TNT1), loved it. So the ARB issued an extension: GL_ARB_multitexture, which would allow access to multitexturing.
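For the curious, here is a rough, purely illustrative sketch (not from the original answer) of what that extension exposed to applications: multiple texture units usable in a single pass, through ARB-suffixed entry points that had to be fetched from the driver at runtime. It assumes an extension loader such as GLEW has already been initialized and that the texture IDs passed in already exist.

```c
#include <GL/glew.h>  /* provides the ARB_multitexture entry points once glewInit() has run */

/* Apply a base texture and a lightmap to one triangle in a single pass. */
void draw_lightmapped_triangle(GLuint base_tex, GLuint lightmap_tex)
{
    glActiveTextureARB(GL_TEXTURE0_ARB);        /* unit 0: base texture */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, base_tex);

    glActiveTextureARB(GL_TEXTURE1_ARB);        /* unit 1: lightmap, blended in the same pass */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, lightmap_tex);

    glBegin(GL_TRIANGLES);
    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 0.0f, 0.0f);  /* texture coords for unit 0 */
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 0.0f, 0.0f);  /* texture coords for unit 1 */
    glVertex3f(-1.0f, -1.0f, 0.0f);

    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 1.0f, 0.0f);
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 1.0f, 0.0f);
    glVertex3f(1.0f, -1.0f, 0.0f);

    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, 0.5f, 1.0f);
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, 0.5f, 1.0f);
    glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();
}
```

Without the extension, that lightmap would have required a second pass over the same geometry with blending enabled.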
Meanwhile, Direct3D v5 comes out. Now, D3D has become an actual API, rather than something a cat might vomit up. The problem? No multitexturing.
Oops.
Now, that one wouldn't hurt nearly as much as it should have, because people didn't use multitexturing much. Not directly. Multitexturing hurt performance quite a bit, and in many cases it wasn't worth it compared to multi-passing. And of course, game developers love to ensure that their games work on older hardware, which didn't have multitexturing, so many games shipped without it.
D3D was thus given a reprieve.
Time passes and NVIDIA deploys the GeForce 256 (not GeForce GT-250; the very first GeForce), pretty much ending competition in graphics cards for the next two years. The main selling point is the ability to do vertex transform and lighting (T&L) in hardware. Not only that, NVIDIA loved OpenGL so much that their T&L engine effectively was OpenGL. Almost literally; as I understand, some of their registers actually took OpenGL enumerators directly as values.
Direct3D v6 comes out. Multitexture at last but... no hardware T&L. OpenGL had always had a T&L pipeline, even though before the 256 it was implemented in software. So it was very easy for NVIDIA to just convert their software implementation to a hardware solution. It wouldn't be until D3D v7 that D3D finally had hardware T&L support.
Dawn of Shaders, Twilight of OpenGL
Then, GeForce 3 came out. And a lot of things happened at the same time.
Microsoft had decided that they weren't going to be late again. So instead of looking at what NVIDIA was doing and then copying it after the fact, they took the astonishing position of going to them and talking to them. And then they fell in love and had a little console together.
A messy divorce ensued later. But that's for another time.
What this meant for the PC was that GeForce 3 came out simultaneously with D3D v8. And it's not hard to see how GeForce 3 influenced D3D 8's shaders. The pixel shaders of Shader Model 1.0 were extremely specific to NVIDIA's hardware. There was no attempt made whatsoever at abstracting NVIDIA's hardware; SM 1.0 was just whatever the GeForce 3 did.
When ATI started to jump into the performance graphics card race with the Radeon 8500, there was a problem. The 8500's pixel processing pipeline was more powerful than NVIDIA's stuff. So Microsoft issued Shader Model 1.1, which basically was "Whatever the 8500 does."
That may sound like a failure on D3D's part. But failure and success are matters of degrees. And epic failure was happening in OpenGL-land.
NVIDIA loved OpenGL, so when GeForce 3 hit, they released a slew of OpenGL extensions. Proprietary OpenGL extensions: NVIDIA-only. Naturally, when the 8500 showed up, it couldn't use any of them.
See, at least in D3D 8 land, you could run your SM 1.0 shaders on ATI hardware. Sure, you had to write new shaders to take advantage of the 8500's coolness, but at least your code worked.
In order to have shaders of any kind on Radeon 8500 in OpenGL, ATI had to write a number of OpenGL extensions. Proprietary OpenGL extensions: ATI-only. So you needed an NVIDIA codepath and an ATI codepath, just to have shaders at all.
Now, you might ask, "Where was the OpenGL ARB, whose job it was to keep OpenGL current?" Where many committees often end up: off being stupid.
See, I mentioned ARB_multitexture above because it factors deeply into all of this. The ARB seemed (from an outsider's perspective) to want to avoid the idea of shaders altogether. They figured that if they slapped enough configurability onto the fixed-function pipeline, they could equal the ability of a shader pipeline.
So the ARB released extension after extension. Every extension with the words "texture_env" in it was yet another attempt to patch this aging design. Check the registry: between ARB and EXT extensions, there were eight of these extensions made. Many were promoted to OpenGL core versions.
Microsoft was a part of the ARB at this time; they left around the time D3D 9 hit. So it is entirely possible that they were working to sabotage OpenGL in some way. I personally doubt this theory for two reasons. One, they would have had to get help from other ARB members to do that, since each member only gets one vote. And most importantly two, the ARB didn't need Microsoft's help to screw things up. We'll see further evidence of that.
Eventually the ARB, likely under threat from both ATI and NVIDIA (both active members), pulled their head out long enough to provide actual assembly-style shaders.
Want something even stupider?
Hardware T&L. Something OpenGL had first. Well, it's interesting. To get the maximum possible performance from hardware T&L, you need to store your vertex data on the GPU. After all, it's the GPU that actually wants to use your vertex data.
In D3D v7, Microsoft introduced the concept of Vertex Buffers. These are allocated swaths of GPU memory for storing vertex data.
Want to know when OpenGL got their equivalent of this? Oh, NVIDIA, being a lover of all things OpenGL (so long as they are proprietary NVIDIA extensions), released the vertex array range extension when the GeForce 256 first hit. But when did the ARB decide to provide similar functionality?
Two years later. This was after they approved vertex and fragment shaders (pixel in D3D language). That's how long it took the ARB to develop a cross-platform solution for storing vertex data in GPU memory. Again, something that hardware T&L needs to achieve maximum performance.
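To make that concrete: what the ARB eventually standardized was ARB_vertex_buffer_object, later promoted to core in OpenGL 1.5. Below is a minimal, purely illustrative sketch of it in use, assuming a GL 1.5-capable context or an extension loader such as GLEW; the data layout and sizes are made up for the example.

```c
#include <GL/glew.h>

/* Upload an array of xyz positions into GPU-side memory once... */
GLuint create_vertex_buffer(const float *positions, int vertex_count)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER,
                 (GLsizeiptr)vertex_count * 3 * sizeof(float),
                 positions, GL_STATIC_DRAW);   /* data now lives where the T&L hardware wants it */
    return vbo;
}

/* ...then draw from it every frame without re-sending the vertices over the bus. */
void draw_vertex_buffer(GLuint vbo, int vertex_count)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const void *)0);  /* offset 0 into the bound buffer */
    glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```

Functionally this is the counterpart of the D3D v7 vertex buffers described above; OpenGL applications simply had to wait years longer for a vendor-neutral way to say it.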
One Language to Ruin Them All
So, the OpenGL development environment was fractured for a time. No cross-hardware shaders, no cross-hardware GPU vertex storage, while D3D users enjoyed both. Could it get worse?
You... you could say that. Enter 3D Labs.
Who are they, you might ask? They are a defunct company whom I consider to be the true killers of OpenGL. Sure, the ARB's general ineptness made OpenGL vulnerable when it should have been owning D3D. But 3D Labs is perhaps the single biggest reason to my mind for OpenGL's current market state. What could they have possibly done to cause that?
They designed the OpenGL Shading Language.
See, 3D Labs was a dying company. Their expensive GPUs were being marginalized by NVIDIA's increasing pressure on the workstation market. And unlike NVIDIA, 3D Labs did not have any presence in the mainstream market; if NVIDIA won, they died.
Which they did.
So, in a bid to remain relevant in a world that didn't want their products, 3D Labs showed up to a Game Developer Conference wielding presentations for something they called "OpenGL 2.0". This would be a complete, from-scratch rewrite of the OpenGL API. And that makes sense; there was a lot of cruft in OpenGL's API at the time (note: that cruft still exists). Just look at how texture loading and binding work; it's semi-arcane.
Part of their proposal was a shading language. Naturally. However, unlike the current cross-platform ARB extensions, their shading language was "high-level" (C is high-level for a shading language. Yes, really).
Now, Microsoft was working on their own high-level shading language. Which they, in all of Microsoft's collective imagination, called... the High Level Shading Language (HLSL). But theirs was a fundamentally different approach to the language.
The biggest issue with 3D Labs's shader language was that it was built-in. See, HLSL was a language Microsoft defined. They released a compiler for it, and it generated Shader Model 2.0 (or later shader models) assembly code, which you would feed into D3D. In the D3D v9 days, HLSL was never touched by D3D directly. It was a nice abstraction, but it was purely optional. And a developer always had the opportunity to go behind the compiler and tweak the output for maximum performance.
The 3D Labs language had none of that. You gave the driver the C-like language, and it produced a shader. End of story. Not an assembly shader, not something you feed something else. The actual OpenGL object representing a shader.
What this meant is that OpenGL users were exposed to the vagaries of driver developers who were still just getting the hang of compiling assembly-like languages. Compiler bugs ran rampant in the newly christened OpenGL Shading Language (GLSL). What's worse, if you managed to get a shader to compile correctly on multiple platforms (no mean feat), you were still subjected to the optimizers of the day. Which were not as optimal as they could be.
While that was the biggest flaw in GLSL, it wasn't the only flaw. By far.
In D3D, and in the older assembly languages in OpenGL, you could mix and match vertex and fragment (pixel) shaders. So long as they communicated with the same interface, you could use any vertex shader with any compatible fragment shader. And there were even levels of incompatibility they could accept; a vertex shader could write an output that the fragment shader didn't read. And so forth.
GLSL didn't have any of that. Vertex and fragment shaders were fused together into what 3D Labs called a "program object". So if you wanted to share vertex and fragment programs, you had to build multiple program objects. And this caused the second biggest problem.
See, 3D Labs thought they were being clever. They based GLSL's compilation model on C/C++. You take a .c or .cpp and compile it into an object file. Then you take one or more object files and link them into a program. So that's how GLSL compiles: you compile your shader (vertex or fragment) into a shader object. Then you put those shader objects in a program object, and link them together to form your actual program.
While this did allow potential cool ideas like having "library" shaders that contained extra code that the main shaders could call, what it meant in practice was that shaders were compiled twice. Once in the compilation stage and once in the linking stage. NVIDIA's compiler in particular was known for basically running the compile twice. It didn't generate some kind of object code intermediary; it just compiled it once and threw away the answer, then compiled it again at link time.
So even if you want to link your vertex shader to two different fragment shaders, you have to do a lot more compiling than in D3D. Especially since, in D3D, compiling the C-like language was all done offline, not at the beginning of the program's execution.
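To see what that compile-then-link flow looks like in practice, here is a minimal illustrative sketch using the GL 2.0 core entry points (the original extensions used ARB-suffixed names such as glCreateShaderObjectARB); error handling is cut down to the essentials, and GLEW is assumed for loading the entry points.

```c
#include <GL/glew.h>

/* The shader *source text* goes straight to the driver: each shader object is
   compiled on its own, then the link step effectively compiles/optimizes again. */
GLuint build_program(const char *vs_src, const char *fs_src)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vs_src, NULL);
    glCompileShader(vs);                      /* first pass, per shader object */

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fs_src, NULL);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();          /* vertex and fragment are fused here */
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);                      /* second pass, at link time, at runtime */

    GLint linked = GL_FALSE;
    glGetProgramiv(prog, GL_LINK_STATUS, &linked);
    if (!linked) {
        char log[1024];
        glGetProgramInfoLog(prog, sizeof(log), NULL, log);  /* driver-specific compiler messages */
    }
    return prog;
}
```

Pairing that same vertex shader with a second fragment shader means creating and linking a second program object, and paying the runtime compile cost again.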
There were other issues with GLSL. Perhaps it seems wrong to lay the blame on 3D Labs, since the ARB did eventually approve and incorporate the language (but nothing else of their "OpenGL 2.0" initiative). But it was their idea.
And here's the really sad part: 3D Labs was right (mostly). GLSL is not a vector-based shading language the way HLSL was at the time. This was because 3D Labs's hardware was scalar hardware (similar to modern NVIDIA hardware), but it was ultimately the direction many hardware makers eventually went.
They were right to go with a compile-online model for a "high-level" language. D3D even switched to that eventually.
The problem was that 3D Labs were right at the wrong time. And in trying to summon the future too early, in trying to be future-proof, they cast aside the present. It sounds similar to how OpenGL always had the possibility for T&L functionality. Except that OpenGL's T&L pipeline was still useful before hardware T&L, while GLSL was a liability before the world caught up to it.
GLSL is a good language now. But for the time? It was horrible. And OpenGL suffered for it.
Falling Towards Apotheosis
While I maintain that 3D Labs struck the fatal blow, it was the ARB itself who would drive the last nail in the coffin.
This is a story you may have heard of. By the time of OpenGL 2.1, OpenGL was running into a problem. It had a lot of legacy cruft. The API wasn't easy to use anymore. There were 5 ways to do things, and no idea which was the fastest. You could "learn" OpenGL with simple tutorials, but you didn't really learn the OpenGL API that gave you real performance and graphical power.
So the ARB decided to attempt another re-invention of OpenGL. This was similar to 3D Labs's "OpenGL 2.0", but better because the ARB was behind it. They called it "Longs Peak."
What is so bad about taking some time to improve the API? This was bad because Microsoft had left themselves vulnerable. See, this was at the time of the Vista switchover.
With Vista, Microsoft decided to institute some much-needed changes in display drivers. They forced drivers to submit to the OS for graphics memory virtualization and various other things.
While one can debate the merits of this or whether it was actually possible, the fact remains this: Microsoft deemed D3D 10 to be Vista (and above) only. Even if you had hardware that was capable of D3D 10, you couldn't run D3D 10 applications without also running Vista.
You might also remember that Vista... um, let's just say that it didn't work out well. So you had an underperforming OS, a new API that only ran on that OS, and a fresh generation of hardware that needed that API and OS to do anything more than be faster than the previous generation.
However, developers could access D3D 10-class features via OpenGL. Well, they could if the ARB hadn't been busy working on Longs Peak.
Basically, the ARB spent a good year and a half to two years worth of work to make the API better. By the time OpenGL 3.0 actually came out, Vista adoption was up, Win7 was around the corner to put Vista behind them, and most game developers didn't care about D3D-10 class features anyway. After all, D3D 10 hardware ran D3D 9 applications just fine. And with the rise of PC-to-console ports (or PC developers jumping ship to console development. Take your pick), developers didn't need D3D 10 class features.
Now, if developers had access to those features earlier via OpenGL on WinXP machines, then OpenGL development might have received a much-needed shot in the arm. But the ARB missed their opportunity. And do you want to know the worst part?
Despite spending two precious years attempting to rebuild the API from scratch... they still failed and just reverted back to the status quo (except for a deprecation mechanism).
So not only did the ARB miss a crucial window of opportunity, they didn't even get done the task that made them miss that chance. Pretty much epic fail all around.
And that's the tale of OpenGL vs. Direct3D. A tale of missed opportunities, gross stupidity, willful blindness, and simple foolishness.

-
24Did you have this written up somewhere, or did you write it off the top of your head? – Kristofer Hoch Jun 29 '11 at 19:33
-
38@Kristofer: I don't know if you can call it "off the top of your head" for something that took an hour or so to compose, but I didn't have it written up somewhere. – Nicol Bolas Jun 29 '11 at 19:45
-
3This might be for the wrong reasons but OpenGL is the single reason that makes Counter-Strike 1.6 still so attractive to professional FPS gamers. DirectX has terrible pixel-perfect aiming (at least all the engines with it do) and it killed the e-sport possibilities with new FPSs without the (now old) OpenGL. *Disclaimer: I ran more than 400 CS tournaments from 2002 to 2006. – Francisco Aquino Jun 29 '11 at 21:10
-
18@F.Aquino: So you're willing to attribute this to the rendering system that FPS games use, rather than the engine itself? Even though CS is based on Half-Life 1, which is based on Quake? Sorry; not buying it. Granted, I don't buy the premise that new FPSs have no e-sport potential, even though there are plenty of e-sports tournaments centered around newer FPSs. They may not hold the same attraction that CS does to _some_ players, but don't make the mistake of thinking that those players make up all of FPS e-sports. – Nicol Bolas Jun 29 '11 at 21:36
-
165Wow. I don't even care about most of this stuff and it's *still* a great read! – Peter Rowell Jun 29 '11 at 22:48
-
3Awesome history, you just missed a little bit at the start, where Microsoft was part of the OpenGL Architecture Review Board, before leaving to work on DirectX. http://www.opengl.org/about/arb/meeting_notes/notes/minutes_12_94.txt – Clinton Jun 29 '11 at 23:21
-
2@greyfade: GL 4 doesn't appeal to developers anymore than GL 3 did. I stopped where I did because nothing really changed. Yes, GL 4 exposes D3D 11 features, but you could just use D3D 11 to get those. Nothing has changed that has helped or hurt OpenGL's market position. Think of GL 4 as the ARB treading water. – Nicol Bolas Jun 29 '11 at 23:47
-
3@Clinton: I mentioned that Microsoft was once on the ARB. But they didn't exactly "leave to work on DirectX"; they were part of the ARB until around the time OpenGL 2.0 came out. By then, D3D was approaching version 9. Indeed, I wouldn't be surprised if they were part of the reason why the ARB stayed away from shaders for so long. Though I think I'll add a paragraph explaining that speculation. – Nicol Bolas Jun 29 '11 at 23:49
-
@greyfade: no way!! then I will need a summary of the summary as TheBigO said – Sufendy Jun 30 '11 at 06:13
-
3Fascinating. It's always great to read the history of major software components from someone in the know. – Noufal Ibrahim Jun 30 '11 at 09:05
-
1@Nicol Bolas It could be based on the pac-man engine, it doesn't matter. You won't be able to show me a single new game that you can "camp" a pixel, thus killing the type of precision needed at the extremely competitive level, or better, 1.6 is still well alive and all that came afterwards failing one after the other. The gfx card manufacturers, intel, amd, monitor manufacturers, they all pushed us to move, still do, people have adapted but it will never feel the same. – Francisco Aquino Jun 30 '11 at 13:16
-
voodoo2, which predates TNT, had dual texture processing units - see http://en.wikipedia.org/wiki/RIVA_TNT – Jonathan Graehl Jun 30 '11 at 20:36
-
5@F.Aquino: My point was that you're misassigning blame. The API used to render has _nothing_ to do with aiming precision. It is the _engine_ code that allows this. For whatever reason, Quake1's engine allowed you to do this, while other engines don't. If game developers wanted to allow players to "camp a pixel" (whatever that means), they would code it into their current engines. – Nicol Bolas Jun 30 '11 at 23:58
-
Wow, that was an impressive write up. I do have a question though; why are you still using OpenGL if it seems to have so much problems? Is it due to cross-platform compatibility? – DMan Jul 03 '11 at 03:23
-
11@DMan: Cross-platform compatibility is pretty much the only real strength OpenGL has over D3D; in most other respects, they're close enough to not matter much. Also, most of OpenGL's problems are in the past; the problem is that the past is often why people use something in the present. And that's what my article was showing: how screwups in the past influenced people to pick D3D. – Nicol Bolas Jul 03 '11 at 03:29
-
12`Which they, in all of Microsoft's collective imagination, called... the High Level Shading Language (HLSL).` LOLled for more than 6 minutes at this. – ApprenticeHacker Jul 17 '12 at 04:58
-
2It's a flipping shame that you only have about 1k rep on this site. Stupid community wiki. Amazing summary of the OpenGL/D3D battle. I think Glide had some impact there too (remember 3DFX having the edge on the market before their Sega Dreamcast snafu). Most people presumed they would become the SoundBlaster of the GPU market. – Michael Brown Oct 24 '12 at 20:16
-
2@ApprenticeHacker same thing I did when I read what `hal.dll` was. A machine I was fixing once, had a boot issue that stopped it at loading `hal.dll` I first thought it was a joke on `HAL 9000`, imagine my disappointment. – Hawken Nov 01 '12 at 00:16
-
What's particularly odd about this is that D3D clearly subscribes quite heavily to "worse is better" (whether that's accidental or intentional is an interesting question) whereas the design of OpenGL is very much an "everything *including* the kitchen sink (with extra kitchen sink)" approach, with lots of bloat, legacy cruft and gubbins. Given the Windows vs Unix heritage of each, one would almost expect the opposite to have been the case, yet it's not. – Maximus Minimus Dec 04 '12 at 16:27
-
1by far the best answer in stackoverflow/stackexchange. Though I disagree with the fact that opengl is a failure. It has a very good card , mobile devices where opengl rules supreme making opengl the undisputed king of Graphic libraries. Still you can have all my upvotes. – Kilon Dec 26 '12 at 10:55
-
1This is a truly great answer. I have just one additional request. **Remember WinG?** How does that fit into all this? – user Jan 18 '13 at 10:54
-
@MichaelKjörling: [WinG was really before all of this](http://en.wikipedia.org/wiki/WinG#cite_ref-3). It was more of a precursor to DirectDraw than D3D. – Nicol Bolas Jan 18 '13 at 18:51
-
1Do you think Valve will succeed with their Steam Box if it runs Linux given all the disadvantages of OpenGL you just mentioned? – Abdulsattar Mohammed Jan 27 '13 at 17:51
-
2I need to put a few things straight which you left out. The OpenGL ARB did not survive in its OpenGL 2.0 to 3.0 form. It was a messy usurpation by many dissatisfied board members which ultimately ended with a much more capable and agile ARB. OpenGL by no means is dead. It is used on every mobile and on all Macs and Linux in existence, which owes partly to the fact that out of the ashes came OpenGL ES 2.0. – Florian Bösch Jan 27 '13 at 22:00
-
@FlorianBösch: I never claimed that OpenGL was "dead". As for any "messy usurpation" that may or may not have happened, you should provide some evidence of that. Most of that stuff would be internal to the ARB, so it's not likely we will know what was going on behind closed doors. I only stated what is verifiable information. The ARB was adopted by the Khronos group; that's verifiable; whether this was "usurpation" or not is a different matter. – Nicol Bolas Jan 27 '13 at 22:21
-
1It is a great read, but I think is a little off-topic, as it really does not really go to the original question. – Khelben Jan 28 '13 at 10:34
-
1@NicolBolas: do you think you can add year numbers for context? Not everyone knows which hardware was released when, etc. I think it would be an even greater addition. Thx for considering! – mark Jan 28 '13 at 17:46
-
1I registered just to congratulate the guy who took the time to write this as part of a response. – csotiriou Jun 22 '13 at 10:59
-
@asattar This was back then. It's history now. **Nowadays, OpenGL is better than DirectX. See http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX for lots of reasons you should use OpenGL.** – jobukkit Jul 16 '13 at 13:52
-
@Jop: Considering how much misinformation is contained in that obvious propaganda piece, I would suggest avoiding that article. How much faster OpenGL's draw calls are than D3D10/11's is very debatable these days; the article cites an NVIDIA PDF from ***2006***. Saying that Microsoft leaving the ARB is part of a "FUD campaign" is an outright lie. And the rest of the "FUD" stuff is alarmist, anti-Microsoft BS. D3D 9 came to dominate the gaming landscape all on its own, well before the Vista release. – Nicol Bolas Jul 16 '13 at 18:28
-
@Jop: This answer was written in 2011 and that article was written in 2010. – Abdulsattar Mohammed Jul 17 '13 at 06:57
-
@asattar Ok, here's something from 2012: http://blogs.valvesoftware.com/linux/faster-zombies/ – jobukkit Jul 17 '13 at 07:12
-
@Jop: ... and? The difference between 270FPS and 303FPS is approximately... 0.4 milliseconds. That's *barely* more than a rounding error, and in a game that was actually *using* the hardware (rather than one that throws away 4 out of every 5 frames), it would be an insignificant difference. In short, the performance of D3D nowadays is reasonably comparable to the performance of OpenGL. – Nicol Bolas Jul 17 '13 at 07:20
-
@Jop: Also, I would like to point out that Valve gets to *cheat*, because they can basically say to the IHVs, "here's how we're going to render; go make this optimal in your drivers." Other programs don't get to say that. The article you mention even points this out, saying that their work has caused driver changes. They *claim* that this "benefits all games", but the reality is that it only benefits games that render the way that they do. – Nicol Bolas Jul 17 '13 at 07:22
-
In 2010-2013, DX is falling due to falling of Windows on mobile. And GLES is arising by Apple and Google. – Eonil Aug 13 '13 at 23:54
I found it strange that everybody's focusing on the user base, when the question is about 'game developers', not 'game publishers'.
For me, as a developer, Linux is a bloody mess. There are so many versions, desktop managers, UI kits, etc... If I don't want to distribute my work as open source, where the user can (try to) recompile so it fits his unique combination of packages, libraries and settings, it's a nightmare!
On the other hand, Microsoft is providing (most of the time) incredible backward compatibility and platform stability. It is possible to target a whole range of machines with one closed-source installer, for instance computers running Windows XP, Vista and 7, in 32- and 64-bit flavors, without the proper DX or VC redistributables installed, etc...
One last thing, PLEASE EVERYBODY ON THE INTERNET STOP COMPARING OPENGL AND DIRECTX! Either compare Direct3D vs OpenGL or don't do this. DirectX provides input support, sound support, movie playing, etc etc that OpenGL doesn't.

-
40"For me, as a developer, Linux is a bloody mess." Indeed! I work in an environment with Fedora and Ubuntu. We have problems even between just those two. (I must add I'm a Linux fanboy.) – David Poole Jun 29 '11 at 14:58
-
48@jv42: This seems a common misconception: Almost all of those versions and desktop managers and UI kits, etc. are pretty irrelevant to getting a game up and working. If your game (as shipped) depends on more than libGL, libX11, and libasound (or libopenal or libao or liboss), you're doing something wrong. – greyfade Jun 29 '11 at 22:00
-
1If you just ship a Linux binary that works well and your game/program/whatever is popular, most likely Linux users will take care of the rest. You get bonus points if you ship a 64 bit version as well. – Gerardo Marset Jun 30 '11 at 00:53
-
2@greyfade nice thing to say, but I'm still waiting for a single game on linux with high production values (really, I am, I'd love to be able to play games and give up windows). So far I haven't seen any. Another key point about windows is that the users will (sometimes) *pay* for the games. I really don't know anyone who pays for any software on linux (and I don't either). But I do pay for video games. – TM. Jun 30 '11 at 03:45
-
1@TM: I don't understand why you're singling me out - I'm not saying anything beyond the fact that developers seem to be confused about what libraries actually matter for writing games. That said, I've not seen a Linux user that didn't pay for the native games they play, when given the chance. – greyfade Jun 30 '11 at 06:03
-
19@greyfade "If your game (as shipped) depends on more than libGL, libX11, and libasound (or libopenal or libao or liboss), you're doing something wrong." -- And then you link to libao-0.0.4alpha, and your user has libao-0.0.5beta and nothing works. – quant_dev Jun 30 '11 at 08:34
-
1Please post your answer in multiple responses, so we can upvote more than one time. – dombesz Jun 30 '11 at 12:37
-
@quant_dev: `libao` was a bad example: it's GPL, so you couldn't use it in a commercial game anyway. But regardless, if you depend on an alpha version, that's your fault - ship it with your game. I'd say the same for any other LGPL dependency. – greyfade Jun 30 '11 at 15:56
-
4@TM: I don't understand. I ran UT, UT2004, and Quake 3 on Linux, and they ran just great! (OK, UT2004 had a problem where it would run slower and slower on each map change, and I eventually just rebooted into Windows, but...) I miss those days. – David Krider Jun 30 '11 at 16:43
-
2@quant_dev: That's the thing. If libX depends on libY and on libZ, then consider you've made a poor choice. Most developers seem to get on well with shipping `libSDL` and `libopenal` and using the system `libasound` and `libGL` with several other things statically compiled and no other dependencies to speak of. If you've chosen a library that depends on a whole bloody distribution, you're doing it very, very wrong. Actually, I'd say the same thing if you were on Windows. – greyfade Jun 30 '11 at 18:46
-
17Packaging up a binary so that it runs on all Linux distributions isn't that hard. Heck, the Blender developers do it with each release. There's only the one (well two, 32 bit and 64 bit) Linux release package of Blender. – datenwolf Jul 05 '11 at 15:44
-
8Linux users pay more for games. See: http://www.koonsolo.com/news/linux-users-show-their-love-for-indie-game/ and http://2dboy.com/2009/10/26/pay-what-you-want-birthday-sale-wrap-up/ – Alexander Jul 07 '11 at 22:37
-
3@TM I pay for games on Linux. Ever heard of the Humble Bundle? I never pay less than $10. If Steam and games ever get ported to Ubuntu (a possibility Valve is starting to explore), I will pay for AAA games as well. Yes, AAA prices for AAA games on Linux. Us Linux fanboys aren't cheapskates. – Andres F. Aug 02 '12 at 17:59
-
1Saying Linux is a mess due to all the versions is wrong. If you are coding for Windows then it's no different than coding only for Ubuntu. You would have the same stability of a single distro as you do for the "single distro" called Windows. – Rob Nov 16 '12 at 13:25
-
2@Rob you fail to make a point there. There are indeed many distros for Linux, if you target only one of them, you can't say your game 'runs on Linux' now can you? And the stability of Windows as a target platform is top notch, you can run many 20+ years old games on the latest Windows. – jv42 Nov 16 '12 at 16:23
-
2@jv42 - You miss my point. His complaint is that each distro is different so Linux is hard to target and a mess. My point is he's targeting just one "distro" called Windows and he can accomplish the same thing by targeting just one "distro" called Ubuntu or whatever his preferred distro would be. He is claiming Linux is a mess and the problem when it is not. – Rob Nov 16 '12 at 17:30
-
@Rob My point is: without in depth knowledge of Linux, the perception of it is a real mess: there are many distros, with different package managers, preferred UI kits, etc... which fail to provide a clear path when one wants to develop on it. The reality is not so bleak, but that perception is still based on facts. – jv42 Nov 17 '12 at 09:11
-
jv42, you never actually tried to deploy something on linux, did you? it’s dead easy, as @greyfade said: ship your game with the known working versions of all libs except some system libs like libasound and libGL, and everything will work forever. – flying sheep Jan 03 '13 at 21:07
-
@flyingsheep Please read the question and my comments. We're not talking about facts but perception. – jv42 Jan 04 '13 at 08:33
-
@flyingsheep And in fact I'm currently working again with Linux, and it's messier than I thought when I was distanced from it. For instance, 32 bits software on 64 bits OS is complicated on Linux (as in: not always working out of the box) vs Windows where WOW takes care of the details for you. – jv42 Jan 04 '13 at 08:34
-
i’m using linux exclusively for ~4-5 years and not once have i stumbled upon a 32bit-vs-64-bit issue. stuff just works out of the box, that’s my honest experience. – flying sheep Jan 04 '13 at 15:47
-
@flyingsheep As a developer? Also, I've just had issues using a prebuilt package (Firefox official!) on a 64 bits system. – jv42 Jan 04 '13 at 15:51
-
yes. i use the fallback sequence: 1. get from official repos, 2. get 3rd-party packages for your packaging system, 3. get tarball and create package from it 4. repeat all steps with 32 bit version. i currently have 2 32 bit games on my pc (i used to have more) – flying sheep Jan 04 '13 at 18:02
-
Sorry for replying to a 2 year old post but: Nonsense. Porting a game to Linux is _very_ easy if you use a good cross-platform game engine, and it will work on most editions of Linux available. – jobukkit Jul 16 '13 at 13:48
It's because there are more Windows users on the planet than Linux and Mac. The truth is that people make things for whichever has the biggest market.
The same goes with mobile phones: Android and iPhone have awesome games but Windows Mobile and Symbian don't...

-
2It has nothing to do with adoption, PCs can run openGL and Linux just as they can run windows. – Mahmoud Hossam Mar 21 '11 at 23:01
-
24@Mahmoud Hossam: That's not true. Using OpenGL doesn't mean that your entire application will just magically work under a *nix environment, it means that graphics engine will (and even then it doesn't mean there aren't quirks on (for example) Windows that don't exist on Linux and vice versa). That's not the whole basket and there is definitely an appreciable cost of maintaining their code for multiple platforms. – Ed S. Mar 21 '11 at 23:16
-
2@Ed Nothing works "magically", portability comes at a cost, always. – Mahmoud Hossam Mar 21 '11 at 23:18
-
5@Mahmoud: Exactly. And if you're not going to bother porting to Linux/Mac OS anyway, then why not use the native library? DirectX is *much* better supported on Windows than OpenGL is. – Dean Harding Mar 21 '11 at 23:20
-
10@Mahmoud: the whole premise of the question is "why do developers prefer Windows". Answer: because that's what gamers use. There's no point porting to Linux/Mac if it only makes up 2% of your market, and in that case, there's no point using OpenGL if you're not going to port anyway. – Dean Harding Mar 22 '11 at 02:11
-
2@Dean gamers use windows because they have to, if developers switched to whatever platform there is, gamers will follow. – Mahmoud Hossam Mar 22 '11 at 02:22
-
1@Mahmoud: Sure, it's Catch-22. I'm not saying Windows is *better* or anything like that. But if the question is "why do game developers use Windows", then answer is "because that's where the gamers are". There's no technical reason why you can't develop a game for Windows/Mac/Linux. – Dean Harding Mar 22 '11 at 02:59
-
4@Mahmoud, its not the game developer's job to get people to switch to Linux. – GrandmasterB Mar 22 '11 at 03:57
-
@Dean actually, I think gamers choose windows because developers develop games for it, but what you said is also correct :-) – Mahmoud Hossam Mar 22 '11 at 03:59
-
@GrandmasterB it's his job, but he's capable of doing so, I've been a gamer for quite some time and I know how gamers think. – Mahmoud Hossam Mar 22 '11 at 04:03
-
2"the same goes with mobile phones: android and iphone have awesome games but windows mobile and Symbian don't..." Well, first off Symbian till has a way bigger market share than Android and iPhone, so that goes against your theory. Secondly, WP7 has access to the whole XBLA collection, which is pretty impressive – pdr Mar 22 '11 at 10:54
-
this answer is simply about how many people use what. It's difficult to do business in a niche market, so if you have an option to reach more people, why not choose that... This answer is about business, not about the technologies used. Did you know some of the awesome games for Windows can also be run under WINE on Linux... – Arjun Bajaj Mar 22 '11 at 16:30
-
2@Mahmoud You are a gamer, but you are also a techie. That is why you **think** gamers would follow if games started being developed for Linux. You understand Linux and probably care for it, most gamers would abandon PC gaming and go to an xbox or playstation. – Marcelo Jun 29 '11 at 13:55
-
1@Marcelo: Most gamers *have* abandoned PC gaming and gone to xbox or playstation. – John Bartholomew Jun 29 '11 at 19:34
-
6@John 10+ million of World of Warcraft players would like to have a word with you. And this is just one game. – Marcelo Jun 29 '11 at 19:38
-
I've heard the Windows version of World of Warcraft runs smoother on Linux than on Windows (using Wine, the native Win32 API reimplementation). – Alexander Jul 07 '11 at 22:39
Because Windows has over 90% market share, and Linux (since you specifically asked about Linux) has a reputation for having lots of users who don't like to pay for software. Whether or not that's true or how true it is is irrelevant; the perception is there and it influences people's decisions.

-
1If developers use OpenGL it will support both Windows and Linux, so it will actually be a marketing advantage to attract Linux users who are willing to pay and are using Linux because they believe it's better. – M.Sameer Mar 21 '11 at 22:56
-
9Even using OpenGL there are costs in developing and testing cross platform which the Linux market doesn't justify. Directx is also (sadly) a better platform for PC gaming at the moment than OpenGL - there are very few new games on PCs being built on OpenGL. – Martin Beckett Mar 21 '11 at 23:10
-
25Cross platforming isn't as straightforward as "just code for OpenGL". – System Down Mar 21 '11 at 23:11
-
13That's not true actually, that Linux users won't pay for software. The Humble Indie Bundle (a bundle of games you can get for any amount of money you wish to pay) has been done twice now and every time it showed Linux users paying more than Windows users for the bundle. – Htbaa Mar 22 '11 at 08:36
-
9@Htbaa Of course they paid more, they were desperate, its probably the only games they get to play on their OS. – Marcelo Jun 29 '11 at 13:59
-
1@Htbaa True, but it's also pretty telling that the majority of the money made was from Windows users. There are simply more of them. – Philip Jun 29 '11 at 14:44
As some have already said, the most important part is the user base. 95% of PC users use Windows. PC gamers use almost exclusively Windows. Even those who use Mac or Linux most often run Windows games through some virtualization or emulation (with very, very few exceptions).
But demographics are not everything. I wouldn't underestimate the part that Microsoft is doing to make the platform more attractive for game developers. Basically you get a fully featured set of tools for free, most importantly XNA Game Studio. This allows development not only for Windows, but also for the Xbox 360. And with the latest edition even for WP7 phones. Obviously since it's a Microsoft tool, it uses DirectX, not OpenGL.

-
3Note for any time-travelling readers: As of April 2014, XNA will be officially dead. The last release (4.0) was published in 2010 and won't be seeing a new version between now (2013) and its sunset. – Aren Feb 09 '13 at 02:11
-
1Another note to any time-traveling readers: Windows 8 (and up) development [will no longer be free](http://arstechnica.com/information-technology/2012/05/no-cost-desktop-software-development-is-dead-on-windows-8/) in the future. – fouric Apr 01 '13 at 20:40
Because Windows is backed by a huge organization that, more than a decade ago, decided it wanted game development to happen on its platform.
This wasn't true for the Mac, and it isn't true now. Not even for iOS. Apple doesn't provide tools for iOS game development. But it's a huge market (there are more iPhones out there than there were PCs in 1995) with relatively little competition, so people do it anyhow.
As for Linux, there's not even some sort of central institution that could set any sort of priorities. The direction in which Linux is going is more or less determined by a bunch of very good, yet slightly unworldly programmers.
To create a PC game today, you need a lot of 2D/3D artists, game designers, scripters, actors, testers and what not. As to the actual programming, you might simply use an actual game engine (CryEngine, Unreal Engine, Quake Engine, Source Engine). So you might be able to do the whole thing without any actual programmers.
Because of that, and because of the nature of businesses, programmers have little say in which platform is chosen. And typically, managers look for support, which is something Microsoft claims to offer, and for things that are somehow graspable within their thought patterns, which open source is not.
For that reason, most commercial end-user software development is done on Windows.
I work for a company, that creates flash games, and is thus not bound to a particular platform. However, we all develop on windows, because most of the tools we use aren't available for Linux.

-
2"So you might be able to do the whole thing without any actual programmers." You forgot the sarcasm. – Allon Guralnek Oct 13 '11 at 20:00
-
@AllonGuralnek: No sarcasm there. In classic game development of big games, programmers will create engines and means to provide content and behavior (through visual editors or actual scripting) and then game/level/mission designers will use those means. If you buy a working and sufficiently powerful engine, you can basically cut out step one. – back2dos Oct 13 '11 at 20:09
-
2Do you have a specific example of a reasonably notable game that was created without writing a single line of code? – Allon Guralnek Oct 13 '11 at 20:15
-
1@AllonGuralnek: I didn't say anybody would be creating games without *code*, but without *programmers*. You don't need to be a programmer to create an entirely game-changing mod. [DotA](http://en.wikipedia.org/wiki/Defense_of_the_Ancients) is the most popular one I can think of off the top of my head. – back2dos Oct 13 '11 at 21:21
-
People who write code are programmers. You don't have to be an expert - if you've written a program that someone else finds useful, you're a programmer as far as I'm concerned. Also, DotA (for Warcraft III), is neither a mod nor game-changing. It's simply a map, with the same graphics, gameplay mechanics, controls and rules as the original game. It simply introduced certain constraints, narrowed the scope by removing some mechanics, changed the configuration of some elements and repurposed others, which happened to be appealing to many and became a popular sub-sub-genre (like Tower Defense). – Allon Guralnek Oct 13 '11 at 21:48
-
1@AllonGuralnek: No. People who write code are people who write code. Level designer are required to have a certain understanding of scripting. Much programmers are often required to have a certain amount of management skills. It doesn't make the first one a programmer, nor the second one a manager. Also your assessment of DotA is wrong. Firstly it's entirely game-changing, turning an RTS into a new [genre](http://tinyurl.com/3rfd3wx) and secondly it is considered as a separate game by many eSports leagues [including the ESWC](http://tinyurl.com/6bc4xly) – back2dos Oct 13 '11 at 22:07
Ewwww, I don't. I use Linux almost exclusively. I dual-boot to Windows to make Windows builds, and use the Mac for the Mac builds, but that's it.
The trick is a cross-platform framework we've developed over the years. Our games are built on top of that, and behave identically in Linux/OpenGL, Mac/OpenGL, and Windows/Direct3D (and soon in iOS/OpenGL).
Admittedly my company doesn't do AAA titles, so it may not apply to these, but we do make top casual games (see website - CSI:NY, Murder She Wrote and the two upcoming 2011s are examples of titles using important licenses, The Lost Cases of Sherlock Holmes 1 and 2 were quite successful as well)
I wouldn't give up gedit+gcc+gdb+valgrind for anything else.

-
3Sorry, I did give up gedit in the end. I now use gvim and I'm much happier :) – ggambetta Jan 29 '13 at 10:17
The answer is obvious. The objective of writing a game is to make money. More end users run Windows, therefore there is a bigger market and you would expect to make more money from a Windows game than a Linux game. It's that simple.
If ever you ask yourself the question 'Why does someone do...', just remember that money makes the world go round.

-
1Yes but you can make a cross platform game and get more than just windows users ;) – M.Sameer Nov 16 '12 at 11:39
-
Being cross-platform isn't everything. If the number of extra users you get is a relatively low percentage then you need to balance the extra development cost and ongoing support cost against what extra you're going to get in from them and make an informed decision based on actual hard data. There's no globally right or wrong answer to that one, but there is an answer that's right or wrong for each individual program, and what's right for one program may be wrong for another. – Maximus Minimus Dec 09 '12 at 16:04
Tools, tools, tools.
That's what it comes down to. Develop on Windows and you get access to some of the best development tools on the planet. Nothing comes even remotely close to Visual Studio's debugger, the DirectX Debug Runtimes are awesome, PIX is awesome, and comparable equivalents just don't exist on other platforms/APIs. Sure, there is some good stuff there; I'm not saying that the tools on other platforms are bad, but those that MS provide are just so far ahead of the pack (honourable exception: Valgrind) that it's not even funny.
Bottom line is that these tools help you. They help you get stuff done, they help you be productive, they help you focus on errors in your own code rather than wrestle with an API that never quite behaves as documented.

-
PIX *is* awesome. Debugging shaders by clicking a bothersome pixel and seeing what happened is great! – Chris Pitman Nov 17 '12 at 15:54
So I've gone over all these answers, and as a game developer who has code on console games that have been on Walmart shelves, I have a very different answer.
Distribution.
See, if you want to be on a Nintendo console, you have to get Nintendo's permission, buy from Nintendo's factories, pay Nintendo's overheads, negotiate with Walmart, deal with warehousing, you need money up front to manufacture, to print boxes, to ship, to do all the insurance, et cetera.
If you want onto the XBox, sure there's XBLA, but you still need Microsoft's blessing, you have to wait your turn in line, it's tens of thousands of dollars just to release a patch, etc.
On iOS, you still need Apple's okay, and they can (and do) capriciously pull you.
On Steam, you still need Valve's permission or greenlight, and lots of money.
.
On Windows? You set up a website and a download button.
.
I'm not saying the other platforms aren't valuable. But there is so *much* horrible stuff going on when you're trying to develop a game, that to me, the promise of being able to just slap a binary on a site and focus on the work - at least to get started - really lowers a lot of potential failure barriers.
"We can do the XBLA port later, when things are stable" mentality.
And to a lesser extent sure this is fine for Linux too, and if seven customers is good, you can start there.
But Windows has three massive advantages: genuinely open development, genuinely open deployment, and a very large, very active customer base which is interested in quirky stuff.
It's hard to imagine where else I'd rather start.

- Inertia. If you've used Windows in the past then switching to something else is a hassle. If you're on Windows DirectX is easier and more likely to work well than OpenGL.
- Market share. The market share of Windows on the desktop is bigger than that of OS X which in turn is bigger than that of Linux. Money talks.
- Brand. DirectX is better known than things like SDL (which is the sort of thing you would need to replicate some of DirectX's features that go beyond OpenGL).
- Less confusion. Will the user's Linux only support up to OpenGL 1.4 or OpenGL 2+? Can you use an OpenGL 2.0 tutorial like "An intro to modern OpenGL. Chapter 1: The Graphics Pipeline" on your version of Linux? (A quick runtime check for this is sketched below.)
These days Linux is more of a curio when it comes to games development, and most developers would be better off fiscally doing an OS X port before a Linux version (see things like Steam). Even then the console market is worth more than these two platforms combined for games...
If you want to stay on a single platform, DirectX is fine. If you want to be cross-platform, there's a strong chance you will have to go with OpenGL on at least some of the other platforms.
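On the "which OpenGL will the user's Linux actually have" point from the list above, here is a minimal runtime check (illustrative only; it assumes a GL context is already current):

```c
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Query what the driver actually exposes before choosing a render path. */
void check_gl_support(void)
{
    const char *version  = (const char *)glGetString(GL_VERSION);    /* e.g. "1.4 Mesa ..." or "2.1 ..." */
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    const char *exts     = (const char *)glGetString(GL_EXTENSIONS);

    int major = 0, minor = 0;
    if (version)
        sscanf(version, "%d.%d", &major, &minor);
    printf("OpenGL %d.%d on %s\n", major, minor, renderer ? renderer : "unknown");

    if (major >= 2)
        puts("GL 2.x path available (GLSL shaders)");
    else if (exts && strstr(exts, "GL_ARB_vertex_buffer_object"))
        puts("GL 1.x path with buffer objects");
    else
        puts("Fixed-function fallback only");
}
```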
-
2The answer to question 4 is: 2+. Mesa3D supports up to OpenGL 2.1, which means that all graphics drivers for 3D hardware for X11 support at least OpenGL 2.1 since Mesa version 6.8. That covers all Linux distros released in the last couple years, and binary drivers that NVIDIA and AMD ship support 3.0 and up. This doesn't include users of the intel895, but they've been deprecated. – greyfade Jun 29 '11 at 23:34
-
1I'm afraid this is not the case. Computers with i915/i945 (GMA 900/950) chipsets are (still) being sold and are not deprecated. Even on modern distributions from a few months ago (Ubuntu 11.04/Fedora 15) glxinfo will return an OpenGL version no higher than 1.4 (Mesa 7.11-devel). Such Intel hardware simply _can't_ do later versions without software help so until softpipe is available by default, Linux on such cards will never do a higher version OpenGL. Targeting OpenGL 2.0 (or higher) will stop a program running on a wide range of Linux desktop systems... – Anon Jul 19 '11 at 06:50
-
Even so, I fail to see the concern. Those same chipsets have limitations on Windows, so most graphics-heavy games would be out of the question *anyway.* – greyfade Jul 19 '11 at 15:05
I think you should read more about the History of DirectX and This Article.
I think MS chose DX over openGL because they like to lock people into using their own OS.

-
4no, they created DX because OGL isn't optimised for Windows, can't be expanded quickly to take advantage of new hardware and operating system developments, doesn't incorporate sound, control input, etc. etc.. – jwenting Mar 22 '11 at 07:44
-
2In other words, DX is specialised for Windows for performance reasons and to allow Microsoft to keep it that way and quickly incorporate new technology as it becomes available. OpenGL is stuck at the level of graphics hardware support that existed 5 years ago, maybe more, because it's a committee effort. – jwenting Mar 22 '11 at 07:46
-
1@jwenting I don't think performance was the only reason they didn't port it to other platforms, I mean, if they wanted to port it, at least they would've open-sourced it and left it to the community to implement it. – Mahmoud Hossam Mar 22 '11 at 12:18
-
I didn't say performance was the reason to not port DX, it was the reason to create it in the first place. And no, Microsoft isn't going to open source it, why should they? The OSS community is so hostile towards them, they'd be mad to want to have anything to do with them, the risk of deliberate sabotage is just too great. – jwenting Mar 22 '11 at 17:50
-
@jwenting they open sourced C#, the future of the whole .NET platform, why would they do that then? – Mahmoud Hossam Mar 22 '11 at 18:40
-
1they didn't open source C#, they submitted the language specification to standards bodies. Microsoft is a member of those standards bodies, and does such things all the time (they have large contributions to the html, javascript, and other standards for example). Open sourcing it would have meant releasing the source code for the compiler and libraries under something like the APL. – jwenting Mar 23 '11 at 08:08
-
@jwenting okay, maybe I misstated what I mean, why didn't they submit a standard for the graphics API? – Mahmoud Hossam Mar 23 '11 at 13:37
-
because it's a product, not just an API. Without the binary libraries Microsoft provides, DirectX is nothing. So there's no DirectX for other platforms unless Microsoft provides the runtime for it, which they don't. They'd have to open source a bunch of DLLs to go with it, which are highly operating system specific. – jwenting Mar 23 '11 at 20:09
-
@jwenting yes, maybe because it's too dependent on the underlying OS, thank you for the clarification. – Mahmoud Hossam Mar 23 '11 at 20:53
-
Also note that the specification for C# doesn't include a lot of commonly used libraries in the .NET runtime, causing possible patent infringements. – alternative Jun 29 '11 at 20:46
-
1@jwenting: For the record, there is an implementation of Direct3D 10 and 11 on Linux that targets the Gallium3D framework. (And it has nothing to do with Wine.) I would also disagree with your assertion that "OGL isn't optimised for Windows." That's an issue of hardware driver implementations, not a limitation of OpenGL. – greyfade Jun 29 '11 at 23:30
A lot has to do with politics and control. In the late 90s, SGI and MS actually agreed to combine efforts:
http://en.wikipedia.org/wiki/Fahrenheit_graphics_API
SGI invested heavily in the project, MS did not. SGI needed MS more than MS needed SGI. The rest is history.
D3D and OpenGL are two very different APIs, it is up to the developer to choose which is right for your needs.

Simply because Linux failed horribly as a desktop system. As somebody pointed out earlier, Linux is a mess for developers (different libraries, UI toolkits, etc.).
Another problem is freetardism and the lack of support for proprietary software. Nvidia always provides fine (proprietary) drivers for Linux; however, Ubuntu and other distros do not ship them. There is also no stable binary driver interface in Linux like there is in Windows. (There is a text file called binaryApiNonsense.txt or something in the kernel sources.) Having said that, only Nvidia hardware is properly supported under Linux. You can play most of id Software's games using Nvidia hardware under Linux.
Next thing: development tools. MSFT provides excellent C++ support, and the Visual Studio debugger is better than gdb with regard to C++. Last but not least, other tools such as Photoshop are missing. Also, .NET allows you to quickly create GUI tools. Many game studios code their tools for internal use using the .NET framework.
I almost forgot: the graphics system is horrible. Back in the day they just ported X11 over because it was the easiest thing that worked. They failed to properly design and implement a modern graphics system like OS X and Windows have.

-
4Hmm? My Intel and ATI cards work fine in Linux... And what's wrong with X11? I always felt it was much better than the alternatives on other OSes (particularly Windows) because of its client-server architecture. – alternative Jun 29 '11 at 20:25
-
No they don't; type glxinfo and see whether it's fine or not! X11 is horribly broken; just one example: http://theinvisiblethings.blogspot.com/2010/08/skeletons-hidden-in-linux-closet.html – Nils Jun 29 '11 at 20:28
-
and if you read the last sentence, Linus himself implemented the _kernel level patch_ - not an X11 patch. – alternative Jun 29 '11 at 20:33
-
1I do not know much about the graphics system (you may have a point), but saying that Linux failed as a desktop system is not true, or at least not accurate. I moved from Windows to Linux and I have felt much more productive ever since. Linux's performance is and was superior to Windows on all machines I have used. I also do not code C++, and I want to know where Eclipse stands as a C++ IDE compared to Visual Studio. – M.Sameer Jun 29 '11 at 20:43
-
Put down the flamethrowers. I think the generally accepted opinion is that VS is the best IDE. However, Eclipse is still young and has a way to go - we shall see. I simply love how easily everything is scriptable and pipeable in Linux, however! – Vorac Aug 31 '12 at 13:31
-
@Vorac, I think VS is a good IDE, but not the *best* for C++ development. If you want something easier to customize which is free, I'd say try Orwell's Dev C++ (Windows only) or Qt Creator (Linux/Mac/Windows). I'd say Code::Blocks too but it appears quite dead these days... – about blank Oct 07 '12 at 06:18
-
"As somebody pointed out earlier Linux is a mess for developers". At least you don't have to [pay to develop for Linux](http://arstechnica.com/information-technology/2012/05/no-cost-desktop-software-development-is-dead-on-windows-8/). And why would you need an IDE when [Linux _is_ an IDE](http://blog.sanctum.geek.nz/series/unix-as-ide/)? – fouric Jan 28 '13 at 18:11
-
On the ethical side of things: "Another problem is freetardism". Don't you understand? This isn't about Windows vs. Mac vs. Linux. This is about the ability to use your computer in any way you want. Microsoft is actively working to _take control of your computer_ using Windows 8. You can only install Windows Store apps from the Microsoft-approved Windows Store, and the freedom to install any desktop apps (or the desktop interface itself) will probably disappear in the near future. Think about that for a moment. If Microsoft gets its way, you won't own your computer, _Microsoft will_. – fouric Jan 28 '13 at 18:27