
For the last couple of years, all of the serious projects I have worked on have been either web-based or had no graphical user interface (services, command-line scripts, etc.). I can throw together a WinForms app or do some simple WPF when needed, but I've never really delved into lower-level frameworks like MFC or Qt.

I understand that this depends on the situation, but in general, is it still worth taking the time to learn desktop development well, or are applications moving to the web and mobile devices at a pace that makes this knowledge less relevant? Also, do you expect the developers you work with to have desktop GUI expertise?

aubreyrhodes
  • Having desktop application development experience is great, but for the love of Knuth, don't bother with MFC. All that you will need for 95% of Windows desktop app jobs is WinForms or WPF/XAML. The other 5% of the jobs you don't want to have. – Adam Crossland Nov 18 '10 at 18:44
  • @Adam: +1 for "The other 5% of the jobs you don't want to have." - So true. :) – Bobby Tables Nov 18 '10 at 20:52

7 Answers


I'd say yes, it is. There's a sort of pendulum effect in application development. First, everything ran directly on the computer. Then, when computers became powerful enough to run multiple programs at once, we got mainframes with dumb terminals. But dumb terminals really suck in terms of usability, so as soon as hardware got small and cheap enough to put reasonable computing power inside a terminal-sized system, we got personal computers, and everything ran directly on the computer again.

Then they invented the World Wide Web, and we're back to a mainframe (the server) and a dumb terminal (the browser). But dumb terminals still really suck in terms of usability, people are starting to relearn the lessons of 30 years ago, and we're trending away from that again. A lot of the really hot development these days is for desktop (or mobile) apps that run locally but are able to connect to the Internet for specific purposes to enhance their functionality.
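
To make that concrete, here is a minimal sketch of the hybrid pattern in Java (the URL is hypothetical): the app's core work happens locally and never depends on the network, while a reachable server merely enriches the output.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

public class HybridApp {
    // Hypothetical endpoint; the app must keep working when it is unreachable.
    private static final String EXTRAS_URL = "https://example.com/message-of-the-day";

    public static void main(String[] args) {
        // Core functionality runs locally, with no network dependency.
        System.out.println("Local result: " + 6 * 7);

        // The network only enhances the output; failure degrades gracefully.
        try {
            URLConnection conn = new URL(EXTRAS_URL).openConnection();
            conn.setConnectTimeout(2000);
            conn.setReadTimeout(2000);
            try (InputStream in = conn.getInputStream()) {
                String extra = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                System.out.println("From the server: " + extra);
            }
        } catch (IOException e) {
            System.out.println("Offline: skipping the server extras.");
        }
    }
}
```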

Mason Wheeler
  • +1 for pointing out that these trends run in cycles. However, I've seen a case where a terminal app was re-written as a desktop app, and the users were able to work more efficiently with the terminal app. – Larry Coleman Nov 18 '10 at 18:32
  • The difference with browsers is that they can in fact run code on the local system, and this capability grows with each browser generation. The consequence is that the usability difference between desktop and web apps isn't that big. For many people (including myself) Gmail is more usable than Outlook. The pendulum swings less far each time, and it will stop halfway, with apps being a mixture of local and cloud parts, regardless of the underlying technology. – Joeri Sebrechts Nov 18 '10 at 18:45
  • +1, I hate when people start claiming that the desktop is dead; it is ridiculous. – dr Hannibal Lecter Nov 18 '10 at 19:21
  • +1. Awesome answer. Never really thought of it that way. – Bobby Tables Nov 18 '10 at 20:49
  • @Joeri: Most terminals could always do at least a few bits and pieces locally. A depressing amount of JavaScript I've seen does things an IBM 3270 (for one example) could have done locally as well. – Jerry Coffin Nov 18 '10 at 20:53
  • @Joeri Sebrechts - when Gmail allows me to drag and drop an email into a task or calendar item I may agree with you, but until then it has way too few features. – JeffO Nov 18 '10 at 21:44
  • @Jeff O: but is that a technological restriction, or a choice by the Gmail team not to implement features they see as adding little value? My point is, platform choice matters less with each successive platform generation, and these native vs. web debates are just going to stop mattering - not because one side "wins", but because both sides will start to look like each other. I am 100% sure sandboxing and background updating are going to be standard features of native dev platforms. – Joeri Sebrechts Nov 19 '10 at 06:19

Even if you never intend to do desktop development, I would suggest you get enough experience to form an informed opinion on when a desktop solution is the better choice over a web client.

Bill
  • +1: Assuming that 'the desktop is dead' and pigeonholing applications is just the opposite of pure desktop developers saying "that could never be good as a web app". Pick what you want to work with, but know the other side just enough to understand its true benefits/pitfalls. – Steven Evers Nov 18 '10 at 21:09

Yes, but not in the way you are thinking.

GUI programming is not any more difficult than other development, nor does it require specialized skills apart from familiarity with the GUI toolkit. Hooking up buttons, windows, and controls isn't terribly difficult, and it is pretty easy with modern programming environments compared to the early days of things like MFC. GUI programming is fairly easy to pick up when it's demanded.
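
To see how little ceremony a modern toolkit demands, here is a minimal sketch of wiring up a window and a button (Java Swing is used purely as an illustration; the thread's WinForms/WPF equivalents would be similarly short):

```java
import java.awt.FlowLayout;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class ButtonDemo {
    public static void main(String[] args) {
        // Swing components must be created on the event dispatch thread.
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Hello");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setLayout(new FlowLayout());

            JLabel label = new JLabel("Not clicked yet");
            JButton button = new JButton("Click me");
            // Wiring the handler is one line; deciding what the interface
            // should do is the genuinely hard part.
            button.addActionListener(e -> label.setText("Clicked!"));

            frame.add(button);
            frame.add(label);
            frame.pack();
            frame.setVisible(true);
        });
    }
}
```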

However, while hooking up buttons and text boxes is fairly easy, knowing when and where to place them - designing a GUI that human beings can actually use - is very difficult. That is a very valuable and important skill to have, and the design principles involved apply to native interfaces and the web alike.

So learn how to design good user interfaces that are effective and don't confuse users, and you'll pick up the programming for them for free.

whatsisname
  • Especially these days, **user experience** is in charge of software design; architecture is not in charge anymore. – rwong Nov 19 '10 at 06:26

It's really going to depend on your situation. I recently worked for a Fortune 500 company that had several projects to convert web applications back into desktop applications (Smart Client/ClickOnce). In their particular circumstances it made a lot of sense and eliminated several usability issues their existing apps suffered from.

If you're a full-time employee and your company doesn't generally build desktop apps, then it probably doesn't make sense to be fully up to speed on WinForms or WPF. If, however, you're a consultant and you'd like to be able to offer another service to your clients, then it can't possibly hurt.

Walter

Hmm, besides Gmail, Stack Exchange, and my bank's home banking, I use non-web software all day long. Now, with the advent of smartphones and tablets, web applications are even less attractive to me (I use my smartphone's Facebook client, for instance). That's the user side.

On the developer side: in the last 10 years I have worked almost exclusively on non-web software (and my career has spanned many very different domains, since I worked as a software consultant), and I don't see any future web trend in my job.

So yes, learning desktop GUI environments is still a must.

Wizard79

Of course "it depends" -- but I think your experience is typical. I have rarely had to create a thick client for any of the applications I have written. Unless there is a specific reason the client needs to run on the desktop (connectivity issues, a 3D game, etc.), I believe it's easier for the developers and admins to maintain one "instance" of the application. Developers who have the skill set to design a web application should generally be OK moving into the desktop app realm.

Actually, I think it's more important that a thick-client developer learn web application programming -- the inherent statelessness of HTTP makes it a more difficult application development paradigm to wrap your head around (or at least you have to do a little more thinking than just slapping controls on a panel).
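
As a rough illustration of that statelessness (a sketch using the JDK's built-in com.sun.net.httpserver; the cookie name is made up and the parsing is deliberately naive), even a per-user counter forces the server to mint and track an identifier itself - state a desktop app would get for free by keeping a field in memory:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class StatelessCounter {
    // HTTP remembers nothing between requests, so per-user state
    // must be keyed on an identifier the server hands out itself.
    private static final Map<String, AtomicInteger> COUNTS = new ConcurrentHashMap<>();

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/", exchange -> {
            // Look for our session cookie; mint a fresh id if it is absent.
            String cookie = exchange.getRequestHeaders().getFirst("Cookie");
            String id = (cookie != null && cookie.startsWith("sid="))
                    ? cookie.substring(4)
                    : UUID.randomUUID().toString();
            exchange.getResponseHeaders().add("Set-Cookie", "sid=" + id);

            int n = COUNTS.computeIfAbsent(id, k -> new AtomicInteger()).incrementAndGet();
            byte[] body = ("You have visited " + n + " times").getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```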

Don't forget -- you also have technologies like Silverlight and Adobe Flex/AIR, which can straddle the line between desktop and web applications.

Watson
  • +1 for web development being more difficult. I started as a desktop developer and had to move into web development on the job. It's definitely more complex (obviously this is assuming comparable tasks, which isn't easy). – Bobby Tables Nov 19 '10 at 00:36
  • @Guzica -- yes, I've encountered a similar attitude from good devs I've worked with who were locked onto desktop app development. Once they try to make the switch, it's not as easy for them as they first thought. I don't know if this is due to any inherent complexity in web app programming; it's just a different way to program, and a lot of your underlying assumptions about what the system can do must change (in addition to learning new frameworks). – Watson Nov 19 '10 at 13:41
  • It's always harder to do things with tools that are more limited, that doesn't make it "more complex". It makes it more of a hassle. – Sam Nov 29 '11 at 13:08

According to the IE9 team:

> There shouldn’t be a gap between native and web apps. HW acceleration, fast JS and site pinning starts it off.

I think it's a safe bet that these technologies will grow closer together. If you're a Java developer, there's very little difference between developing desktop apps and web apps (using GWT). It's not unreasonable to expect more and more "desktop" development platforms to be able to target the browser engine. It's also not unreasonable to expect more and more desktop apps to adopt a web-like distribution model (auto-updating in the background, sandboxed execution, like Chrome).
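
As a rough sketch of that background-update idea (the version endpoint is hypothetical, and a real updater would verify and stage the download), a native app can poll a server off the main thread and pick up new builds the way a browser does:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class UpdateChecker {
    private static final String INSTALLED_VERSION = "1.4.2";
    // Hypothetical endpoint that returns the latest version string.
    private static final String VERSION_URL = "https://example.com/myapp/latest-version";

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Check quietly in the background, the way browsers update themselves.
        scheduler.scheduleAtFixedRate(UpdateChecker::checkForUpdate, 0, 6, TimeUnit.HOURS);
    }

    private static void checkForUpdate() {
        try (InputStream in = new URL(VERSION_URL).openStream()) {
            String latest = new String(in.readAllBytes(), StandardCharsets.UTF_8).trim();
            if (!latest.equals(INSTALLED_VERSION)) {
                // A real app would download, verify, and stage the new build here.
                System.out.println("Update available: " + latest);
            }
        } catch (Exception e) {
            // Network failure is routine for a background check; try again next cycle.
        }
    }
}
```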

Joeri Sebrechts
  • That is total BS. I am working on a latency-measurement app that has to be located locally in order to measure and display in "real time" the latencies of market data delivery. This kind of thing will NEVER be moved to the cloud. – Tim Nov 18 '10 at 20:57
  • It's total BS because you've found a ridiculously obscure need for an application to be local? – Mike M. Nov 18 '10 at 21:37
  • @Tim: you're right that some apps will always be local. It's also true that other apps could never be local (e.g. Google Translate). But running locally does not imply that it doesn't come from the cloud. Chrome runs locally but is a cloud-based app (you have little control over what "version" it is). There are attempts to tie native code execution into browser platforms (Google NaCl), and attempts to tie web languages into native apps (Adobe AIR). – Joeri Sebrechts Nov 19 '10 at 06:25
  • @Mike M - that's not ridiculously obscure. At my previous job I worked on Navy shipboard software. That is also unlikely to be in the cloud. The domains I work in will likely not migrate - they have to be local for latency and hardware-interface reasons. The web is nice, but some of us still work in the native app area for a reason. – Tim Nov 19 '10 at 14:37
  • @Tim My point is that you've found a scenario where it doesn't hold up. The author realizes this when he says MOST. You have come up with a counter-point, and that's great. You have by no means proven that the entire thing has no basis. Of course there will be times where it has to be local for a multitude of reasons. But for the vast majority, throw in some fiber-optic cable and your 1200 miles can be crossed in, what, 10 milliseconds? As a user I would be okay with every single one of my desktop applications taking 10 milliseconds longer to load a form. – Mike M. Nov 19 '10 at 15:50