The big question on my mind: how many developers are brownfield ("enterprise") compared to greenfield (all new code, built from the ground up)?
I'm constantly reading breathless articles about the latest technology, only to find out that It Just Won't Work On Our Enterprise Software codebase. People aren't ready for automated testing, because the logic is in the click-handlers and/or the database. People aren't ready for ORM tools, because we have a horrendous amount of logic in stored procs and triggers. People aren't ready for WPF, because our existing stuff is all WinForms. We can't move to the latest version of Reactive Extensions, because the existing code uses Rx 1.0 and there are breaking changes that would require more testing effort than the return justifies. Etc., etc., etc.
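To make the first complaint concrete, here's a contrived C# sketch of the kind of thing I mean; every name in it (OrderForm, btnSave_Click, usp_SaveOrder, ExecuteStoredProc) is made up for illustration. The business rule is welded into the WinForms click handler, so none of the unit-testing advice applies until somebody first pries the rule out into a plain class:

    using System;
    using System.Windows.Forms;

    public class OrderForm : Form
    {
        private readonly TextBox txtOrderTotal = new TextBox();
        private readonly Button btnSave = new Button();

        public OrderForm()
        {
            btnSave.Click += btnSave_Click;
            Controls.Add(txtOrderTotal);
            Controls.Add(btnSave);
        }

        // The brownfield shape: the discount rule lives inside the click
        // handler, entangled with UI state and data access, so the only
        // way to exercise it is to run the app and click the button.
        private void btnSave_Click(object sender, EventArgs e)
        {
            decimal total = decimal.Parse(txtOrderTotal.Text);
            decimal discount = total > 1000m ? 0.05m : 0m;
            ExecuteStoredProc("usp_SaveOrder", total, discount);
        }

        // Hypothetical helper standing in for a direct ADO.NET call.
        private void ExecuteStoredProc(string name, params object[] args)
        {
            /* imagine SqlCommand plumbing here */
        }
    }

    // The shape the testing articles quietly assume you already have:
    // the same rule in a plain class a unit test can call directly.
    public static class DiscountRules
    {
        public static decimal ForOrderTotal(decimal total)
        {
            return total > 1000m ? 0.05m : 0m;
        }
    }

The articles pitch their advice against the second shape and rarely mention the (expensive, risky) refactoring needed to get there from the first.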
Very few articles seem to be oriented toward the brownfield developer, for whatever reason (maybe you can't sell ads against articles that start with "you probably can't use this, but..."?).
So I'm truly wondering: is the software development industry just chock full of greenfield developers, building new projects for clients that are then released and enjoy a short existence until complete replacement, for whatever reason? Or are there hordes of brownfield programmers silently laboring away in the ADO.NET/T-SQL/VB.NET software mines, looking wistfully up at the sunshine of Entity Framework 5.0, Haskell, et cetera?
How do we even measure that? Salaries (wages?) paid to software engineers in the two categories? How do we measure THAT? Maybe... revenue generated from selling said software? (That assumes the crappy old software sold by XYZ Corp. actually has maintainers.)
My question: does anybody have any numbers that speak to how much of the industry is greenfield vs. brownfield?