
It seems to me that the programming industry is in a race to the bottom. If we take the practices of:

  1. Not taking time to implement best practices
  2. Using other people's code as much as possible (custom code as a liability)
  3. Using increasingly higher level languages to improve productivity
  4. GUI based development "tools" that greatly simplify "programming" and do not require people to understand the plumbing behind the code

then these things imply to me that we are in a race to become like any other office worker. It is in the employer's interest for the work not to require skill (workers are easier to replace) and for components to be prebuilt (less project time).

My questions are: a) is there a misalignment between skill and the economic interests of the employer? And b) if there is, how do you mitigate it in order to enforce professional standards?

yannis
q303
  • Well, someone still has to make those tools. So in some sense it's a race to keep good programmers away from boring work. – Jeremy Feb 07 '11 at 20:47
  • Why anyone believes that the future of software development will boil down to dragging and dropping components is beyond me. Seriously, do you honestly believe it will be that easy? – Pemdas Feb 07 '11 at 21:10
  • @Pemdas: Human fear of progress and/or change. The steam locomotive was perceived as evil 150 years ago. –  Feb 07 '11 at 21:20
  • @pierre I am not sure I understand where you are going with that. – Pemdas Feb 07 '11 at 21:22
  • @Pemdas: There is a typo in my comment. I wanted to say that we tend to be afraid of any improvement/change that may negatively affect the status quo. q303 made me think about that. –  Feb 07 '11 at 21:32
  • Yeah... I am more afraid that all of these high-level abstractions will make my life harder, not easier. – Pemdas Feb 07 '11 at 21:47
  • @Pemdas: you and I don't think it will be that easy, but the people making hiring decisions do. – zetetic Feb 07 '11 at 21:58
  • This argument comes up again and again, but if you look back at the last time people mentioned it you see how dead wrong they were. – Nick T Feb 08 '11 at 01:15
  • @q303, what would you most like to be using when on a deadline? A shovel and wheelbarrow, or a Bagger 288? http://en.wikipedia.org/wiki/Bagger_288 –  Feb 08 '11 at 09:10
  • Dijkstra, is that you? – l0b0 Feb 08 '11 at 09:32
  • @Andersen: I'm not saying it's not the best way to solve the problem. However, if you look at the 3 trends I mentioned, there's something they all have in common: less skill. – q303 Feb 08 '11 at 16:31
  • Who is taking time to determine the best practices? Who has the authority to say, "This is how software should be made," currently? – JB King Feb 09 '11 at 21:33
  • This question is more than 10 years old and boy was it right on the dot! – ppbitb Jan 10 '23 at 15:20

13 Answers

59

To the trends you mention I would add one more, which IMHO explains them:

**There are vastly more programmers (needed) than ever before.**

The number of tasks which require or include programming is ever increasing, and at an even higher rate than the number of programmers. Nowadays there are several microchips in an average car. In 5 years there may be a chip in your fridge and your toaster. In 10 years, your underwear?... And someone needs to produce all that software to make these work. So every possible effort is made to automate whatever is automatable and to improve "productivity" (however it is defined). And more and more fresh brains are recruited.

This implies that the majority of today's active programmers are inexperienced and/or ill-prepared for their job. It takes several years to get to an adequate level of experience, and it takes constant learning to keep yourself there. The bottom line is, more and more programming jobs are becoming less and less challenging. But there are still enough challenges for anyone who is looking for them.

Let me play the devil's advocate against your points above:

Not taking time to implement best practices

A lot of people don't; a lot of people do. Ten or so years ago, when I first discovered unit testing and the agile approach, none of my colleagues had the slightest idea what they were. Nowadays they are almost standard material at universities, so many fresh graduates already understand them.
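
To make "best practice" a little more concrete, here is a minimal sketch of a unit test using Python's built-in unittest module. The function under test is a made-up example for illustration only, not anything referenced in this thread.

```python
import unittest


def apply_discount(price: float, percent: float) -> float:
    """Return the price reduced by the given percentage (hypothetical example)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class ApplyDiscountTest(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_rejects_out_of_range_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)


if __name__ == "__main__":
    unittest.main()
```

Running the file directly (or under any test runner) documents the expected behaviour and catches regressions cheaply.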

Using other people's code as much as possible (custom code as a liability)

As opposed to what? Reinventing the wheel? Or using other people's code to avoid that?

I think it is important to note that we are paid (mostly) to solve problems, and writing code is not the end, only the means to that. If a problem can be solved without writing a single line of code, it still makes the client happy. Especially if this way we manage to produce a more reliable solution faster and cheaper. I don't see any problem with that.

Using increasingly higher level languages to improve productivity

As opposed to coding everything in assembly? ;-)

GUI based development "tools" that greatly simplify "programming" and do not require people to understand the plumbing behind the code

IMHO any tool can be misused. That is not to say that GUI builders are necessarily perfect or even good - most (or at least some) of them are usable within their limits. But if someone doesn't know those limits, is that a problem of the tool or of its user?

In general, I believe (although I have no evidence to prove it) that back in the punch card and machine code days, roughly the same proportion of existing code was horrible as now; it's just that both

  • the overall amount of code, and
  • the chances of outsiders ever seeing such code

were much, much smaller.

Now, with the Internet and the Daily WTF, we get exposed to the worst examples day after day. It's a bit like watching all the news about terrorism and earthquakes and divorcing celebs, and then crying out about how dangerous and immoral the world has become.

Péter Török
  • +1 I guess we are in an "every solution breeds new problems" feedback cycle -- can't say if it's a negative or a positive loop. – Maglob Feb 07 '11 at 21:11
  • +1 If only good coders rewrite everything from scratch, then I will happily be a crappy programmer. – AndrewKS Feb 07 '11 at 21:47
  • I don't want chips in my underwear unless the uptime is acceptable! –  Feb 07 '11 at 23:06
  • @Thorbjørn: what's acceptable uptime for underwear? If it was self-washing then I'd worry about uptime... otherwise you gotta take them off every day anyway (I hope!) – Dean Harding Feb 07 '11 at 23:52
  • +1 for we are paid (mostly) to solve problems, and writing code is not the end, only the means to that – Zekta Chan Feb 08 '11 at 01:25
  • +1 for "we need more programmers", very astute. Also, I'll happily offload the simple tasks to any moron that can drag'n'drool. I've trained a tech writer to add linked help, I've trained BA's to design prototypes in VB, it's all about maximising the time I spend solving problems that other people I work with can't solve. –  Feb 08 '11 at 04:11
  • In my experience, the number of good programmers seems to be fairly static, so the ratio of good to bad is going down. – Skizz Feb 08 '11 at 09:51
  • I disagree with that big fat bold line. There's a lot of NNPPs out there. There is a need for better programmers, not for more. This is also what many managers don't understand. IMHO, there are far too many programmers, many of them are not only crappy, but simply aren't eager to get better and lack the passion for this job. Managers then hire tons of poor programmers and pay a lot of money for development tools. And I don't speak of tools that push productivity even further, but of tools that lower the barrier in the first place. – back2dos Feb 08 '11 at 10:32
  • @back2dos, I don't consider this a disagreement. The bold line states the trend, or the companies'/managers' view if you wish; you state the developers' view. I fully agree with you that it would be ideal to have better programmers, more practical education, and mentoring, to raise the level of quality in the industry. However, the sad reality is different. Software has become a commodity, so many people expect to get it fast and cheap, without understanding the implications of such decisions (such as short vs long term costs etc.). – Péter Török Feb 08 '11 at 10:42
  • @back2dos, moreover, NNPPs shouldn't need to stay like that forever; all of us have been NNPPs temporarily, whenever we started in a new workplace. It also depends on the more senior teammates whether they let the juniors and newcomers produce crap, or prod them to learn and develop better working habits. – Péter Török Feb 08 '11 at 10:46
  • OK, then we agree :) – back2dos Feb 08 '11 at 10:59
  • @Skizz, my theory about this is that back then, if you had crap code in e.g. assembly, it almost always failed fast and spectacularly. So only those programs which were already of a fairly high standard got into real use. Nowadays, with the plethora of tools helping you to create safer and higher level code (and also the scope of projects being vastly bigger), there is an awful lot of room for half-understood, crappy and buggy code which still manages to do something useful for someone, so it gets into production and is maintained forever :-( – Péter Török Feb 08 '11 at 11:25
  • "In 5 years there may be a chip in your fridge and your toaster." Welcome to 2016! Your home appliances have had µControllers in them since at least the '80s. – oosterwal Feb 08 '11 at 17:43
  • @oosterwal, fair enough, let's clarify: do your home appliances have their own URL yet? – Péter Török Feb 09 '11 at 08:33
  • @Péter Török: Some of mine do, but, to your point, that's only because I write the software that goes in them. – oosterwal Feb 09 '11 at 15:50
29

You summarize the situation correctly, but completely misinterpret the meaning.

As software becomes more powerful, the tasks it takes on scale with it. So sure, it may not be necessary nowadays to be a database programmer to create a phone contact database when you can use Access. And it may not be necessary to be a web programmer to set up a blog when you can just go to wordpress. But, while 15 years ago you would have needed to be a programmer to do those things, what programmers do now is orders of magnitude greater than what could be done 15 years ago.

Let me put it this way: 100 years from now, it will be trivial to set up a system as complex as Facebook or Google. I'm not talking about their web pages; I mean their data centers. Anyone will be able to do it. It'll be built into phones, assuming we even still use them. BUT, there will still be programmers, and while they may not be working on such 'trivial' systems 100 years from now, they will be working on systems so much more complex and sophisticated that no one here can even begin to comprehend their scope and scale. And those systems, like the ones now, will be far out of reach of the average office worker, because some people, called programmers, will choose to specialize in pushing the technology of that era to its extremes.

GrandmasterB
  • Interesting point of view... – q303 Feb 07 '11 at 21:54
  • I'd like GrandmasterB to write some sci-fi. – ocodo Feb 07 '11 at 22:05
  • Can't wait to have my own google data center on my phone. – Martin York Feb 08 '11 at 00:16
  • @Slomojo Not as science fiction as you may think. When I was a kid in the 3rd grade, I visited a computer demonstration at the college near my house. It was an experimental showcase for the public. One of the programs on display was a playable game of tic-tac-toe. At the time it was considered an oh and ah moment to be able to play a game against a computer. This was in the late 60's. What will the oh and ah moments be 100 years from now? – Bill Feb 08 '11 at 00:32
  • I was referring to the way he told it, not suggesting that the content was the stuff of *fantasy*. I was around when **Pong** was revolutionary; I'm fairly sure that Nintendo kids can relate to the exponential changes in technology too. – ocodo Feb 08 '11 at 00:48
  • @Slomojo: Read some Neal Stephenson. :) – Macke Feb 08 '11 at 12:40
  • @Marcus - appreciate the tip, I'll have a look. – ocodo Feb 08 '11 at 12:45
  • @Slomojo: Diamond Age is a cool book by Stephenson that I'd recommend. I realized that I was previously actually thinking about Charles Stross's Accelerando: http://www.amazon.com/Accelerando-Singularity-Charles-Stross/dp/0441014151 . The first third of the book is amazing and a must-read for everyone. :) – Macke Feb 08 '11 at 13:00
  • I don't actually think there is going to be a lot of progress in 100 years on the computing front. What we need are advances in energy. We need to set the stage to continue innovating at this rate by developing better forms of energy. – q303 Feb 09 '11 at 17:38
18

I've read that sort of thing for forty years now, and I wasn't in on the beginning of the predictions. It hasn't happened yet.

COBOL was the original development tool intended to remove the need for business programmers, and was a much higher-level language than assembler. Code libraries (to avoid having to write one's own code) are of similar antiquity.

Every so often, something comes up that allows nonprogrammers to do something more like programming work. There were the 1980s "fourth-generation languages", fancy spreadsheets like Excel, report generators, and the like. What they have uniformly done, if successful, is remove some of the scutwork from programming and allow programmers to do other, more interesting, things.

This pattern won't last forever, but I'm going to need something more than theoretical arguments and predictions to convince me that programming is indeed going downhill.

The issue with everything from COBOL to modern development tools is that they don't replace the ability to take an inexact specification and turn it into something precise and useful. That is the fundamental ability of programmers, and it's why we're not going away for a long time.

David Thornley
7

I think you've asked a very poignant question.

Creating GUI coding tools puts programmers out of a job as much as robots put assembly line workers out of a job. In my opinion, while there is a loss of jobs, there is also a shift in where the next jobs are.

The technology to get a task done changes, but the task still needs to be completed: The cars still need to be made/assembled before they can be driven; the code/business logic still needs to be put together for the application to function.

Shifts in technology present choices for programmers: Learn GUI-based programming or program GUI tools ... or ... something else entirely.

There can be a misalignment between the skills of the employees and the interests of the employer, but not for long, especially if the employer is savvy. It is therefore in the best interest of both employer and employee that they pursue what is to their mutual benefit. But one will inevitably be ahead of the other. Hopefully it is you (-:

Best wishes,

KM

KM.
  • My thinking is to focus on more specialized software development: mathematical, statistical, and computationally intensive programming (though the third may be going out of style with increases in VM power). I don't think these specialized domains are in that race to the bottom, though I don't have experience in them, so I could be wrong. – q303 Feb 07 '11 at 21:22
  • @q303: There will always be plenty of applications that will take up all available computer power. – David Thornley Feb 07 '11 at 21:36
3

Assembly and FORTRAN programmers probably said the same things when structured programming and widespread interpreters were coming into the picture.

At the end of the day, software is a creation meant to automate something which previously had been done by hand. An accounting department in a moderately sized corporation would previously have required dozens of people; now all that work can be accomplished by one or two. As a result, the goal posts have moved, and we now expect that extra capability as standard from an accountant.

It takes Pixar longer to render its movies than ever before. Despite the enormous increase in computing speed, animators have demanded ever-increasing complexity and detail in their scenes to go along with it. Toy Story-caliber animation is not acceptable in 2010 the way it was in 1995. As their tools have advanced, so have their capabilities, and thus their expectations.

When drag and drop or other programming methodologies are commonplace, the world will demand solutions to even more challenging and complex problems, and it will need programmers using those newer, fancier tools to solve them. The goal posts will move.

whatsisname
3

While I agree with most of the answers that point out that there will still be plenty of work to do, I will give a different answer for you to consider:

Yes it is, and it SHOULD be

I am here to design a solution to problems that others cannot. Anything in the toolset that takes my time away from solving problems to deal with all the small details of how to implement my design is waste.

The only reason I would fear going to a higher level language or a simpler and more abstracted tool is that those tools often make assumptions that get in my way and require my time to work around the assumptions to get the desired implementation.

There are always more problems to solve than there are good problem solvers. Even if the entire dev chain became so simple that preschoolers could use it, most of the solutions designed would be insufficient or inefficient enough that people would pay for a better one, because most people are just bad at seeing the correct solution to a problem, with all the edge cases and what-ifs that you need to consider to make a good solution.

Our job is to solve problems better than most of the rest; the toolchain's complexity is not terribly relevant in the big-picture view, as long as it gets out of the way and lets you build something good.

Bill
1

Even though programming technologies may change, the underlying complexity of a software product is still there. Even if software could be written entirely using diagrams or flow charts (or some other 'simple' way), all of the complexity of the software would still need to be understood and addressed. For that reason, it is important for employers to have programmers who know the company's products inside out in order to update them or add new features. And it can take quite a while for employees to learn a software product.

Jon Onstott
1

No matter what you can automate or pull off the shelf, most packaged software falls into two categories:

  1. It's simple to use, but it doesn't quite match what the business really needs
  2. It's highly customizable, but it takes a specialist to understand and leverage the customization

I guess I forgot a third category: standard productivity software (email, browser, word processor, etc.). The software in category one leads businesses to hire software developers to customize the off-the-shelf software (if that's possible). As for category two, they might as well hire a developer, because the person who knows that customizable system inside and out is either highly sought after (think SAP, PeopleSoft, etc.) or a dying breed (think any system similar to SAP and PeopleSoft that doesn't quite have the same market penetration).

There will always be a need for developers. What I am seeing is that more of what used to be manual, tedious, repetitive work has become automated (think writing code for data access by hand versus using an O/RM). This allows developers to deliver more value to the business with less effort.
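
To make the O/RM point concrete, here is a rough sketch contrasting hand-written data access with an ORM-style query. It assumes Python with SQLAlchemy (2.x-style API); the customers table, its columns, and the values are all hypothetical.

```python
import sqlite3

from sqlalchemy import String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


# Hand-written data access: you write the SQL, bind the parameters,
# and map the rows back to your own data structures yourself.
def customers_in_city(conn: sqlite3.Connection, city: str) -> list[dict]:
    rows = conn.execute(
        "SELECT id, name, city FROM customers WHERE city = ?", (city,)
    ).fetchall()
    return [{"id": r[0], "name": r[1], "city": r[2]} for r in rows]


# ORM-style data access: the mapping and the SQL generation are handled
# by the library, so most of the repetitive plumbing disappears.
class Base(DeclarativeBase):
    pass


class Customer(Base):
    __tablename__ = "customers"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100))
    city: Mapped[str] = mapped_column(String(100))


engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Customer(name="Ada", city="London"))
    session.commit()
    londoners = session.scalars(
        select(Customer).where(Customer.city == "London")
    ).all()
```

Neither approach is "better" in the abstract; the point is simply that the second automates the tedious, repetitive part.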

Michael Brown
1

I don't buy your arguments:

  1. Not taking time to implement best practices

Except for that one.

  2. Using other people's code as much as possible (custom code as a liability)

Code reuse is a best practice. Code that has been used is code that has been tested. Of course, you should use code from good sources, code that is maintained and used by many.
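
As a tiny illustration of "used code is tested" (a sketch, not tied to anything specific in this thread): correctly parsing CSV means handling quoting, embedded commas and escaping, and the standard library's csv module has already been exercised on those edge cases by countless users, so reusing it beats hand-rolling a parser.

```python
import csv
import io

raw = 'name,city\n"Doe, Jane",Oslo\n'

# A naive hand-rolled split breaks on the quoted, embedded comma:
naive = [line.split(",") for line in raw.strip().splitlines()[1:]]
assert naive == [['"Doe', ' Jane"', 'Oslo']]  # wrong: three fields instead of two

# The well-used library handles the edge case correctly:
rows = list(csv.DictReader(io.StringIO(raw)))
assert rows == [{"name": "Doe, Jane", "city": "Oslo"}]
```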

  3. Using increasingly higher level languages to improve productivity

Productivity isn't bad per se - is it?

  4. GUI based development "tools" that greatly simplify "programming" and do not require people to understand the plumbing behind the code

If the tool does the job: use it. If not: refuse it. If the promise holds, and people really don't need to understand the code - well done! You seem to be saying that this is a promise which doesn't hold?


user unknown
  • The point is that to be effective, you need less skill. There is nothing inherently bad about GUI based development "tools." What is bad about them is that repeated use reduces the skill level required to do what we do. The same thing goes for using other people's code: eventually, you start programming on the Google platform. Finally, higher level languages abstract a lot of subtleties away that, again, require skill. None of this is bad from an employer, project management point of view. It does, however, make me question the profession's future. – q303 Feb 08 '11 at 16:27
  • It all depends on your requirements. When I don't need a fine-tuned special-purpose sorting technique for specifically distributed data, I can perfectly well use libraries with some quicksort algorithm. Why should I learn it before I need it? Maybe I need the time to learn something else - font kerning, database access, GUI design... - you name it. Skills are nice to have, but there are too many skills you could have. Sometimes it is right to say: YAGNI. – user unknown Feb 08 '11 at 16:34
1

Visual programming is no less valid, and no more deserving of scorn, than text-based programming.

There are certain deficits and challenges when programming visually, but the peril of ignoring the underlying component plumbing is not exclusive to visual components or, more relevantly, visual designers. Ignoring the underlying plumbing is a risk in any kind of development.

If you get a chance to try Labview, you might see the potential (or even a specialized variant of it in the Lego NXT environment). While it is not without flaws or shortcomings, there are inherent benefits. Seeing is believing.

JustinC
0

Programming practices not only increase productivity and reduce costs for certain types of development (your race to the bottom), but also increase application capabilities and customer expectations (thus encouraging a race to the top). Witness the bonuses that Google and Facebook are paying to get top technologists.

hotpaw2
0

There's no other engineering profession in existence that strives towards change as much as programming. As soon as you simplify something, you open a can of new problems that breed new ideas.

So I wouldn't soil other people's endeavors to share code and solutions, which help us move towards novel challenges, ideas and user experiences, with the individual practitioner's bad practices, bad habits and manners devoid of humanism.

Filip Dupanović
-2

If we take the practices of:

  • Not taking time to implement best practices
  • Using other people's code as much as possible (custom code as a liability)

WTF? Did you mean for this list to be inconsistent? Lists should come from the same side for each element rather than switching back and forth without warning. Thus your list should read:

If we take the practices of:

  • Not taking time to implement best practices
  • Not using other people's code as much as possible (custom code as a liability)

Edward Strange
  • @NoahRoberts: Your edit changes the meaning of the second bullet point. This is also not an appropriate answer to the question and should've been a comment instead. – Adam Lear Feb 07 '11 at 21:35
  • @Anna - this wasn't an edit, of course. That's why it doesn't appear as changes to the original question. It IS an answer because it addresses a flawed premise in the question. – Edward Strange Feb 07 '11 at 21:39
  • What is the premise? – q303 Feb 07 '11 at 21:47
  • @NoahRoberts: It is a bit oddly worded, but I believe the list is consistent in its meaning -- q303 is taking "using other people's existing code instead of writing custom code in-house" as a supporting point in his argument. – Adam Lear Feb 07 '11 at 21:54
  • @q303 - Apparently, using other people's code as much as possible is a bad practice. That's what I'd get out of your list, anyway. – Edward Strange Feb 07 '11 at 22:43
  • @Crazy Eddie: Well, if you don't understand it, I think it is a bad practice to just use it. – q303 Feb 09 '11 at 17:39