12

I believe I read somewhere that Google has a rule of thumb that an excellent developer is around 300 times more productive than an average one. Does anyone have rules of thumb used by large companies, or are there perhaps even empirical studies on this?

David
  • 4,449
  • 6
  • 35
  • 48
  • 2
    How do you quantify/measure "Excellent" and "Average"? What unit is it measured in? – FrustratedWithFormsDesigner May 02 '11 at 21:08
  • 5
    300 times? Damn, that's a lot! I personally figured 10 times might be reasonable. But if Google says 300, I'd better change my mind :) – Martin Wickman May 02 '11 at 21:11
  • @FrustratedWithFormsDesigner Rules of thumb are not well defined. I would be interested to see any studies on this, however they choose to define it. – David May 02 '11 at 21:13
  • @Martin Wickman I have thought a lot about this, and I believe the biggest reason they are more productive is that they choose the right tools (like programming languages) from the beginning, and don't, for example, have to switch to a different language after production tests because of scalability issues. – David May 02 '11 at 21:16
  • 12
    Attempts to put numbers on it tend to break down due to the simple fact that an excellent developer can do things which an average one cannot - no matter how much time they are given. – Carson63000 May 02 '11 at 21:17
  • 1
    @David: Well, my department's application is written in Java, runs on the WebSphere platform, and they use RAD to work on it. I guess I can't be as productive as possible because the tools are already chosen as corporate standard, and they're not what *I* would have chosen (vi, LISP, and a custom web server written in C). Bummer. :( – FrustratedWithFormsDesigner May 02 '11 at 21:20
  • @David. I agree with your reasoning, but 300... If we assume productivity is measured in time, then a project that a team of excellent programmers completes in *six months* would take an average team **150 years** to complete. – Martin Wickman May 02 '11 at 21:20
  • 1
    @Martin Wickman Given that such a team would die out several times and freshly hired average programmers would have to read code of average quality that may be more than a hundred years old, I believe your calculation supports that the number should be 300. – David May 02 '11 at 21:39
  • 1
    Uhm, I think it's 200. At least that's the error code Borland Pascal will give you when you try to calculate it. – back2dos May 02 '11 at 22:35
  • If you define excellent developer as a skill level, then it rather depends still on their work ethic. Experts can be lazy as hell, particularly if they get bored. People who are learning may still have great enthusiasm. – Orbling May 11 '11 at 08:09
  • Such extreme differences are only plausible if we assume that excellent developers use the API while average programmers reinvent the wheel. – user281377 May 11 '11 at 08:25
  • @AmmoQ. Yeah, that's a typical difference between the excellent and the average. – David May 11 '11 at 08:33
  • How do you quantify "productive"? That's sometimes the difficulty. – temptar May 11 '11 at 09:37
  • 6
    A factor of 300 basically says that the excellent programmer can do in 1 day what it takes the average programmer 1 1/2 years. In general it doesn't seem plausible that 300 is even remotely realistic. However, there are certainly software systems that an average programmer will never be able to design and get working. In that case, the productivity factor is infinite. – Dunk May 11 '11 at 16:04
  • Check out [this article](http://www.joelonsoftware.com/articles/HighNotes.html) by Joel Spolsky, it contains some data that will help you reach your own conclusions. – Soronthar May 02 '11 at 22:25
  • And define "productive". Define "good programmer". Some days I write 1000 lines of code, some days I write 10. Am I 100 times as productive on the days I write 100x as many lines of code? And if so, am I a 100x better programmer on those days than on others? – jwenting Sep 26 '14 at 12:50
  • Don't trust everything Google says, just because it's branded Google. 300x is clearly ackamarackus; use your own intelligence to evaluate, don't blindly follow the crowd/trend. – Kamafeather Jul 14 '22 at 14:41

6 Answers

20

Different studies have found different answers to this question, usually in the 10-fold to 100-fold range.

The one I trust most is the classic book Peopleware. Its authors report on a series of "coding war games" they ran between programmers from different companies, with each programmer working in his or her own corporate environment. They found that factor-of-10 differences between developers were common. However, they also found that much of the difference could be attributed to the corporate environment (quiet working conditions, more floor space, lots of whiteboards, etc.). Their data couldn't answer whether good developers gravitate towards better corporate environments or whether the environment itself enables developers to produce more.

Extreme examples can fall well outside this range. And, of course, there is the often-made observation that certain tasks can only be done by programmers who have learned certain things. For instance, I would find it fairly easy to knock off a dynamic programming problem that most professional developers I've seen would never be able to figure out. That happens to be a technique that I have and they don't, and they will probably never think of it on their own. Of course lots of other programmers have that particular technique, and it is no harder for them than it is for me...
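
(To make "that particular technique" concrete, here is a minimal, purely illustrative sketch, not taken from any study: the classic minimum-coin-change problem solved with dynamic programming. It is a few lines if you already know the technique, and very hard to reinvent from scratch if you don't.)

```python
def min_coins(coins, amount):
    """Fewest coins from `coins` that sum to `amount` (-1 if impossible).

    Classic bottom-up dynamic programming: best[a] holds the optimal
    answer for every sub-amount a, built up from smaller sub-amounts.
    """
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount] if best[amount] != INF else -1

print(min_coins([1, 5, 12], 16))  # 4 (5+5+5+1); a greedy approach would use 5 coins (12+1+1+1+1)
```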

btilly
  • 18,250
  • 1
  • 49
  • 75
6

The studies I've seen suggest a factor of ten between best and worst (measured by time taken to accomplish tasks of small or medium difficulty), and my interpretation of the data suggests this might be conservative.

It could be that the lower end degrades disproportionately fast given big, complicated, and/or innovative projects. This is only speculation, but if it does apply it might lead to a much greater differential on Google projects.

It could be that Google is measuring how many programmers are needed to do a given task in a given time. Brooks suggested that three times the number of programmers could do twice the work in a given time. This suggests that, to equal one 10, you'd need something like thirty 1s, and so you might rate a 10 as thirty times as productive as a 1. (And, yes, this implies large projects so you can get large teams up and running. I can complete a simple project in less time than it takes to introduce thirty people to each other.)

Neither of these are likely to give us a factor of 300. Suppose that, for a given type of project, an excellent programmer is forty times as good as a mediocre one, which requires that the mediocre one's effectiveness degrades four times as much. Using the "how many programmers" formula from the last paragraph, it would seem that somewhere close to three hundred floundering 1s could match a challenged 10.
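
For what it's worth, here is a small back-of-the-envelope sketch of that scaling argument (Python, purely illustrative). The only assumption is the "three times the people for twice the work" rule above, compounded at every scale, which makes team output grow as roughly the 0.63 power of head count:

```python
import math

# Assumption (from the paragraph above): tripling a team only doubles its
# output, so output grows as n ** (log 2 / log 3), i.e. roughly n ** 0.63.
EXPONENT = math.log(2) / math.log(3)

def ones_needed(productivity_gap):
    """How many 1x programmers it takes to match `productivity_gap`
    units of output from a single excellent programmer."""
    return productivity_gap ** (1 / EXPONENT)

print(round(ones_needed(10)))   # ~38 average programmers to match one "10"
print(round(ones_needed(40)))   # ~346 if the gap widens to 40x on hard projects
```

Under that assumption, matching a 10x programmer takes a few dozen average ones, and a 40x gap lands in the neighborhood of 300.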

This is a lot of speculation, but it would explain the 300 number. It's also possible it was misreported, or that Google's just exaggerating the difference for some reason.

David Thornley
  • 20,238
  • 2
  • 55
  • 82
  • 1
    I would be interested to see those studies. Do you have a reference? – David May 02 '11 at 21:47
  • @David: Specifically, de Marco and Lister's book "Peopleware", and a "Hitting the High Notes" column in Joel on Software. – David Thornley May 02 '11 at 21:49
  • You're probably right about the difficulties that are/were specific to Google - large, distributed systems were relatively rare before, and Google no doubt struggled to find programmers who could understand the requirements. –  May 03 '11 at 02:26
6

I don't see how you could even put a ballpark figure on such a comparison. There are just too many factors involved, including the environment (physical and development) the engineers are working in, the team they are working with, the type of problems to be solved, etc. And then there is the question of how productivity is measured.

tehnyit
  • 1,469
  • 2
  • 10
  • 16
4

It's hard to say, and it rests on a lot of ill-defined terms: how efficient is the "average" programmer, and how fast is an "excellent" programmer?

300 does seem a bit high, though; I'd put it more in the 50-100 times range.

The main thing I think people underestimate about programming is that it's not only about raw skill. An excellent programmer will have built up libraries and tools. Obviously, if I just have to dip into my library to fetch something I've already done while you have to spend a month coding it, I will be vastly more efficient.

If I've built code-generation tools where I can just write a configuration file and press a button to generate a bunch of classes, while you spend a couple of days coding those classes by hand, of course I'm going to be far more efficient.
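
(As a toy illustration of the kind of tool I mean, purely hypothetical and far simpler than anything real: a few lines that read a tiny configuration and emit class stubs.)

```python
# Toy code generator: reads a tiny in-memory "configuration" and prints
# class stubs. Purely illustrative; a real tool would read a config file
# and emit far richer code.
CONFIG = {
    "Customer": ["name", "email"],
    "Order": ["customer_id", "total"],
}

def generate_class(name, fields):
    """Emit a plain data class with a constructor for the given fields."""
    args = ", ".join(fields)
    assignments = "\n".join(f"        self.{field} = {field}" for field in fields)
    return f"class {name}:\n    def __init__(self, {args}):\n{assignments}\n"

for class_name, class_fields in CONFIG.items():
    print(generate_class(class_name, class_fields))
```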

That being said, never think you've reached your peak as a programmer. I find that programmers are like Moore's law: they double in efficiency every two years :)

Sharpen your saws, gentlemen.

Homde
  • 11,104
  • 3
  • 40
  • 68
-1

It's certainly not a linear scale. I do believe a good developer (strong SOLID OO skills) is worth much more money than an average developer - but it depends on what the client needs. If the code is likely to change or be reused in the future, then definitely get one strong (and expensive) developer. If the code is short-term and will be disposed of ... get a few copy&paste / spaghetti developers. [13 yrs experience as an OO developer]

john
  • 1
-2

Millions (hundreds of millions, perhaps) of people play soccer, yet very few can score a goal in the semifinal of the World Cup. You need to remember that excellent developers are few, and they practice their mental skills relentlessly, always improving, always sharpening them. Indeed, in the coders' World Cup, they score the goals on demand, under very high pressure, keeping their cool.

-- This Rah-Rah Was Brought To You By The Church of Programming. Code Contributions Welcome!

Christopher Mahan
  • 3,404
  • 19
  • 22