130

Suppose I give my developers a screaming fast machine. WPF-based VS2010 loads very quickly. The developer then creates a WPF or WPF/e application that runs fine on his box, but much slower in the real world.

This question has two parts...

1) If I give a developer a slower machine, does that mean that the resulting code may be faster or more efficient?

2) What can I do to give my developers a fast IDE experience, while giving 'typical' runtime experiences?

Update:

For the record, I'm preparing my even-handed response to management. This isn't my idea, and you folks are helping me correct the misguided requests of my client. Thanks for giving me more ammunition, and references to where and when to approach this. I've +1'ed valid use cases such as:
- specific server side programming optimizations
- test labs
- possibly buying a better server instead of top-of-the-line graphics cards

makerofthings7
  • Maybe have them test the application in a virtual PC! – Mark C Oct 21 '10 at 19:13
  • I'm speechless that this is even a question. How could it result in anything other than slower development and poor morale? – Fosco Oct 21 '10 at 19:23
  • A slower machine will always slow down progress. Of course you should give a faster machine to those who really work hard and need it. – Junior Mayhé Oct 21 '10 at 22:52
  • Develop on the state-of-the-art. Test on the worst machine you can find. – Adam Oct 21 '10 at 23:34
  • This question could only be asked by someone who does not do development – Hogan Oct 22 '10 at 03:52
  • @Hogan - I do my fastest WPF coding in XMLPad on a machine with a poor GPU. I get instant feedback on what is fast and what is not. The unfortunate news is that management got wind of my practice and wants to apply it to other aspects of development. – makerofthings7 Oct 22 '10 at 04:50
  • Does cleaning the floor with a toothbrush rather than a mop result in a cleaner floor? Sure it does, eventually. A mop operator can't spot the dirt from 150cm away quite as well as a toothbrush operator from 30cm away. Don't try it with a large floor. – dbkk Oct 22 '10 at 09:33
  • The screaming fast machines are not from the real world? What world are they from? In any case, the software you design and build now is going to be around and run on faster machines that don't even exist yet. Doesn't it make more sense to write for those instead of machines that are being phased out as you write your code? – MAK Oct 22 '10 at 09:44
  • @MakerOfThings7 I must admit that my relatively slow PC makes me optimize my programs to run faster and faster. From that, I have now learned to make them as fast as possible from the beginning. – Aristos Oct 22 '10 at 11:10
  • Horrible question. I mean, c'mon. You increase programming efficiency by giving the best tools to the programmer, and then you do load/capacity/failure testing on environments that are much more constrained (in terms of resources) than the target environments the app is supposed to run in. It's not rocket science. – luis.espinal Oct 22 '10 at 11:13
  • If we make programmers first type their code on a typewriter, will it force them to be more efficient because every typo means they have to start over? – JD Isaacks Oct 22 '10 at 14:03
  • @Mark C: not all virtualization environments have a large performance drop. One of the very recent developments of virtualization is to build virtualization capabilities into modern hardware to operate at about the same speed as the host machine. – Ken Bloom Oct 22 '10 at 14:20
  • I agree with everyone who says this is a bad question. I mean, why is this question getting up-votes like crazy, exactly? The answer is obvious to everyone who does programming, I'm sure. – n1ckp Oct 22 '10 at 14:36
  • Note to self: never work for MakerofThings7 – matt b Oct 22 '10 at 17:13
  • @matt b - Rather, you probably don't want to work for my client... they are the ones proposing this insanity... I'm just the messenger who (with the help of SE) is fighting back! – makerofthings7 Oct 22 '10 at 18:37
  • Ask yourself (well, management): is there really that much variability between the machines they currently develop on and what the users will have? I mean, not many people are still running around with a Pentium 100 or under. What are the current dev machines? What are the target machines? What is your application (you seem to be talking about a server)? Give us this info and we can find a solution. – n1ckp Oct 22 '10 at 23:43
  • Seems to be a case of identifying a valid problem but coming up with the wrong solution. – Jon Hopkins Nov 12 '10 at 09:23
  • Bad questions need to be asked, too. The answers will be useful to any programmer who finds themselves in a situation where the people doing the purchasing decide this is a good way of cutting costs - under the guise of improving quality. That's why I up-voted. – xnine Nov 20 '10 at 03:25
  • @Fosco - there were developers decades ago. They had much slower machines than we have now, yet they were not noticeably demotivated. Text editors are not resource hogs. Obviously a faster machine makes a difference when compiling etc., but using a low-power PC (with passive cooling and virtually no noise) may be much better for your productivity than a high-end PC that spends most of its time twiddling its thumbs, yet sounds like a helicopter taking off. If it's a motivation issue, maybe it works for you, but I really don't understand the powerful-PC-as-status-symbol thing. – Dec 07 '10 at 13:00
  • @Steve314 Decades ago, they didn't have faster machines taken away from them and replaced by slower machines. That's the demotivator. A fast PC isn't a status symbol, it's simply the best tool for the job. – Fosco Dec 07 '10 at 13:48
  • @Fosco - in effect they did, as more and more terminals were attached to the same old mainframe, so response time got slower and slower. –  Dec 07 '10 at 13:58
  • The build process itself will surely tend to be streamlined at least... Wouldn't really make a guarantee for the product however... – haylem Dec 07 '10 at 14:13
  • @Steve314 You're really reaching... – Fosco Dec 07 '10 at 16:46
  • @Fosco - more arguing the trivial but easy point, really. I don't claim that better tools are irrelevant, but I do think that faster machines have diminishing returns for productivity, and in many ways the recent speed increases have given only marginal improvements in developer productivity. That is, in the last five years, your productivity probably hasn't improved eight-fold (following the Moore's Law trend) - faster machines have more likely just given you back a few odd minutes here and there. It's not that big a deal - though reduced noise and other recent PC improvements may be. – Dec 07 '10 at 21:47
  • Please follow this proposal for that kind of question: [office-work-and-desk-jobs](http://area51.stackexchange.com/proposals/22377/office-work-and-desk-jobs?referrer=Nx4kn5M-Wvu5FEmYvDudhQ2) – Maniero Dec 10 '10 at 13:43
  • If performance is really important, just don't allow any commits which result in slower benchmarks (assuming the benchmarks are measuring the right thing). The WebKit project did this and it became very fast. – Benbob Dec 22 '10 at 06:46
  • I believe that we are all missing the big picture. This question indicates that the person asking does not know how to manage his developers (but is willing to learn, hence his question). So he resorts to tricks like this, etc. Perhaps the question should be something broader, like: how can I make my developers more productive? – Dimitrios Mistriotis Oct 22 '10 at 14:31

45 Answers

376

The answer is (I'll be bold and say) always

**NO.**

Develop on the best you can get with your budget, and test on the min-max spec range of equipment you'll deploy to.

There are emulators, virtual machines, and actual machines with testers, all of which can test performance to see if it's a factor.

peterchen
Steven Evers
  • Can't vote this more than once, although I wish I could. I have an aging computer that I work on and the time it takes for VS2010 to accomplish some tasks (e.g. open the project) can be quite annoying. – rjzii Oct 21 '10 at 19:27
  • Can you please make the **no** very large and bold? – dr Hannibal Lecter Oct 21 '10 at 19:28
  • The acceptance testing you do should cover performance requirements. There should BE performance requirements. If you don't test the performance, then your testers are called customers (and you charge them money). – Tim Williscroft Oct 22 '10 at 00:08
  • I agree completely. Giving a developer a slower machine won't actually produce better code. The developer will get frustrated with the machine and always have some uneasiness in their mind. It makes for worse code, and they can't concentrate much when everything gets stuck. See, one will have an IDE like Eclipse, say 2 PDF books, 2 web browsers, one instance for running/debugging (in the case of web development), a server running, and a music player ;) Give him a slow machine and he will kiss you goodbye. – Oct 22 '10 at 09:18
  • Agree with you totally; right now I am sitting at just such a slow machine (Celeron 2 GHz, 2 GB RAM), where with MySQL installed, Chrome open with 7-8 tabs, and debugging in Eclipse, it is hell... So I am impatiently waiting for next week, when a quad-core will come to me :) – artjom Oct 22 '10 at 10:50
  • +1 developing on slow machines only gives you slow (frustrated) developers. – cjstehno Oct 22 '10 at 21:45
  • The answer **no** is incorrect. The correct answer is **Nooooooooo!** – Pekka Oct 23 '10 at 14:12
  • I understand the reason for this question, as some programs that have been released lately are astonishingly resource hungry. However, anyone in their right mind would realise that giving a developer a poor-spec machine will result in delays. If you're worried about efficiency, test on lower-speed machines as everyone is suggesting, but also remember that the days of storing everything in byte arrays are long gone: not only is it confusing, but it's also a pain in the neck to troubleshoot. Allow your programmers the flexibility to work with modern techniques on modern hardware. – Reallyethical Oct 24 '10 at 13:25
  • @dr. Hannibal Lecter: How could I **not** follow that request... – peterchen Nov 12 '10 at 12:53
  • Hahaha, indeed! I didn't expect this though, too bad we didn't get any points for this ;) – dr Hannibal Lecter Nov 12 '10 at 13:30
  • Definitely not. We have to develop on the same machines that the clients use and I can't stand it. It is slow and just hurts sometimes (we also don't have full-time admin access... >_< ). Less than 4 GB of RAM is not enough to have 3+ copies of VS open with decent-sized projects. Slow development machines = slow development = frustrated developers = poor results. – bdwakefield Nov 12 '10 at 14:32
  • The answer is actually **yes** if you only test on your development machines, but there are better ways to achieve the same results. – Claudiu Jan 07 '14 at 17:56
234

Absolutely

It's also true that managers should conduct all meetings in Pig-Latin. It improves their communication skills overall to have them disadvantaged when speaking simple sentences. They'll have to rely more on facial expressions and body language to get their point across and we all know that is at least 70% of all communication anyways.

CFOs should use only an abacus and chalk. Otherwise they end up relying too much on 'raw data' and not enough on their 'gut feel'.

And Vice Presidents and higher should be required to hold all important business meetings in distracting settings like golf courses while semi-intoxicated. Oh snap...

Jay Beavers
70

1) Very, very unlikely, and your developers may put something nasty in your coffee for suggesting it. Time your developers spend waiting for the code to compile, or for the IDE to do whatever it's doing, is time they're not spending making the code better. It disrupts their mental flow, too. Keep their minds on the problem, and they'll be much more efficient at solving it.

2) Give them each a second PC representing the lowest specs you want them to actually support, with a KVM switch to go between it and their real workstation.

BlairHippo
  • I like the idea of using a KVM with an old PC for testing. Depending on the project, though, it might be cumbersome for the devs to install each new build on the slow machine. – Al Crowley Oct 22 '10 at 17:00
  • Another thing to consider is giving them an account on at least the second PC that doesn't have administrative privileges. – David Thornley Nov 19 '10 at 22:54
43

I like long compile times. It gives me more time to work on my resume.

Wonko the Sane
33

This is a terrible idea. You want your developers to be as productive as possible, which means giving them as fast a machine as possible, so they don't sit around all day waiting for things to compile. (Slightly off-topic, but it also helps not to block their access to potentially helpful sites with WebSense and the like.) If you are constrained by users who are still running Stone Age technology, then you'll need a test machine with similar specs, and you should test early and often to make sure that you aren't going down the wrong road in terms of technology choices.

o. nate
  • Whoa... wait a minute. If compiles were quick, this would no longer be possible: http://xkcd.com/303/ – gbjbaanb Apr 18 '11 at 12:25
32

Development should be done in the best environment that is feasible. Testing should be done in the worst environment that is feasible.

Yevgeniy Brikman
27

If I were given a slow machine, I'd spend my day optimising the development process, not optimising my delivered code. So: NO!

26

Embedded systems programmers run into this all the time! And there's a two-part solution:

  1. Your requirements need to specify X performance on Y hardware.
  2. Test on Y hardware, and when you don't get X performance, file bugs.

Then it won't matter what hardware your developers work on.

Once you've done that, let's say faster equipment can save your programmers a half-hour a day, or 125 hours in a year. And let's say they cost $100,000 a year with benefits and overhead (ridiculously low for Silicon Valley), or $50 an hour. That 125 hours * $50/hour is $6250. So if you spend anything less than $6250 a year on rockin' development hardware per programmer, you're saving money.
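A quick sanity check of that arithmetic (a sketch only - the salary, hours, and time-saved figures are just the assumptions stated above):

```typescript
// Break-even for faster developer hardware, using the numbers above.
const hoursSavedPerDay = 0.5;        // assumed productivity gain
const workDaysPerYear = 250;
const loadedCostPerYear = 100_000;   // salary + benefits + overhead
const workHoursPerYear = 2_000;

const hourlyCost = loadedCostPerYear / workHoursPerYear;       // $50/hour
const hoursSavedPerYear = hoursSavedPerDay * workDaysPerYear;  // 125 hours
const breakEven = hoursSavedPerYear * hourlyCost;              // $6,250

console.log(`Anything under $${breakEven}/year per programmer saves money.`);
```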

That's what you should tell your management.

Tim Williscroft pretty much said the first half of this in a comment, and in a just world, he would get half of any points this answer gets.


Added Oct. 24:

My ex-employer had that theory, and it helped them piss away about $100 million.

They're a Japanese-based conglomerate that was used to hiring programmers in Japan, Korea and China. Folks there are cool with using crappy development hardware, 13-hour work days, sleeping at their desks, and not having a life. So they figured, when they acquired a noted Silicon Valley company to do a Linux-based cell phone OS, that those silly Californians who wanted modern gear were just whiny prima donnas and didn't actually have a good reason for it (like productivity).

Four years later, the OS worked like crap, all the schedules were blown, and the customers were pissed off and terminating contracts right and left. Finally, the OS project was cancelled, and a large percentage of the conglomerate's worldwide workforce was laid off over the last year. And frankly, I wouldn't want to have been one of the executives who had to explain to the stockholders where all that money and effort went.

It wasn't just the slow development machines that caused this fiasco. There were a lot of other strategic and tactical blunders - but they were that same kind of thing where the people working in the trenches could see the train wreck coming, and wondered why the decision-makers couldn't.

And the slow gear was certainly a factor. After all, if you're under the gun to deliver on time, is it really a smart thing to deliberately slow down the work?

Bob Murphy
20

In programming, there is an old saying that "premature optimization is the root of all evil". I think you have managed to successfully create another "root" (or at least a first branch) of all evil. From now on, we can say "premature developer deoptimization is the root of all evil."

In short, the answer is that this will only slow down your development time and make further maintenance more difficult. Compile times will take longer, searching for code on disk will go slower, finding answers online will take longer, and MOST importantly, developers will start to prematurely optimize their code just to be able to test the needed functionality.

That last point is the most critical issue, and it isn't brought up in many of the other answers. You may get your first version out OK, but when you want to update the code in the future, you will find that the developers' premature optimization took the focus of your code away from good design and pushed it closer to a "gotta make this at least work to keep my job" style of code. Adding additional features will become more difficult because the optimizations chosen at the time may be unneeded and will lock your code into a path of semi-optimized hacks on top of other semi-optimized hacks.

As an example of this, imagine that your current version's minimum system requirement is a single-processor machine of somewhat slow speed. You place developers on this box, and they come up with an intricate single-threaded solution that relies on a lot of hacks because they wanted to develop the product quickly. Now, 5 years later, you have a new version of the product with a minimum requirement of a dual-processor machine. You would like to cleanly separate out parts of the program to run in parallel, but the decision you made 5 years ago, which forced your developers to write hacky software, now prevents you from using the full power of your new minimum requirement.

What you should do is add a phase at the end of your development cycle where you do acceptance testing on the lower-bound boxes. Certainly some of the code will be too slow because of the developers' faster machines, but you can isolate that part and optimize it there. The rest of your code stays clean and maintainable.

I see your question as saying, "Can I force my developers to optimize early by giving them poor developer machines yet still get good code?" And the answer is no.

  • "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil". When designing something it's a good idea to think for 2 minutes about the 3%. – Benbob Dec 22 '10 at 07:01
15

Interesting reading, all those answers.

But I think most people answering here are missing the point. The question, as I read it, is not (only, at least) about literally giving the developers a P1 to make them write faster code.

The point is that a lot of software today is just as slow or even slower than the software we used back in the last millennium, in spite of much more powerful computers. Judging from the answers here, most developers don't get that hint. This is very obvious in web applications. This site is a very good exception, but many sites have a front page weighing a megabyte. What do I get for waiting for that to download? I don't know. I think it comes from an ignorance on the developer's part, not respecting the time the user needs to spend on it - or even worse, pay for, if you pay per MB. The thing is that all those web pages don't even contain high-resolution pictures. Often it is just some crap code delivered by some development environment. Well, of course it is not crap code, I guess, but it gives no gain to me as a user.

In general it is not only about optimizing the code, but just as much about choosing not to include things that slow the software down more than the value they give.

A few weeks ago I started a laptop from 1995. Windows 3.x was up and running in no time. The database I needed to get some data from started before the Enter key was fully released (almost, at least).

I know that we get a lot more from our software today, but we also have computers many times faster. Why doesn't the development industry decide to keep the speed of software from 1995 and make people buy new hardware because they want new functionality? Today it is more like everyday programs and web sites force people to buy new hardware to do exactly the same things they did before - but of course in a fancier way.

I have to say I think Linux development seems to handle this better. Linux distributions have for many years been quite far ahead of Windows even in fanciness, with many eye-candy things like animated windows. The thing is, in spite of that, they have worked on the computers of today and even of yesterday - not only on cutting-edge hardware.

By now I guess many developers have an unhealthy level of adrenaline. Yes, I found a way to give back some of the frustration from all the waiting in front of:

  • Office
  • SQL Server (starting up the management console)
  • ArcGIS (starting up and using)
  • Acrobat Reader (starting up)
  • Agresso (using, at least as a web application)
  • Windows (starting and using; well, I haven't tried 7 yet)
  • .NET web pages (downloading)

and so on

I feel good :-)

Cheers
Nicklas

  • This. This. THIS. SO MUCH THIS. That has always been my biggest frustration with software. People spend more time trying to shiny up the interface than actually giving a damn about the usability. One example of this is Android vs. BlackBerry. Android looks nicer and can do more, but BlackBerry is a LOT more pleasant (and speedy) to use than Android, at least in my opinion. – Kevin Coppock Nov 19 '10 at 20:18
  • I completely agree with the argument about software now being just as fast as it was 20 years ago for just about the same functionality. But getting devs to work on 20-year-old hardware will do nothing to help the problem. If the company creating the software does not invest in usability and/or performance testing, developing on slower hardware will just make things worse, if anything. This is a completely different debate altogether, for which a programmer's head is not the only proper recipient of a well-deserved slap behind the head. – Newtopian Apr 18 '11 at 06:10
10

1) If I give a developer a slower machine, does that mean that the resulting code may be faster or more efficient?

We have been building software for the last 6 decades, and we still get questions like these? It seems more like yet another attempt at cutting corners. No offense, but c'mon, do you think the question is even logical? Think about it in these terms (if you can): you want to build a 4x4 vehicle that can operate under harsh conditions - rain, mud, whatever. Are you going to put your engineers and assembly line out under the elements just to make sure the resulting vehicle can operate in them?

I mean, Jesus Christ! There is development and there is testing. Testing is done in a different, harsher environment, or the developer knows how to assemble a test bed in his own dev environment in a manner suitable for stress testing. If he can't, replace him with a better developer.

2) What can I do to give my developers a fast IDE experience, while giving 'typical' runtime experiences?

You should be asking your developers that. And if they can't give you an objective and valid answer, you need to replace them with actual developers.

But to entertain the question: give your developers (assuming you have good developers) good tools and good hardware, the best you can afford. Then set up a lowest-common-baseline environment in which your software must operate. That's where testing should occur. It is much better engineering practice to have a test environment that is distinct from the development environment (preferably one that allows you to do stress testing).

If your developers are any good, they should have communicated this to you (assuming you have asked them.)

luis.espinal
  • *We have been building software for the last 6 decades, and we still get questions like these?* - I upvoted your response, but I encourage you to examine the original question from a different perspective. There are in fact many managers who are ignorant of the *benefits* of fast, powerful machines for their developers. So with that in mind, the original question may have been trying to disabuse such managers of the ridiculous notion that *slow* machines can somehow nudge developers to produce faster and more efficient code. – Jim G. Oct 23 '10 at 14:19
  • "2) What can I do to give my developers a fast IDE experience, while giving 'typical' runtime experiences? You should be asking that to your developers." I believe this is a programmers' SE site, not a managers' SE site. He was asking the devs. – stimpy77 Oct 23 '10 at 23:10
  • "you want to build a 4x4 vehicle that can operate under harsh conditions, rain, mud, whatever. Are you going to put your engineers and assembly line under the elements just to make sure the resulting vehicle can operate on them?" <<< love the analogy – stimpy77 Oct 23 '10 at 23:13
6

It results in a bunch of bitchin' developers. This stuff is hard enough as it is, let's not make the experience worse.

I would encourage you, however, to have hardware similar to your users' in a Test or QA environment, to smoke out any performance issues. That's a good idea.

bigtang
6

I'll buck the norm and say yes, IF AND ONLY IF they're writing server software. Laugh all you want, but the most efficient team I ever saw was a group of Perl guys with Wyse terminals. This was the late 1990s, in a university off-shoot shop, and they were writing spatial gridding software (which basically just calculates). They were, however, talking to some relatively powerful late-model RS/6000s.

Just to add interest to the event, there was a blind programmer there. I was thoroughly impressed.


Jé Queue
  • Blind programmer? Is that even possible? – WernerCD Oct 22 '10 at 00:46
  • @WernerCD, to this day I still try to envision the mind power it must take to keep track of lines of code in your head. – Jé Queue Oct 22 '10 at 01:53
  • Yes, most of us are writing server software... +1 – makerofthings7 Oct 22 '10 at 03:14
  • @MakerOfThings7, give me more server hardware any day over my local machine; spend $ where it should be (but give me a big monitor :) ). I have no problems with my decade-old Dell Latitude CPx being a display for the big systems at the DC. – Jé Queue Oct 22 '10 at 04:25
  • Maybe a blind programmer could use a braille display? – Antsan Oct 22 '10 at 11:55
  • (Very rapid) Text-to-voice is more common, I would imagine. I worked across from a blind programmer at a previous job, and the guy was brilliant. He also had the cutest dog. :3 – Samantha Branham Oct 24 '10 at 05:46
  • Without text-to-voice, or some other way to check the code, I don't see how you can do any programming. – grabah Mar 29 '11 at 14:03
5

This is not a bad idea - but you want your developers to have a speedy programming environment.

You could possibly implement this by giving your programmers two machines - a fast dev box, and a slower commodity box (possibly virtual) for testing.

Some tweaking of the VS build process could make deployment to the test box the norm, with remote debugging.

There are other ways to force your coders to develop more efficient code: you can include performance and memory-use goals in your unit tests, for example. Setting budgets for memory use is an excellent goal as well, as is setting page-weight budgets for HTML pages.
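A minimal sketch of such a test (the routine, the workload size, and the 200 ms budget are all invented for the example - in practice, calibrate the budget against your minimum-spec machine):

```typescript
import { strict as assert } from "assert";
import { performance } from "perf_hooks";

// Stand-in for whatever hot path you want to keep within budget.
function sortRecords(records: number[]): number[] {
  return [...records].sort((a, b) => a - b);
}

// Performance-budget "unit test": fail the build when the routine
// blows its time budget on a fixed-size workload.
const records = Array.from({ length: 100_000 }, () => Math.random());
const start = performance.now();
sortRecords(records);
const elapsedMs = performance.now() - start;

assert.ok(
  elapsedMs < 200,
  `sortRecords took ${elapsedMs.toFixed(1)} ms; budget is 200 ms`
);
console.log(`ok: ${elapsedMs.toFixed(1)} ms (budget 200 ms)`);
```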

5

The problem isn't the developer building inefficient code on a fast machine; the problem is that you haven't defined performance metrics to measure against.

The product requirements should define a specific performance target, based on the required customer experience, that can be measured on any computer. There are many benchmark sites (check SPECint) that let you relate your computer to other types of computers.

This is good for many reasons. It allows you to define the minimum supported hardware more easily, so you can limit the number of customer complaints - we all know most software runs on most computers; it's just a matter of performance. If you set your specs so that people in the minimum-requirements range get reasonably acceptable performance, you limit customer complaints. And then, when a customer calls in, you can use the benchmarks to determine whether there really is an issue, or whether the customer is just not happy with how the product is supposed to work.

5

I am convinced that having a slower computer for development results in faster code, but this comes at a price. The rationale is that I have experienced this first-hand: having a long commute, I bought a netbook to work on in the train - a netbook slower than any computer I have bought in the last 5 years. Because everything is so slow, I see very quickly when something is unbearably slow on this netbook, and I become aware of slow spots much more quickly (no need to benchmark all the time). Working on a netbook really changed how I develop.

That being said, I am not advocating doing this, especially in a professional environment. First, it is demoralizing. The very fact that almost everybody here said the idea did not even make sense shows that programmers react badly to it.

Secondly, having everything slower means that things you would happily do on a fast machine (taking, say, 1 minute) are not really doable anymore on a slow machine; out of laziness, you stop doing them. It is a question of incentives.

Finally: the produced code may be faster, but it almost certainly takes longer to produce.

  • +1 But I have to disagree on some points. I also bought a netbook, but I've noticed that speed isn't the real problem; it's the small screen size. 1 GHz is fast enough for small projects on the go, but 1024x600 is just **too** small. – Joe D Oct 23 '10 at 08:33
4

Point 1: NO! Studio is meant to be run on decent machines, and that requirement has only grown with each version. You can actually lock up some versions of Studio if you turn IntelliSense on and use a single-core non-HT box.

To point #2: there are some features in the testing projects that allow you to throttle some resources. They are not perfect, but they are there. VPC or low-spec VM images do a pretty good job of being constrained as well. I have occasionally had users sit down at bad machines to do testing, so that they can see the implications of the features they have requested.

Bill
4

Nope - in fact, it would result in more bugs, because the developers won't do as much testing and won't use extra tools like profilers as much. Give them the best machines you can afford (including graphics acceleration hardware if you're a game development or graphics shop), and have them test inside VMs. The VM specs can be scaled up or down as needed.

JohnL
  • +1: *in fact it would result in more bugs because they won't do as much testing, and they won't use extra tools like profilers as much.* - Great point. Let's not forget the opportunity cost associated with a slow development machine. – Jim G. Oct 23 '10 at 14:00
4

I think this is an interesting question, and I wouldn't go for a "no" that quickly. My opinion is: it depends on what kind of development team we are talking about. Example: if you are leading a group running for the annual ICFP programming contest, getting good results after a small amount of development time on an HPC cluster wouldn't necessarily mean that the solution you found is good. The same can be said if you are writing some scientific or numerical algorithm: on your old AMD Duron 600 MHz with 64 MB of memory, you're forced to be careful about the way you get things done, and this may affect even some design choices.

On the other hand, a smart programmer/scientist/whatever is supposed to be careful anyway. But I found myself writing some of my best code when I had NO computer AT ALL and had to take notes on paper. This may not apply to big projects involving huge frameworks, where an IDE is strictly necessary.

One thing is sure: fast machines and good immediate results make (bad) programmers spoiled and may be one of the reasons for some of the crap we find on computers.

Lorenzo Stella
4

I work on a package that takes about an hour to build on my 8-core, 8 GB machine (a full clean build). I also have a relatively low-end laptop that I test on. The low-end laptop can't manage two full builds in a single work day.

Am I more productive on the fast machine with some deliberate testing done on the laptop, or should I do all my builds on the laptop?

Keep in mind these are not made up numbers.

It is a rigged demo in that I don't normally need to do a clean build every day (I can do a lot of testing on single modules), but even the partial builds show roughly an order of magnitude difference in compile/link times.

So the real issue is on my slower machine a typical build is long enough for me to go get a cup of coffee, while on my faster machine I can only sip a little coffee.

From a point of view of getting work done I prefer doing development on a fast machine. I can far more reliably hit deadlines. On the other hand I imagine if management made me do development on my slow machine I would get a lot more web browsing done, or at least book reading.

Stripes
  • Generally speaking, what's in your build that makes it take so long? Is it CPU-bound or disk-bound (what is the bottleneck)? Would this be an issue if something like TFS did the builds for you? – makerofthings7 Oct 22 '10 at 04:42
  • It takes you half a work day to have a cup of coffee? You must be working for the government. – finnw Oct 22 '10 at 14:07
  • I/O-bound on the slow machine. Still I/O-bound at times on the fast machine, but more of a CPU bottleneck. The current build system doesn't like working on more than one lib at once, so some CPU and I/O is left on the floor when there are fewer than 8 files left to compile in any given subproject. As for the coffee, I could drink it faster, but I try to limit my intake, so if I drank it faster I would need another idle-time activity. – Stripes Nov 29 '12 at 20:11
3

Interestingly, I worked at a startup where we ended up doing this. I think it actually worked pretty well, but only because of the specific situation we were in. It was a mod_perl shop where class auto-reloading actually worked correctly. All the developers used vim as their IDE of choice (or used some remote editing software). The end result was that very little (if any) time was lost waiting for code to compile/reload/etc.

Basically, I like this idea IFF there is a negligible impact on the development cycle for all developers, and it only impacts the runtime operation of your code. If your code is in any way compiled, preprocessed, etc., then you are adding time to the "fix bug; test; next" loop the developers are working in.

From the interpersonal side, people were never forced to use the slow servers, but if you used the slow servers, you didn't have to do any of your own maintenance or setup. Also, this setup existed from the very beginning, I can't imagine trying to sell this to an established development team.

After rereading your original question, it occurs to me that one thing that perpetually terrifies me is development environments that differ from production environments. Why not use a VM for code execution, one you can cripple at runtime without affecting the dev workstation? Lately I've been using (and loving) VirtualBox.

user6041
3

I'm going to buck the trend here too.

Anecdote: I worked for a Dutch software development firm that upgraded 286 computers to 486es (yes, I'm that old). Within weeks, the performance of all of our in-house libraries dropped by 50% and bugs increased... A little research showed that people no longer thought through the code itself during the debugging process, but resorted to 'quick' successive code -> compile -> test -> fix cycles.

Related: when I started a subsidiary for that same company in the USA, I ended up hiring Russian programmers because they were used to PCs with fewer features/less power and were much more efficient coders.

I realize these were different times, and resources were much more scarce than they are today, but it never ceases to amaze me how, with all the progress that's been made on the hardware front, the net result seems to be that every step forward is negated by sloppier programming requiring higher minimum specs...

Hence... I feel programmers should be forced to test their applications on machines that do not exceed the 'Average Joe' computing power and hardware specs.

  • The key note here is "test". The live system doesn't have to load a fat, bloated IDE, run the back end locally rather than on dedicated hardware, or run mail, Office, etc. You need a high-end machine just to bring up the dev environment in most languages today. – Bill Leeper Oct 22 '10 at 17:00
3

Hardware is less costly than development time.

Most bottlenecks are in the database, not in the client PC, but that doesn't excuse skipping testing on machines slower than the developer's. Use testing tools to test optimizations.

Brian
3

Absolutely not. Give your programmers the best laptop money can buy, a keyboard of their choice, multiple great big screens, a private office, no phone, free soft drinks, all the (relevant) books they want, and annual trips to key tech conferences, and you'll get great results. Then test on upper- and lower-boundary hardware/software/browser/bandwidth combinations.

2

This is an interesting thought (giving devs a slow machine may lead them to optimize more).

However, the solution is better framed this way: put response-time requirements in the spec, and have a low-end machine available for testing.

Also, if you have a really whiz-bang compiler/language, it might be able to devise different ways to generate code and pick the best one. That would only be helped by a faster computer.

Paul Nathan
2

Others have responded that generally you want developers to have fast machines. I agree. Do not skimp on RAM - you want to hold as much in memory as you can; some build processes are very heavy on disk usage.

One thing you might want to consider getting rid of is antivirus on build drives! Scanning only slows things down, and it can be an extremely strong slowdown factor.

You may want to let the developers develop on Linux if possible. The tools there are much better for all kinds of extra tasks (just grep for something in a file, etc.). This also gets rid of the antivirus.

  • Don't forget the benefit of a fast hard drive: http://www.codinghorror.com/blog/2009/10/the-state-of-solid-state-hard-drives.html – Jim G. Oct 23 '10 at 13:58
2

My MacBook Pro at work is a few years old. Between Linux and Windows VMs (to test IE quirks), as well as a couple of web browsers and terminals open, the OS X spinning wheel shows up a lot. Guess what I do when it spins: I sit and wait. In this case, a slow machine does slow productivity.

Thierry Lam
2

For many applications, the issue is getting developers to test with real-world data sets before they are "done." For interactive applications, a baseline test machine/VM would be required.

2

I work on a slow Windows 95 machine, and it allows me to efficiently write MindForth artificial intelligence in Forth and in JavaScript.

2

Asking programmers whether programmers should get good hardware is like asking a fat man whether he likes food. I know this is the subjective exchange, but still ... is the question worth asking us? :P

That said I of course agree with the majority: NO.

Matthew Read
2

I'm tempted to say "no" categorically, but let me share a recent experience: someone on our project was working on code to import data into the database. At the time he had the oldest PC in our group, maybe even in the entire organization. It worked fine with VS 2008, but of course a faster machine would have made the experience better. Anyway, at one point the process he was writing bombed during testing (and that was before it was fully featured): he ran out of memory. The process had also taken several hours to execute before it bombed. Keep in mind that, as far as we knew, this was what the users would have had to use.

He asked for more RAM. They refused, since he was getting a newer machine in 3-4 weeks and the old one was going to be discarded.

Keep in mind that this guy's philosophy on optimization is: "We have fast machines with lots of RAM" (his and a few others excluded, anyway), so why waste valuable programmer time optimizing? But the situation forced him to change the algorithm to be more memory-efficient so that it would run on his 2 GB machine (running XP). A side effect of the rewrite was that the process also ran much, much faster than before. Also, the original version would eventually have bombed even with 4 GB when more data was being imported - it was a memory hog, plain and simple.
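For illustration only - the answer doesn't show the actual code, so this is a guess at the shape of that kind of rewrite (the file name and per-row handling are hypothetical): load-everything-then-process becomes process-as-you-read.

```typescript
import { createReadStream } from "fs";
import { createInterface } from "readline";

// Memory-hog approach: read the whole import file into RAM at once.
//   const rows = readFileSync("import.csv", "utf8").split("\n");

// Memory-efficient approach: stream the file line by line, so memory
// use stays flat no matter how large the import grows.
async function importRows(path: string): Promise<number> {
  const lines = createInterface({ input: createReadStream(path) });
  let count = 0;
  for await (const line of lines) {
    // insertRow(line) would go here - hypothetical per-row DB insert
    if (line.length > 0) count += 1;
  }
  return count;
}

importRows("import.csv").then((n) => console.log(`${n} rows imported`));
```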

Soooo... While generally I'd say "no", this is a case where the developer having a less powerful machine resulted in a better-optimized module, and the users will benefit as a result. (Since it's not a process that needs to be run very often, he initially had no intention of optimizing it either way, so they would have been stuck with the original version if the machine had had enough RAM to run a few large tests.) I can see his point, but personally I don't like the idea of users having to wait 8 hours for a process to complete when it can run in a fraction of that time.

With that said, as a general rule programmers should have powerful machines because most development is quite intensive. However, great care should be taken to ensure that testing is done on "lowest common denominator" machines to make sure that the process doesn't bomb and that the users won't be watching paint dry all day long. But this has been said already. :)

MetalMikester
2

In reading the question, and the answers, I'm kind of stunned by the vehemence of the NO case.

I've worked in software development for 25 years now, and I can say without any hesitation that programmers need a bunch of things to develop good code:

  • A REASONABLE development environment. Not a dinosaur; neither does it need to be bleeding edge. Good enough not to be frustrating.

  • A good specification (how much is done with NO written specification?)

  • Good and supportive management.

  • A sensible development schedule.

  • A good understanding of the users AND THE ENVIRONMENT the users will have.

Further, on this last point, developers need to be in the mindset of what the users will use. If the users have supercomputers and are doing atom-splitting simulations or something where performance costs a lot of money, and the calculations run for many hours, then thinking about performance counts.

If the users have 286 steam-powered laptops, then having developers develop and test on the latest 47 GHz Core i9000 is going to lead to some problems.

Those who say "give developers the best and TEST it" are partly right, but this has a big MENTAL problem for the developers: they have no appreciation of the user experience until it's too late - when testing fails.

When testing fails, architectures have been committed to, promises have been made to management, lots of money has been spent, and then it turns into a disaster.

Developers need to think like, understand, and be in the zone of the user experience from day 1.

Those who cry "oh no, it does not work like that" are talking out their whatsit. I've seen this happen, many times. The developers' usual response is "well, tell the CUSTOMERS to buy a better computer", which is effectively blaming the customer. Not good enough.

So this means that you have several problems:

  • Keep the devs happy and piss off the management, increasing the chances of the project failing.

  • Use slower machines for development, with the risk of upsetting the devs, but keeping them focussed on what really matters.

  • Put 2 machines on the devs' desks AND FORCE THEM TO TEST ON THE CLUNKER (which they won't do, because it is beneath contempt... but at least it's very clear then if there are performance problems in test).

Remember batch systems and punch cards? People waited an hour or a day for turnaround. Stuff got done.

Remember old unix systems with 5 MHz processors? Things got done.

Techno-geeks love chasing the bleeding edge. This encourages tinkering, not thinking - something I've had arguments about with junior developers over the years, when I urge them to get their fingers away from the keyboard and spend more time reading the code and thinking.

In development of code, there is no substitute for thinking.

In this case, my feeling is: figure out WHAT REALLY MATTERS. Success of the project? Is this a company-making/killing exercise? If it is, you can't afford to fail. You can't afford to blow money on things that fail in test. Because test is too late in the development cycle, the impacts of failure are found too late.

[A bug found in test costs about 10x as much to fix as a bug found by a dev during development.

And a bug found in test costs about 100x as much to fix as that bug being designed out during the architectural design phase.]

If this is not a deal-breaker, and you have time and money to burn, then use the bleeding-edge development environment and suffer the hell of test failures. Otherwise, find another way: lower-end hardware, or 2 machines on each desk.

quickly_now
  • *Techo-geeks love chasing the bleeding edge. This encourages tinkering, not thinking.*: Gosh, you sound like you'd be a blast to work for. ;) Has anyone ever accused you of being pompous, arrogant, or prone to making broad generalizations? – Jim G. Oct 25 '10 at 00:23
  • No, actually. Thanks for the slur. Appreciated. When you can't attack the argument, attack the person. – quickly_now Oct 26 '10 at 05:06
  • +1 thank you for going against the popular opinion and offering your perspective. – makerofthings7 Nov 19 '10 at 23:17
  • @Jim G - To be fair, I think @quickly_now may be referring to copy-and-paste coders, and those who don't really understand what's going on. I've seen guys bash on things (3rd party components, SQL joins, etc) with no idea where they are going and are OK checking in a solution they don't understand and can't support. – makerofthings7 Nov 19 '10 at 23:19
  • That sums it up very well. Those who can think will do so, and the speed of their tools has little to do with their ability to think. Those who tinker love to get results faster: "Let's just try this" are words that make me shudder. For such people, slower tools are better: they do less damage! – quickly_now Nov 20 '10 at 06:37
1

The answer lies in the middle.

Have one fast box to run the dev environment (e.g. Eclipse).

And another slow box for testing the output. This is especially important for web apps.

Side-by-side screens, one for each box.

If the code is acceptable on the output box, it will be more than acceptable for most users.

Fast dev boxes make programmers lazy. A classic example: searching for an element in the DOM every time it's needed. Find it once and cache the result, as in the sketch below.
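A minimal illustration (the `status` element and the update loop are made up for the example):

```typescript
// Lazy version: repeats the DOM lookup on every update. Unnoticeable
// on a fast dev box; painful on a slow machine with an old browser.
function updateStatusSlow(messages: string[]): void {
  for (const msg of messages) {
    const el = document.getElementById("status"); // repeated lookup
    if (el) el.textContent = msg;
  }
}

// Cached version: find the element once and reuse it.
function updateStatusFast(messages: string[]): void {
  const el = document.getElementById("status");
  if (!el) return;
  for (const msg of messages) {
    el.textContent = msg;
  }
}
```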

You'll really notice the difference on a slow box running IE 6....

  • For testing web stuff, make absolutely sure that the slow machine isn't hitting the servers over a gigabit LAN or anything like that. If you can arrange for it to connect via TCP/IP through a proxy several hundred miles away, great. – David Thornley Dec 07 '10 at 18:32
1

This theory is simple-minded and outdated. It was true back in the day.

I remember spending a lot of time micro-optimizing my Turbo Pascal stuff on my pre-Pentium computer. That made sense before Y2K, much less so ever since. Nowadays you don't optimize for 10-year-old hardware. It's sufficient to test-run software to find bottlenecks. But as everyone here agrees, this doesn't mean developer (and thus optimization) productivity correlates with giving them outdated hardware for development.

mario
1

1) If I give a developer a slower machine, does that mean that the resulting code may be faster or more efficient?

No. Good developers are spoiled. If they see they get bad tools at your company, they will go work somewhere else. (Good developers usually have the choice of going someplace else.)

1

Isn't the answer to this question a resounding "NO", no matter whom you ask?

Ask your graphic artists if they should be given a slower machine.

Ask your writers if they would choose a slower machine over a faster one.

Ask your administrative assistants whether they would prefer a slower or faster machine.

All of them will say they'll be more productive with a faster machine.

Barry Brown
1

I say developers need the best development system available - but that doesn't necessarily mean the fastest. It may well mean a modern but relatively slow system with all-passive cooling, to keep noise to a minimum, for example.

One thing - a development system should be reasonably new, and should absolutely have multiple cores.

An old PC may sound attractive in a show-performance-issues-early kind of way, but a Pentium 4, for example, may actually be faster (per core) than some current chips. What that means is that by limiting a developer to using a P4 system (actually what I'm using now - though that's my personal budgeting issue)...

  1. You encourage the development of non-concurrent software that will not benefit from the current mainstream multi-core systems.
  2. Even if multi-thread software is developed, bugs may not be noticed (or at least not noticed early) because concurrency-related issues may not show up in testing on a single-core system.
  3. Multi-threaded software can cause serious performance issues that may get much worse with multi-core processors. One example is disk-head thrashing (which can result in many thousands of times slower access to data), where individual threads each do sequential access, but to different parts of the disk. This can even go away on older, slower PCs: by having, e.g., two old 160 GB drives instead of one 1 TB drive, those threads may no longer be fighting each other for access to the same disk.

There are also issues with PCs that are too limited to support virtual machines well - e.g. for testing on multiple platforms.

0

The run-time speed of the developer's machine is irrelevant, unless you want to take revenge on, or punish, your developers for writing slow code while being ignorant of the target deployment environment.

As the manager, you should make sure the developers know the objectives of the project and always ensure they are on track. As for the target-machine issue we are discussing, it can be prevented by testing early and frequently on a slow machine, not by giving developers a slow machine to use and suffer with.

Slow run-time speed also slows down development, as most programmers use a code-and-test method. If the run-time is slow, their task will be slow too.

tia
0

I can only imagine the profiling experience on a slow machine. Yikes.

In short. Hell No.

Also, have at least 4 GB of RAM, so you can give 2 GB to your main machine, 1 GB to a VM, and keep the last 1 GB for the extra memory the VM needs and for leeway.

Also, two processors are a must, so that if an app locks up or eats the CPU, the developer doesn't have to wait painfully to Ctrl-Alt-Del something.

0

Let's go against the flow here: YES. Or at least that's been the general wisdom in the industry for decades (except of course among developers, who always get angry when they aren't treated like royalty and given the latest gadgets and computers).

Of course there's a point where reducing the developer's machine becomes detrimental to his work performance, as it becomes too slow to run the applications he needs to get his job done. But that point is a long way down from a $10,000+ computer with 6 GB of RAM, two 4 GB video cards, a high-end sound card, 4 screens, etc.

On the job, I've never had a high-end machine, and it has never slowed me down considerably, as long as the machine was decent (and the few real sub-standard machines were quickly replaced once I showed how they slowed me down).

jwenting
-1

That programmers sitting on slow hardware would write faster applications is equivalent to arguing that race car engineers equipped with crappy tools would make faster vehicles.

gablin
-1

Yes, of course! And making them work using only a sheet of paper and a pencil will result in even more efficient code (and obviously more portable code). Of course, only if the pencils aren't too sharp.

ts01
-1

Ask the client if they would create more efficient business processes using slow PCs.

-1

Boy I'll get clobbered for this, but there's something people don't want to hear:

Nature abhors a vacuum.

Of course programmers want faster machines (me included), and some will threaten to quit if they don't get it. However:

  • If there's more cycles to be taken, then they get taken.

  • If there's more disk or RAM to fill up, it gets filled up.

  • If the compiler can compile more code in the same time, then more code will be given to it.

One may be permitted to doubt the assumption that the extra cycles, storage, and code all serve to further gratify the end user.

As far as performance tuning goes, just as people put logic bugs into their programs, they also put in performance bugs. The difference is, they take out the logic bugs but not the performance bugs - not if their machine is so fast they never notice them.

So, there can be happy users, or happy developers, but it's hard to have both.

Mike Dunlavey
  • To quote the [top-voted answer](http://programmers.stackexchange.com/questions/13623/does-giving-a-developer-a-slower-development-machine-result-in-faster-more-effici/13625#13625): _"There's emulators, virtual machines, actual machines with testers that can all test performance to see if it's a factor."_ – Peter Boughton Nov 19 '10 at 19:48
  • @Peter: Sure there are, in principle - except where I look. We even have test machines and virtual machines, and if a tester says something is taking too long, what do they say? "This blasted virtual machine is too slow!" I think the coders themselves have to feel the pain, and need an IDE to do something about it. – Mike Dunlavey Nov 20 '10 at 00:54
  • @Peter: I see this all the time, where folks seem to think they need to have code as optimized as possible, and run on as fast a machine as possible, before they can profile it to find the performance problems. Where that idea came from, I can't even guess. – Mike Dunlavey Nov 20 '10 at 00:57
-2

THIS IS THE MOST DISGUSTING THING I HAVE EVER READ... I think giving your developer a 2-legged stool to sit on would have the same desired effect: "He will work for you, and when you are not looking, seek other employment."

Frank