287

I must be missing something.

The cost of employing a programmer in my area is $50 to $100 an hour. A top-end machine is only $3,000, so the cost of buying a truly great computer every three years comes to about $0.50/hour ($3,000 / (150 weeks * 40 hours)).

Do you need a top-end machine? No; the $3,000 here represents the most that could possibly be spent, not the amount that I would expect. That's roughly the cost of a top-end iMac or 17-inch MacBook Pro.

So suppose you can save $2000 every three years by buying cheaper computers, and your average developer is making $60 an hour. (These are the most charitable numbers I can offer the bean-counters; if you only save $1000, or $750, it only strengthens my case.) If those cheaper computers cost you just 10 minutes of productivity a day (not at all a stretch; I'm sure that my machine costs me more than that), then over 3 years the 125 lost hours would add up to a loss of $7,500. Even a loss of only 1 minute a day ($750) would leave a net gain of just $1,250, which would hardly offset the cost of poor morale.
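For the skeptics, here is that back-of-envelope math as a runnable sketch; every input (hourly cost, savings, 250 workdays a year) is an assumption from this question, not measured data:

```python
# Back-of-envelope for the numbers above. Every input is an assumption
# from this post, not measured data.
HOURLY_COST = 60          # assumed fully-loaded cost per developer-hour
SAVINGS = 2000            # assumed savings from buying the cheaper machine
WORKDAYS = 250 * 3        # three-year machine life, ~250 workdays/year

def lost_value(minutes_per_day):
    """Dollar value of daily productivity lost over the machine's life."""
    hours_lost = minutes_per_day * WORKDAYS / 60
    return hours_lost * HOURLY_COST

print(lost_value(10))     # 10 min/day -> 125 hours -> $7,500 lost
print(lost_value(1))      # 1 min/day -> $750 lost, net gain only $1,250

# Break-even: the cheaper machine pays off only if it costs less than
# this many minutes of productivity per day.
break_even = SAVINGS * 60 / (HOURLY_COST * WORKDAYS)
print(round(break_even, 2))   # about 2.67 minutes/day
```

In other words, under these assumptions the cheaper machine has to cost less than about three minutes a day to be worth it.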

Is this a case of "penny-wise and pound-foolish" or have I oversimplified the question? Why isn't there universal agreement (even in the 'enterprise') that software developers should have great hardware?

Edit: I should clarify that I'm not talking about a desire for screaming-fast performance that would make my friends envious, or an SSD. I'm talking about machines with too little RAM to handle their regular workload, which leads to freezing, rebooting, and (no exaggeration) approximately 20 minutes to boot and open the typical applications on a normal Monday. (I don't shut down except for weekends.)

I'm actually slated to get a new machine soon, and it will improve things somewhat. (I'll be going from 2GB to 3GB RAM, here in 2011.) But since the new machine is mediocre by current standards, it is reasonable to expect that it will also be unacceptable before its retirement date.

Wait! Before you answer or comment:

  1. $3000 doesn't matter. If the machine you want costs less than that, that's all the more reason that it should have been purchased.
  2. I'm not asking for more frequent upgrades. Just better hardware on the same schedule. So there is no hidden cost of installation, etc.
  3. Please don't discuss the difference between bleeding edge hardware and very good hardware. I'm lobbying for very good hardware, as in a machine that is, at worst, one of the best machines made three years ago.
  4. $50 - $100 / hour is an estimate of employment cost, not salary. If you work as a contractor it would be the billing rate the contracting agency uses, which includes their expenses and profit, the employer's Social Security contribution, the employer's health care contribution, etc. Please don't comment on this number unless you know it to be unrealistic.
  5. Make sure you are providing new content. Read all answers before providing another one.
Eric Wilson
  • 14
    Maybe they do, but not as often as you'd like? Any workstation you buy will only be "the best" for 6 months, at best. Usually a better model comes out the next quarter. To *always* have the best, you'd have to upgrade every 3-5 months. That's hard to maintain. – FrustratedWithFormsDesigner Jul 18 '11 at 20:01
  • 1
    @FrustratedWithFormsDesigner I should have clarified that my ideal would be a purchase of close-to-top-end every three years -- I sort of implied that in my calculation. – Eric Wilson Jul 18 '11 at 20:13
  • 11
    There's a human factor, too. Buy a fast machine and gain all of that productivity, then spend 10 minutes per day at the water cooler and lose it all and then some. The boss sees both sides, so the pure productivity argument loses some weight. – JeffK Jul 18 '11 at 21:39
  • Most companies suck at true cost analysis; their competitors do as well though. Hardware is just one aspect of this. http://www.joelonsoftware.com/articles/FieldGuidetoDevelopers.html – Job Jul 18 '11 at 23:04
  • 4
    I definitely know I could use a little more punch in my machine. Not so much CPU power but RAM. Between running multiple instances of an IDE, browsers, and misc other programs another 4GB and a second monitor wouldn't hurt... – Rig Jul 19 '11 at 00:40
  • 24
    A developer without an SSD is a sad sight indeed... – ShaneC Jul 19 '11 at 02:25
  • Everybody tempted to comment on this question and either pick apart the arbitrary $3000 figure or get into an argument about environmental aspects of computing, please use our awesome [chat](http://chat.stackexchange.com/rooms/21) instead. If you have an answer for this question, please post it as an answer, or upvote some of the existing great answers. Irrelevant comments left here will be removed. Thanks. – Adam Lear Jul 19 '11 at 14:39
  • In almost all the small companies I've worked for, the C-level executives and VP level always received the highest end, newest machines. Technical consultants were given the slowest (usually the hand-me-downs from the CEO 2 machine-revisions ago). Moronic! – David Hall Jul 19 '11 at 15:57
  • 2
    Well, your head is processing at something like 50 petaflops, so why can't you just compile in your head? If you taped 5 heads together you would be able to predict the future – Dan Jul 19 '11 at 16:54
  • 3
    @ShaneC: I've never had an SSD, or felt like I needed one, to be honest. I guess I'm just spoiled by Delphi, which has insanely fast compile times. – Mason Wheeler Jul 19 '11 at 17:26
  • 1
    @Mason SSDs are incredible, especially for laptops which typically have slower drives. It's not just compiling...it's everything. – Michael Haren Jul 19 '11 at 18:13
  • 9
    We spend 4-5k on average for a dev setup here at SE ... – Zypher Jul 19 '11 at 18:26
  • @Zypher what is the planned service life? The specs of that setup might make an interesting post on the SF blog. – Eric Wilson Jul 19 '11 at 18:43
  • @FarmBoy until they start complaining they are too slow or 3-5 years whichever comes first. The specs are fluid, but the philosophy might make an interesting post – Zypher Jul 19 '11 at 18:47
  • I spent about $4000 for my home development machine, a Mac Pro with dual Xeons. My gaming computer cost about half that. – Robert S. Jul 19 '11 at 20:45
  • I was lucky: my boss just asked me what percentage my productivity would increase when I got the second display I was asking for. I said something around 10% (if I remember correctly) and that was OK for him. I got 2 new displays, because the old one was 15" but the new ones were 19" (5 years ago)... +1 for this great question – WarrenFaith Jul 20 '11 at 07:07
  • 3
    @JeffK It's true that employees are not working 100% of the time they are at work, but your argument only holds up if the employees actually increase their time at the water cooler when their machines are better. My guess is that if there is any difference, it is in the opposite direction. Quality tools are a joy to use. Crappy tools stress me out and make me want to get some water. More to the point is this: How productive are programmers while they are at their workstation? And how valued do they feel? – PeterAllenWebb Jul 20 '11 at 16:34
  • You're not asking the right question. There are limitless things a business can spend money on. The decision process isn't "Will buying new computers help?", it's "Would spending $100k on hardware help **more** than spending $100k on X?" Where X could be furniture, office space, hiring, raises, IT etc... Having a better parking space can save you 10mins a day as well. – Ryan Jul 20 '11 at 20:08
  • For build machines that might be okay, but workstations, honestly? ... come on folks, be a bit modest. Do you think the users for whom you write your software have the latest and greatest hardware? Could you perhaps notice inefficiencies in your code without a profiler on a somewhat slower machine? A good screen and keyboard are much more important to me than the latest and greatest CPU. After all, I have to type on that keyboard every day. But I do not have to run a full rebuild several times a day, especially because dependency scanning makes rebuilds cheap - even without an SSD. – 0xC0000022L Jul 20 '11 at 23:21
  • 1
    Sometimes I wonder if IT buys stuff off the back of the truck, because they got 40 workstations at $100/each. – Warren P Jul 20 '11 at 23:40
  • @STATUS_ACCESS_DENIED I've said some of this before, but there are too many words, so it needs to be repeated. I asked this question not because I want an SSD or the fastest processor, but because I've seen a tendency for companies to give devs machines that are dog-slow, that freeze up and need to be rebooted, and that generally waste the company's money. As for users, I don't write software that depends on a user's machine right now, and most devs aren't writing desktop apps anyway. – Eric Wilson Jul 20 '11 at 23:52
  • 50-100/hour? wow, where do you work? – Louis Rhys Jul 21 '11 at 05:33
  • 1
    @Louis Rhys: He's talking about employment cost. Employing someone costs more than just their salary. – pyvi Jul 21 '11 at 06:15
  • 1
    @FarmBoy: point taken. Although it means the case could be extended to software tools required for development. I've had numerous instances where a third-party tool could have saved so many hours (repeatedly) and yet the misconception seems to be that we have "developers" (as if there are no differences) in-house who should do this or that, distracting them from their main task at best, creating a bad make-shift solution in the worst case, though. But I get your point better now, I guess. I got hung up on the hardware notion too much. Sorry about that. – 0xC0000022L Jul 21 '11 at 16:45
  • @Zephyr - 4-5k sounds a lot, but I guess by the time you've taken off the cost of the Aeron chair, motorised desk, dual 30" screens and 240GB SSD, you've really had to cut back on the spec of the actual PC. *8') – Mark Booth Jul 25 '11 at 14:48
  • @farmboy: http://blog.serverfault.com/post/making-devs-happy-with-hardware/ :) – Zypher Jul 27 '11 at 19:01
  • @Zypher Thanks much. May you also get 30k page views. – Eric Wilson Jul 27 '11 at 19:08

39 Answers

224

Many companies are certifiably insane around this.

Seriously. If you asked 10,000 tech managers, "Let's say you paid Danica Patrick $100,000,000. Do you think she could win the Indianapolis 500 by riding a bicycle?", I'm sure not one of them would say, "Yes."

And yet a good percentage of these same managers seem to think that highly-paid software developers ought to be just as productive with crappy tools and working conditions as they are with good ones - because, of course, those lazy, feckless programmers are getting paid lots of money and ought to be able to pedal that bicycle faster.

Now, what exactly good tools and working conditions consist of depends on the job to be done. People who code the Linux kernel need different kinds of hardware than web site designers. But if the company can afford it, it's crazy not to get people what they need to be as productive as possible.

One company I worked for had a 9 GB source code base, primarily in C, and the thing we most needed were fast builds. Unfortunately, we were mostly working with hardware that had been mediocre five years before, so people were understandably reluctant to build much other than what they were working on at the moment, and that took its toll via low productivity, quality problems, and broken builds. The company had money to upgrade the hardware, but was strangely stingy about it. They went out of business last summer after blowing through over $100 million because their two biggest clients dropped them after repeatedly missed deadlines. We were asked one time to suggest ways to improve productivity; I presented the same kind of cost-benefit analysis the OP did. It was rejected because management said, "This must be wrong - we can't possibly be that stupid", but the numbers didn't lie.

Another company I worked for had fine computers for the programmers, but insisted everybody work at little tiny desks in a big crowded bullpen with no partitions. That was a problem because a lot of us were working with delicate prototype hardware. There was little room to put it on our desks, and people would walk by, brush it, and knock it on the floor. They also blew through $47 million in VC money and had nothing to show for it.

I'm not saying bad tools and working conditions alone killed those companies. But I am saying paying somebody a lot of money and then expecting them to be productive with bad tools and working conditions is a "canary in the coal mine" for a basically irrational approach to business that's likely to end in tears.


In my experience, the single biggest productivity killer for programmers is getting distracted. For people like me who work mainly with compiled languages, a huge temptation for that is slow builds.

When I hit the "build and run" button, if I know I'll be testing in five seconds, I can zone out. If I know it will be five minutes, I can set myself a timer and do something else, and when the timer goes off I can start testing.

But somewhere in the middle is the evil ditch of boredom-leading-to-time-wasting-activities, like reading blogs and P.SE. At the rates I charge as a consultant, it's worth it for me to throw money at hardware with prodigious specs to keep me out of that ditch. And I daresay it would be worth it for a lot of companies, too. It's just human nature, and I find it much more useful to accept and adapt to normal weaknesses common to all primates than to expect superhuman self-control.
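To put rough numbers on that ditch (all of them assumed; the distraction multiplier is pure guesswork on my part):

```python
# Rough model of build-wait cost. All numbers are assumptions; the
# distraction factor models the "evil ditch", where a medium-length wait
# turns into blog-reading that outlasts the build itself.
def minutes_lost_per_day(builds, wait_seconds, distraction_factor=1.0):
    """Minutes lost per day to build waits, inflated by distraction."""
    return builds * wait_seconds / 60 * distraction_factor

fast = minutes_lost_per_day(40, 5)         # 5-second builds: a few minutes a day
ditch = minutes_lost_per_day(40, 90, 3.0)  # 90-second builds plus wandering off
print(fast, ditch)
```

The exact multiplier is unknowable, but it doesn't need to be large before the fast machine pays for itself.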

Bob Murphy
  • 55
    +1 for mentioning the zone. I once worked for a company where it was common that developers also did direct customer support. Now, even if you are writing highly maintainable and really good code, sometimes there are moments where you juggle five or six information packages in your brain, and you must put those down again atomically. If a call comes in at such a moment 3 hours before leaving for home, it can really destroy the rest of your day. Not specifically because of the guy on the other side, but because of the state-destruction. ... – phresnel Jul 19 '11 at 11:55
  • 1
    ... You get demotivated, and you have a hard time rolling back your mind-state to the moment before the interruption (because of the demotivation, and because your brain is already tired from the day). You know that stuff. As always, management insisted on this support ... – phresnel Jul 19 '11 at 11:57
  • 34
    But the managers don't think of you as Danica Patrick, they think of you as the UPS delivery guy, and why do you need a new truck when the 5-year-old truck runs just fine? – Mark Ransom Jul 19 '11 at 15:36
  • 19
    "This must be wrong - we can't possibly be that stupid" err, guess again! :-D – Nowhere man Jul 19 '11 at 15:48
  • 15
    @Mark Ransom: All too true - and it's worse, because we're salaried. UPS drivers get paid extra for working overtime. Lots of them love the holidays: exhaustion, but paycheck happy-time! But programmers' overtime is free for their employers. If tech companies had to pay programmers time-and-a-half for work beyond forty hours in a week, we'd all have screamin' machines and interns to bring us coffee in our cubes. – Bob Murphy Jul 19 '11 at 16:07
  • 3
    @Nowhere man: Yep, like I said, it was a "canary in the coal mine" for irrational management. Another was thinking, "They're just a bunch of ignorant programmers, so we can flim-flam them at the quarterly company meeting with fancy slides." What they should have been thinking was, "We have a lot of employees who've worked for startups and public companies. Some of them are probably reading our required SEC 10-Q reports. So maybe we should address the fact that we're losing a crapload of money." – Bob Murphy Jul 19 '11 at 16:16
  • 5
    @Bob Murphy "But programmers' overtime is free for their employers." This is only true if you aren't willing to draw lines, and only if you aren't willing to demand a salary commensurate with what you bring to the table. – PeterAllenWebb Jul 20 '11 at 16:02
  • 2
    +1 for "In my experience, the single biggest productivity killer for programmers is getting distracted." I can't begin to tell you how all the various meetings and interruptions keep me out of "the zone" and keep me from getting things done. – Robusto Jul 24 '11 at 20:31
  • Danica Patrick is a very capable woman. I'm sure she could if she put her mind to it. – Thomas Boxley Jul 30 '11 at 09:30
  • 1
    @Thomas Boxley: You have a great career ahead of you in management. :-) – Bob Murphy Jul 30 '11 at 16:58
  • 1
    If only it were just the computer ... try to explain to a manager that a Razer mouse is more than worth it for any programmer who knows what a mouse is (i.e. one who has been gaming at some point)... Try to tell them you need two full-HD LED screens that will not destroy your eyes in less than a year of work ... Try to explain why working in an open space with project managers is counterproductive, since you actually need to think straight about the same problem for hours on end. And then you can start explaining how oxygen plays a major role in intellectual activity (not management). – Morg. Oct 12 '11 at 09:16
170

I would suggest that, in reality, one cost is visible and quantifiable, while the other cost is neither.

If failing to upgrade the hardware bleeds even as much as $1000 per developer per week from the budget, no one outside (read: above) the tech department ever sees that. Work still gets done, just at a slower rate. Even in the tech department, calculating that figure is based on numerous unprovable assumptions.

But if a development manager asks for $3000 per developer, particularly in a company with 50+ developers, then this takes a lot of justification. How does he do that?
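To make the asymmetry concrete, here is a sketch using the question's own assumed figures, which is exactly the problem: the hidden side of the ledger is built entirely from assumptions:

```python
# One number is a visible budget line; the other never appears anywhere.
DEVELOPERS = 50
visible_cost = 3000 * DEVELOPERS    # the $150,000 the manager must justify

# The hidden bleed: every input here is an unprovable assumption.
assumed_minutes_lost = 10           # per developer per day on slow machines
assumed_hourly_cost = 60            # fully-loaded cost per developer-hour
workdays = 250 * 3                  # three-year machine life

hidden_cost = (assumed_minutes_lost * workdays * DEVELOPERS
               * assumed_hourly_cost) / 60
print(visible_cost, hidden_cost)    # 150000 vs 375000.0
```

The hidden cost dwarfs the visible one, but only the visible one has to survive a budget meeting.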

pdr
  • 3
    Very good point. – Eric Wilson Jul 18 '11 at 20:11
  • http://en.wikipedia.org/wiki/Total_cost_of_ownership – Henrik P. Hessel Jul 19 '11 at 06:28
  • 6
    If the manager has to ask for $3000 per developer, yes, that's painful. However, if he can ask for $83 per developer per month, that might be more palatable. – regularfry Jul 19 '11 at 09:30
  • 24
    I think it is the manager's responsibility to justify the cost of adequate machines for his/her team. In the past I have found it useful to categorize computers according to roles. Computers used by developers and designers get classed as "for content creation". You just list the invariably beefy application requirements for your shop's IDE, along with some overhead, and make a short list of acceptable machines from HP, Lenovo, etc. If this is not accepted and the team ends up with ridiculously under-performing hardware, the manager should really shoulder the blame for failing to justify better machines. – Angelo Jul 19 '11 at 12:18
  • 8
    If the manager staggers his requests - not every developer needs a new machine at the same time - that comes to about 17 machines per year (50 / 3 ≈ 17, or 17 * $3,000 = $51,000 annually). Dividing those again by month (17 / 12 ≈ 1.4, so one or two machines, at most 2 * $3,000 = $6,000 in any month) makes for a much more attainable request than asking for everything (50 * $3,000 = $150,000) at once. – Michael Eakins Jul 19 '11 at 12:58
  • 1
    @Angelo - Would agree with that to some extent, though there are definitely extenuating circumstances. I'm involved in one situation where the previous manager convinced the business to spend a large amount of money on upgrading hardware but didn't go far enough. The fact that this hardware hasn't produced massive gains in productivity has made it very difficult for the current manager to make the argument for spending much more. I do think that regularfry and Michael make a good argument for gradual improvement and I'm hopeful the manager in question will go that way. – pdr Jul 19 '11 at 12:58
  • 2
    50 developers also should bring 50 times more profit than one – mjn Jul 19 '11 at 14:04
  • 1
    Perhaps identifying the hardware processing bottleneck(s) once every month and upgrading those elements (easier in a desktop) could realize the goal of consistent adequate performance. – Dale Jul 19 '11 at 14:28
  • 13
    Many megacorps are such that dev time is wasted for much sillier reasons (like poor allocation of workload) -- so this is not a surprise to me at all. – singpolyma Jul 19 '11 at 14:34
  • @Michael Eakins, often the request must be part of a yearly budget so there's no way to soften the blow. – Mark Ransom Jul 19 '11 at 15:41
  • While this is (at the moment) the most popular answer, it is incomplete in that it fails to point out that company budgets differentiate between CapEx and OpEx expenses. With CapEx expenses being up-front and having zero immediate ROI, they are always subject to higher scrutiny. However, failing to provide adequate tools is not sensible either. – Soren Jul 19 '11 at 16:59
  • 1
    @regularfry -- You cannot get computers for $83 per month unless you lease them; hence the difference between CapEx and OpEx in budgets. Most companies still buy machines, and when they do it is an upfront CapEx cost and not a monthly OpEx. – Soren Jul 19 '11 at 17:02
  • @Soren - I don't think I argued it was a sensible decision, I argued that this was one reason for the decision. As for whether it is sensible, that depends on your perspective. No one ever got fired for failing to justify a visible expenditure to reduce a hidden one. Is it good for the business? No. Is it safe for the manager making the decision? Yes. So in that sense, it is sensible. – pdr Jul 19 '11 at 18:38
  • As far as I am concerned, your first sentence is the complete and definitive answer to the question. There is no need for elaboration. – PeterAllenWebb Jul 20 '11 at 16:03
  • @Soren: if we want to get into accounting shenanigans, there are all *sorts* of ways to make a $3000 machine not appear as a $3000 cost on the balance sheet. To be honest, I'm not even sure I understand why a resource which is renewed every three years is counted as capital expenditure in the first place. – regularfry Jul 21 '11 at 09:13
  • @regularfry -- all hardware purchases are CapEx expenditures which are depreciated over 3 years -- this is typically why you can get a new machine every 2-3 years, because at that point the accounting value is written down to zero -- this is just accepted accounting practice. Whether that is sensible or shenanigans, the other part of CapEx cost is that it is an up-front cost which needs to stay on the company's accounting books for 3 years, unlike "you", who can be terminated at any time. Hence with CapEx there is a fixed 3-year commitment; for OpEx (your salary), not so much. – Soren Jul 21 '11 at 14:42
  • @mjn : the more devs you have, the less productive they each become. It's called management overhead ;) - it's the same for any type of work: if you have 50 workers in a factory, they are less efficient than if you had 10, which is why the correct way to manage all that is to cut operational teams into small independent blocks (just as if they were different companies) with defined objectives, while keeping all under one big banner for economies of scale. – Morg. Oct 12 '11 at 09:38
  • Anandtech has come up with some interesting ways of quantifying performance and cost. http://www.anandtech.com/bench/CPU/2 – T. Webster Dec 03 '11 at 03:42
94

I will put my 2 cents in here from the employer's side ... who is also a developer.

I agree that low-end machines are useless, but top-end machines are overkill.

There are a number of reasons why you don't get the top end machines:

  1. Cashflow is a real issue, not just a theory. You might be getting paid $60K-$80K per year, but this month we have a finite amount in the bank which has to be split amongst every competing thing in that month.
  2. There is a sliding scale of price and benefit. Low-end machines are on the whole pretty useless ... if you're getting a Celeron or low-power chip then whinge away ... mid-range machines have good overall performance, and once you get toward the top you are starting to tune for a given purpose (CAD, gaming, video encoding etc.) ... and the tuning costs extra.
  3. Common parts are generally cheaper: replacements, warranties and insurance all play a part in the overall running costs, as does the downtime while you source a replacement.
  4. Top-end machines depreciate much faster than ones 1/3 the price.
  5. If you're doing high-end graphics programming or CAD work then the extra grunt is valid; if you're just writing standard business software, running Visual Studio or Eclipse and surfing Stack Overflow for answers, then the extra power is cool bragging rights, but realistically you will not max out the CPU or memory of a standard mid-range box today.
  6. Mid-range machines built today really hammer, and in 2 years' time the new ones will be twice as fast (well, kind of). Seriously, they are lightning quick.
  7. At the end of the day, most of what you do is type raw text into text files and send it to the compiler ... that bit really hasn't changed since vi in the 1970s, and the low-end machines today are a million times faster than the ones back then ... your pace of coding really isn't that different.
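On point 4, here is a sketch of the standard three-year straight-line write-down (the prices are illustrative examples, not quotes):

```python
# Straight-line depreciation over a typical three-year write-down.
# Purchase prices here are illustrative examples only.
def yearly_writedown(price, life_years=3):
    """Accounting value written off the books each year."""
    return price / life_years

top_end, mid_range = 3000, 1000
print(yearly_writedown(top_end))     # $1,000 off the books per year
print(yearly_writedown(mid_range))   # roughly $333 per year, a third of the hit
```

The top-end box carries three times the yearly write-down for, at best, a marginal productivity gain over mid-range.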

So to summarize: you should have good gear and good tooling, it makes a big difference, but top-end machines are not really justifiable for the "general developer".

... ah, and now I've read your edit, and that is what you are talking about. I'll leave the above since I've written it now ... yeah, your machine is under-specced for the tooling.

To clarify, a mid-range machine should have:

  • 2 cores minimum, 4 cores is good; any more at this stage is overkill.
  • 4 GB RAM is the minimum, 8 GB is good, and any more is nice to have.
  • An SSD should be standard, but really a 10K RPM WD or Seagate 80-100 GB drive should do fine.
  • 2 x 19" monitors as a minimum, with a reasonable video card.
Manbeardo
Robin Vessey
  • 24
    My machine fails all 4 of your bullet points - I had to beg to go from 512 MB to 1 gig of RAM, for example. We are not all just whining about not having the latest Alienware setup with cool LEDs and diamond plate. – Peter Recore Jul 19 '11 at 04:11
  • 1
    I'm not sure what you do, but for the kinds of stuff I compile, dual 6-core Xeons with hyperthreading wouldn't be out of the question. If I touch certain header files, I might as well go get coffee otherwise. – Bob Murphy Jul 19 '11 at 05:59
  • @Bob, possibly, but we have build servers and break up the components into bite sized chunks so we don't have to wait that long ... still if you have that much grunt, why the hell not ... see my comment about valid uses for high end machines (apart from they are cool) – Robin Vessey Jul 19 '11 at 07:37
  • 23
    "your pace of coding really isn't that different", that might well be true (if we ignore today's tools being huge resource hogs compared to back then), but I think it's fairly safe to say that what most developers gripe about isn't *pace of coding*, but *turnaround time*: how long does it take to make a change and see the effects of it in the running application? If the turnaround time from hitting run to seeing the change in action is 10-15 seconds, that's a completely different beast than, say, 5-10 minutes. Yet the amount of time spent coding can be essentially the same. – user Jul 19 '11 at 07:43
  • 55
    If only I had a machine at work with your 'mid-range' spec. – yoosiba Jul 19 '11 at 09:41
  • 1
    Most developers do not simply type text into a text editor and give it to a compiler. If that's all your developers do then I'd hate to use the software they produce! Sure that's part of it, but I wouldn't say that is most of what I do as a developer. Also, the text editors are generally now rich IDEs and not a console with vi. –  Jul 19 '11 at 11:12
  • @fwgx: For me, the last time stuff like Intellisense (R) and other nifty things involved performance problems was around the 1.5 GHz era. Beyond that, I never found that more machine power gave significantly better performance in everyday apps development. At home, where I do graphics programming, it does of course. Even on a netbook, I feel not much less productive (if at all, then because of the lack of some keys). – phresnel Jul 19 '11 at 12:05
  • 28
    FWIW, a lot of companies would consider your mid-range machine to be server class hardware! I am fortunate in that I do work for a place where we get these specs, but not everyone does. – Paul Wagland Jul 19 '11 at 12:12
  • +1 to Paul Wagland's comment about your mid-range machines. I work for one of the biggest software companies in the world, and our developer machines are not as good as your mid-range definition. – David G Jul 19 '11 at 12:54
  • 4
    @Bob Murphy: You really need IncrediBuild or a similar distributed compilation setup. It's far easier to justify a 12 core build server with 16 GB as a shared resource, if only because there's no personal jealousy involved in shared resources (plus, you typically pay servers from different budgets) – MSalters Jul 19 '11 at 14:27
  • 1
    This answer is the most correct answer. As an engineer who is also responsible for budgets: there is a difference between CapEx and OpEx -- CapEx needs to be paid up front, whereas OpEx (which includes your salary) is paid over the year. Upfront CapEx costs are typically "bad" as they provide nothing to show in terms of return on investment, and hence there is a natural tendency to minimize them. However, minimizing them so that the OpEx becomes ineffective (i.e. you) is just bad management. Incidentally, the shift of CapEx to OpEx is also why cloud services like Amazon have become popular. – Soren Jul 19 '11 at 16:54
  • 2
    Last year I bought an Acer at Fry's with 4 cores, 8GB RAM, a DVD RW (which I have never used) and a 750GB HD. I also bought an Acer 21-inch monitor that does 1080p resolution. Total cost, with 10% California state tax: $800. – Christopher Mahan Jul 19 '11 at 17:56
  • 3
    I just designed a developer machine for the small startup I work for, and I can vouch for @Christopher Mahan. The machine is great and only cost about $800 from Newegg. I even put it together myself, which I enjoyed. If only they would purchase computers for the rest of the developers. It really affects morale more than most people think. Enough to warrant an $800 investment in your employees. – Austyn Mahoney Jul 19 '11 at 21:47
  • In a company I worked for previously we had the same issue with a tightfisted manager; it ended with people privately buying parts and putting them into their office machines. It was just not worth the effort to argue about it. ;-) – AndersK Jul 20 '11 at 13:00
  • Good points all around, though I would point out some business application setups require more power (RAD-WAS, local DB, etc). – C. Ross Aug 02 '11 at 16:25
  • Two 19" monitors? 24" widescreen monitors from Dell can run you as little as $200 apiece. It's like having three or four monitors because I can tile things on the left or right side. – Kyralessa Nov 13 '11 at 01:32
  • Yeah, today that is true; over the last 6 years, not so much - they were more like $400-$800 AUD, which was out of the price range. – Robin Vessey Nov 13 '11 at 22:21
56

The difference in productivity between "top-end" machines and "almost top-end" machines is negligible. The difference in price is significant.

Not to mention the IT support for different machines, instead of having all the developers using the same HW and SW images (which you can't do if you're buying a top-end machine for every new hire - the top end will be different every time). Also, people who got last year's top-end will want to upgrade because that newbie in the next cube has a "better" machine than them, and they're oh so much more important, aren't they?

Unless you really need a top-end machine for your work, I see no reason to throw the money away.

littleadv
  • 12
    But the difference in cost is _more_ negligible. And my productivity takes a real hit when I have to close everything and reboot, which happens several times a week. If you have a different view of the relative costs, perhaps you could include numbers in your answer. Nevertheless, I agree that almost-top-end would be very satisfactory; I wish I had that. – Eric Wilson Jul 18 '11 at 19:42
  • 8
    In the same direction: the difference between almost-top-end and middle-of-the-pack hardware is enormous, and the difference in price is negligible. There is some amortization to be done on hardware, that's for sure, or we are just throwing money out the window; then again, in the case of devs, too much amortization also amounts to throwing money out the window! There is a sweet spot to be attained, and when taking into account the psychological aspects of keeping devs happy, it will tend to be closer to high end than to mid pack. – Newtopian Jul 18 '11 at 19:44
  • 24
    @FarmBoy If **your** productivity takes a real hit - go to your boss and justify an upgrade. You asked a *general* question, and my answer is for a *general* case. – littleadv Jul 18 '11 at 19:45
  • 8
    The cost of support for a wide variety of machines is incredible. Individual users tend to over look this (and they should, it isn't their job), but I've been at three companies that all came to the same conclusion. Cheap desktops + high end VM servers makes the most sense. – Christopher Bibbs Jul 18 '11 at 19:51
  • Yeah, the difference for your productivity really depends on context. When doing intense graphics work the extra horsepower would really help, but for web development it really doesn't matter if you have a top-end machine. Your *monitors*, on the other hand, can have a big effect on productivity for all work. – jhocking Jul 18 '11 at 19:52
  • 1
    @ChristopherBibbs your comment is the most useful so far, if it were an answer I would accept it. – Eric Wilson Jul 18 '11 at 19:54
  • 1
    @FarmBoy: Having to stop everything and reboot is more likely a software problem than a hardware issue. Having 8 GB of RAM might hide the problem for a longer time I guess. – Zan Lynx Jul 18 '11 at 20:14
  • @ZanLynx you may be right, and it may be that this is more about the required bloated enterprise software. – Eric Wilson Jul 18 '11 at 20:18
  • 1
    @ChristopherBibbs for routine office tasks, you're correct, but for software development, slowing down the expensive dev work because of support costs is rarely a profitable proposition. Besides, most devs run as admin and set up their own environment anyway. – dbkk Jul 19 '11 at 11:28
  • 9
    This is a strawman; nobody's talking about top-end vs. near top-end. In my experience, it's between good vs. ridiculously insufficient. – niXar Jul 19 '11 at 11:50
  • @dbkk All of the companies I worked at were ISVs and the analysis was for developers. Nothing about being a developer and running your OS in admin mode prevents you from using a good hosted VM. For $20,000 you can build a decent VM server that will host 10 developers quite well. 10 high end workstations will cost you well over that. The economies of scale kick in when you have to equip 50-100 developers and you can have monster servers. – Christopher Bibbs Jul 19 '11 at 13:28
  • @niXar: I agree. My work machine is a Dell D-series laptop with 2GB of RAM, full of enterprise bloatware. Even a middle-of-the-road desktop would be a large improvement. – Joshua Smith Jul 19 '11 at 13:36
  • 2
    Supporting different machines is easily made a non-issue. Every other year, on Jan. 1st - everyone gets the latest and greatest. The old ones go to customer support for answering emails, and/or charity. IT should be happy to get new training on a regular basis to keep up with the changes. And the cost for replacing 10 laptops is still less than the cost of recruiting a single good replacement developer for the guy that leaves due to frustration. – Matt Van Horn Jul 19 '11 at 13:45
  • @Matt Van Horn: What if a new employee comes along and needs hw? – Jonta Jul 20 '11 at 10:52
  • @Zan Lynx: many strange things happen on old hardware if the RAM modules start to fail. Frequent resets are a common symptom of such thing – Fabricio Araujo Jul 23 '11 at 19:24
  • @Zan Lynx: (continuing) hardware failures can go exotic and create heisenbugs out of the blue: Marco Cantù (a distinguished member of the Delphi developer community) has seen this himself: http://blog.marcocantu.com/blog/unbelievable_memory_failures.html – Fabricio Araujo Jul 23 '11 at 19:35
  • "The difference of productivity between...machines is negligible. The difference in price is significant." What is the value if you were to quantify this *negligible difference*? -10 min/day ? Suppose the top-end machine is $3000, and assume that almost-top-end is almost-$3000, or $2500. The difference is $500 a fixed cost distributed over 2-3 yrs. Is this your definition of *significant*? – T. Webster Dec 03 '11 at 04:44
  • @T.Webster - I spend much more than 10 minutes a day on coffee/restroom breaks (some of them combined with my machine compiling the code in the mean time), so this difference is **really** negligible - the time wasted would have still been wasted. $500 over 3 years is half a million dollars (over three years) for my company. Considering the fact that the company is still in the red - **yes, this is significant**. It's 2 more experienced developers that we could hire for these 3 years. – littleadv Dec 03 '11 at 06:22
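The break-even arithmetic being debated in this thread is easy to sketch. The figures below are the question's illustrative numbers (a $60/hour developer and a $2,000 hardware saving over three years), not measured data:

```python
# Break-even sketch: dollar value of daily lost minutes over a machine's
# 3-year life, versus the money saved by buying the cheaper machine.
# All figures are the question's illustrative numbers, not measurements.

HOURLY_RATE = 60          # $/hour, the question's developer cost
WORK_DAYS_PER_YEAR = 250  # rough working days in a year
YEARS = 3                 # typical machine lifetime

def lost_time_cost(minutes_per_day):
    """Cost of a daily productivity loss over the machine's lifetime."""
    hours = minutes_per_day / 60 * WORK_DAYS_PER_YEAR * YEARS
    return hours * HOURLY_RATE

hardware_savings = 2000   # saved up front by buying the cheaper machine

for minutes in (1, 5, 10):
    print(f"{minutes:>2} min/day lost -> ${lost_time_cost(minutes):,.0f} "
          f"over {YEARS} years (vs ${hardware_savings:,} saved)")
```

Under these assumptions the cheaper machine only wins if it costs well under five minutes a day; at ten minutes a day the lost time ($7,500) dwarfs the saving, which is the question's point.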
27

Because most employers do not understand how developers think, act, or work, or how top tools can save the company money while increasing productivity. This leads to the loss of a point on the Joel Test: failure to provide "the best tools money can buy". It also leads to lost productivity and job satisfaction. That's just the way it is. Maybe one day you can start your own company and score 13/13. Until then, ask questions up front with your employer so you know what to expect before ever taking the job.

As far as your current situation, if you feel they listen and trust you then bring up the discussion. See if they will give you an upgrade. I know I'd work a little bit longer if I had a top of the line rig with dual 50" monitors to work with. Stick me in the matrix.

It's the same reason people want a Mercedes CLS when a Toyota Camry gets you there just the same. Sure, you may only shave a few seconds off compile time with a new machine, but appearances do matter.

P.Brian.Mackey
  • I find this is an important and inexpensive (in monetary terms!) form of motivation. It generates all sorts of positive attitude to the company and work, gives you the feeling of being valued, ... It's not to be forgotten that "brain workers" don't work for money. – slovon Jul 19 '11 at 06:31
  • Agreed. If people enjoy their work, they're more likely to do good work (to The Obviousmobile™). Getting great tools seems like a very easy way to increase employees' enjoyment. – Jonta Jul 20 '11 at 10:55
12

Your math doesn't seem to include the time required to manage the constant flow of hardware into and out of the company -- it would take an extra IT guy or two depending on the size of your company, so tack another $50-$100k/year on top of your numbers. Plus, you lose productivity on the day they swap your computer out. If they skimp on dedicated IT staff you'll have to do the backups and restores yourself, possibly losing a day or two in the process. In other words, I think it's a bit more complicated than you think it is.

Bryan Oakley
  • 5
    It may well be more complicated than I'm figuring, but I'm not asking for more frequent upgrades, just better quality at the time new hardware is purchased. – Eric Wilson Jul 18 '11 at 20:09
  • I generally found that even after they (enterprise IT) did the backups and restores I still had to fix things. I generally asked them not to do anything other than give me the standard image; I would take care of the rest. (Also an opportunity to clean things up a bit.) – Ken Henderson Jul 19 '11 at 01:37
  • 4
    What you say is true; however, it also ignores the fact that most of this still needs to happen anyway. The poster's idea is to go from high to low on the scale, not low to very low. – Paul Wagland Jul 19 '11 at 12:14
  • This is one of the more realistic answers. Especially for very large companies, the desktop services division of IT support is aligned around macro-efficiencies, which means policies that are effective for 20,000; 50,000 or even 100,000 employees, of which usually only a tiny fraction have specialized needs like a developer. The cost of handling those exceptions in the context of the huge machine can be quite large. – Rex M Jul 19 '11 at 17:34
9

One problem with your argument is cash flow: if they don't have the money, the point is moot. The other is return on investment.

This may not apply to the companies where you've worked. Some companies are highly leveraged and/or cash poor. They would rather spend the savings you describe on something that will sell more widgets or software. You have to show that your gain in production outweighs an equal investment in other areas.

If a software company is in maintenance mode and needs more sales, there may be a better return on spending the money on sales and marketing.

I think you need to show that, in your case, the money is better spent on a programmer's hardware than on another area of the company.

Be careful with this argument if you're on salary. They'll just want you to work harder to make up the difference ;)

Neil N
JeffO
  • 6
    Then they shouldn't be hiring developers. Sure, if you've no money or there's no prospect of the investment being repaid, you can't/shouldn't be spending. The irrationality is in spending a lot of money on expensive resource (developers) while pennypinching on a cheap resource (hardware). If the excuse is that these are separate budgets, that just pushes it back a step: the irrationality is in having a massive personnel budget combined with a tiny hardware budget. – rwallace Jul 19 '11 at 13:04
  • 1
    Company can borrow money to buy better machines. – Kamil Szot Jul 19 '11 at 19:17
  • This is a bad management attitude: "Be careful with this argument if you're on salary. They'll just want you to work harder to make up the difference." I hereby promise to work 0.5% harder to make up the difference between buying $750 worth of hardware every three years, and buying $2000 worth in that time. (I don't need to promise that, since my better tools will almost certainly make that happen automatically, but we'll just ignore that point.) I could understand maybe having a few tough months, but these expenses should be extremely manageable. If they aren't, your company is in trouble. – PeterAllenWebb Jul 20 '11 at 15:59
8

I made this argument at my work for switching from laptops to desktops. I said everyone should be on a desktop, and if they need a computer at home, get them one there too.

The speed advantages of a good computer are not negligible, especially if you remove crashes from really old hardware.

Concerning "top of the line" versus "near top of the line": I would argue near top of the line is always where you should be. At near top of the line you can upgrade every 2 years instead of 3 and end up with better hardware on average.

I recommended cyberpowerpc.com and my company let me purchase a PC from them (I'm the marketing guy), but they bought all the programmers' PCs from Dell because the support was worth the extra cost. Think about that: it's 1.5-2x the price to buy a PC from Dell, but as you'll all appreciate, if a PC goes down and you can't fix it fast, you lose money.

A slow PC is like a broken PC you aren't repairing.

Chris Kluis
  • BTW, every developer should have a PC capable of powering dual 1920x1200 monitors. If your PC cannot do that, then it is definitely time to upgrade. – Chris Kluis Jul 19 '11 at 12:14
  • You switched *from* laptops to desktops? I just don't understand some people. I'd much rather have the laptop. It goes to meetings with me where I have everything at my fingertips to answer questions and make quick notes. I can easily work at home without spending time configuring two work environments. It's also a free second monitor. – Zan Lynx Sep 30 '11 at 20:55
  • I heard somewhere that Microsoft frequently gives some employees two computers, so they can hit compile on one PC and switch to the other while the first is busy. I have no problem with also providing a laptop, but the speed difference between a laptop and a desktop is tremendous, and the monitors on most laptops are a joke. – Chris Kluis Oct 03 '11 at 11:52
6

There's also a question of budgets: usually developers are paid out of a different budget than hardware for said developers, and there might simply not be enough money available in the hardware budget.

Timo Geusch
  • 4
    Arguably that doesn't fully answer the question (it's more about the mechanics). The follow-up would then be *why **is** the hardware budget undersized*, if you accept the premise that you should spend e.g. 2% of developers' salary on workstations? – Andrzej Doyle Jul 19 '11 at 09:26
  • 1
    @Andrzej, you do make a good point. Part of it depends on the size of the organisation - large companies seem to be especially reluctant to give developers high-spec machines as they tend to have standardised their hardware on the Excel jockey level. Smaller companies usually are more flexible, but also have less money to throw around. – Timo Geusch Jul 19 '11 at 20:46
6

First, to answer the question asked:

They can't do the math, or if they can, they somehow believe that it doesn't apply to them. Budgeting and accounting for hardware and personnel are separate. People in decision-making positions have never heard of the issue and are totally unaware that a problem exists at all.

Now, to the real question: "How do I handle this situation?"

It's essentially a communication problem. You explain the problem and the interlocutor hears "blah blah blah, we want shiny new toys". They just don't get it.

If I were in your position, I would make a quick video titled "Can we afford old computers?": stills of a typical workstation, with a blank area on the right titled "cost".

A still of the power button. Below: "Starting the computer: 20 minutes". In the blank area: "Starting the computer = $40". Then "Opening IDE = $5", "Computer freeze = $80", "Building the product = $600".

Run through it at a quick pace, keep adding the numbers, then compare with the cost of a new computer, and don't forget to end with: "This video was produced at home on a $500 store-bought laptop that outperforms all the 'professional' development machines currently available."
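The running tally itself is trivial to script. A minimal sketch, where the delay times and the hourly rate are placeholders to be replaced with your own measurements:

```python
# Price each daily delay at the developer's loaded hourly rate, in the
# spirit of the "Can we afford old computers?" video. All delay figures
# below are placeholders, not measurements.

HOURLY_RATE = 120  # $/hour loaded developer cost (assumption)
WORK_DAYS = 250    # working days per year

daily_delays_minutes = {
    "booting + opening apps": 20,
    "freezes and reboots": 10,
    "waiting on slow builds": 5,
}

total_minutes = sum(daily_delays_minutes.values())
per_day = total_minutes / 60 * HOURLY_RATE
per_year = per_day * WORK_DAYS

for task, minutes in daily_delays_minutes.items():
    print(f"{task}: {minutes} min = ${minutes / 60 * HOURLY_RATE:.0f}/day")
print(f"total: ${per_day:.0f}/day, ${per_year:,.0f}/year in waiting time")
```

Even with modest placeholder numbers, the annual figure comfortably exceeds the price of a new machine, which is the punchline the video is after.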

If you are concerned that raising the issue will cause problems for you, you could also just bring in your own laptop to work.

If there is no way to get that issue across, then perhaps you should consider finding another job.

Sylverdrag
4

Discounts play a big part in the buying process as well.

Spit ball (not real numbers): 100 machines @ $1,000 w/ 15% discount = $85,000

90 machines @ $1,000 w/ 10% discount = $81,000, plus 10 machines @ $2,000 w/ 5% discount = $19,000 => $100,000
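A quick script confirms the spit-ball numbers above (the prices and discount tiers are invented for illustration, as the answer says):

```python
# Bulk-discount comparison: one uniform order of 100 machines earns a
# deeper discount than splitting the order to give 10 people pricier
# machines. Prices and discount rates are the answer's invented figures.

def order_cost(units, unit_price, discount):
    """Total cost of an order after a fractional discount."""
    return units * unit_price * (1 - discount)

uniform = order_cost(100, 1000, 0.15)
split = order_cost(90, 1000, 0.10) + order_cost(10, 2000, 0.05)

print(f"uniform order: ${uniform:,.0f}")  # $85,000
print(f"split order:   ${split:,.0f}")    # $100,000
```

The $15,000 gap comes less from the pricier machines themselves than from losing the deeper volume discount on the whole order.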

As has been already mentioned, the extra cost in supporting the "special" machines needs to be added in the mix.

bart
  • 4
    Would there really be much support difference if they were the same machines with more RAM, and maybe faster hard drives? – Eric Wilson Jul 18 '11 at 20:10
  • @FarmBoy - RAM upgrades are usually simple and easily implemented. HDD's - more problematic (because it's more expensive), but doable. I've upgrade RAM on my laptop, just had a PO signed by my boss, did it myself. Replacing the whole laptop though was entirely out of the question. So there's a difference. – littleadv Jul 18 '11 at 21:51
  • @FarmBoy: Depends on "faster". If you're talking about replacing a 5400RPM with a 7200, then probably not, as those are both fairly common. If you mean replacing a 7200 with a 10K, then possibly, as 10K drives are less common, and therefore may be harder to source. And the failure rate on SSDs is high enough that the company will probably have to stock a few replacement drives, so that adds up too. RAM is definitely worth it, though. – TMN Jul 19 '11 at 11:57
4

Personally I have always had at least an OK development computer when I worked for a 'small' company, but when it comes to big companies, programmers are a dime a dozen compared to a project manager holding a budget.

Especially if he/she is one of those with great ideas; read: budget approved.

Whatever the 'good' idea, that person will need really good programmers to actually implement the "new 'better' product", so they will pay the programmer the price needed.

Getting a new development computer, as far as I have seen, does not go through the same 'department' as the salary budget, though, so do expect to work under bad conditions even if you are paid well :-) My last job: Dell E5xxx + one 1280x1024 LCD ...

Valmond
  • Large companies are taking a beating on this site today. I take exception to your claim about a dime a dozen. It needs to be reworded to "average and bad programmers are a dime a dozen". If you are good, particularly if you are very good at a large company you will get noticed and you will not be regarded as a dime a dozen. If you work (worked) for a large company and felt like you were regarded as a dime a dozen then I would suggest that you might not be as good as you think you are as a programmer. Very talented programmers are a rare find although everyone thinks themself as very talented. – Dunk Jul 18 '11 at 22:47
  • 1
    Nah, you've got it the wrong way around: what I'm trying to stress is that even if that project manager can pay you what you're worth, the guys over at 'buying the computers and maintaining them' don't run on the same budget. I earned more in a day at my last job than that computer cost... Had I stayed longer I would probably have bought myself another computer + screen, but there were other problems, like working in an extremely hot and noisy environment (because that was cheap, not because there was any real need). – Valmond Jul 19 '11 at 08:09
  • Ok, maybe I have some points wrong, but ALL WRONG. LOL. My point was that if your manager places high value on you then they will see to it that you get the equipment that you want regardless of the politics involved. Of course that assumes a minimally-competent manager. – Dunk Jul 19 '11 at 18:13
  • Of course I didn't mean "ALL WRONG! BAN! BAN!" :-) and sure if you work for say at least a year in any company and you haven't got the tools required for working at least 'correctly' I'd say you'd better quit and find another. Big companies are complex though and even if the project manager is smart and listens to you, another department might not (listen to him). Well, that's my experience anyhow :) – Valmond Jul 19 '11 at 18:37
3

I was asked to spec out the machine I wanted to use here, within a fairly tight budget. I managed to come up with a halfway decent system that works despite not being perk heavy.

I was originally thinking along the same lines as the OP here: the time I sit waiting for compiles or loads is money out the window. As I've gone along, I've also recognized that the time I spend going to get a coffee or walking to the printer is also money out the window.

Rather than worry about the small amounts of time that I do have to wait, because we went with a less expensive development system, I've looked at my own habits and improved the larger amounts of time I spend doing nothing particularly useful (ahem... stackexchange is useful, and productive to boot, and I'm sticking to it!! :-) ) Of course we need breaks, but this is time other than "breaks".

So in a way, in a general sense, this question could be the "premature optimization" of work efficiency. There are many great points here about migration costs, losing out on volume purchasing, etc.

In your particular situation, where you are losing time on the order of a break just to reboot and reopen programs: yes, it makes a lot of sense to upgrade to decent equipment, as your productivity is seriously impaired. A halfway decent i3 system with 4 GB RAM is on the order of $500; I'm sure it won't take long to recoup that cost.
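As a back-of-the-envelope check on that last claim, assuming the $60/hour rate from the question and that the new machine recovers the roughly 20 minutes a day of rebooting and reopening described there:

```python
# Payback period for a modest upgrade: working days until the recovered
# time pays for the machine. The hourly rate and time saved come from
# the question; the $500 price is this answer's estimate.

UPGRADE_COST = 500           # $ for a halfway-decent i3 + 4 GB RAM
HOURLY_RATE = 60             # $/hour from the question
MINUTES_SAVED_PER_DAY = 20

daily_saving = MINUTES_SAVED_PER_DAY / 60 * HOURLY_RATE  # $/day
payback_days = UPGRADE_COST / daily_saving

print(f"saves ${daily_saving:.0f}/day -> pays for itself "
      f"in about {payback_days:.0f} working days")
```

At those rates the machine pays for itself in roughly five working weeks.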

jm01
Stephen
  • It could be "premature optimization" if I was starting a company without having seen hardware as a slowdown. But currently it seems a significant bottleneck, and a cheap one to fix. – Eric Wilson Jul 18 '11 at 20:02
  • 3
    You need breaks regardless. But minimizing breaks in flow is critical to developer productivity. If a developer has to wait more than about 30 seconds to get feedback from the previous action, work will slow significantly. – kevin cline Jul 18 '11 at 20:11
  • @FarmBoy If it's a significant bottleneck, then making a business case for it to management makes sense. – Stephen Jul 18 '11 at 20:11
  • @Stephen If this company were 1/100th of it's current size, I would consider making that case. – Eric Wilson Jul 18 '11 at 20:16
  • @FarmBoy As I said in the edit above, as an example, $500 can do wonders for your productivity, if you can't get them to see that case I think you have other questions to answer. – Stephen Jul 18 '11 at 20:22
  • 1
    +1, you can definitely get a sweet machine for not much money, if you optimize for developer productivity. Good graphics card? Almost certainly a waste of money. Huge hard drive? Often not necessary. But RAM? As much as you can get. If you spend smarter, not more, you'll do well. – Carson63000 Jul 19 '11 at 00:52
  • I think the idea is to take breaks when _you_ need to take breaks, not when your _computer_ decides you need a break. – sevenseacat Jul 19 '11 at 05:21
3

Buying new hardware involves money; money involves decision makers, and usually they're not developers if your company is big enough. Of course there are exceptions...

As @Rob explained, there are a lot of reasons why you won't get the best hardware. Your company may have a policy defining what kind of hardware is bought, and as always with bureaucracy, it's hard to have a bleeding-edge policy. Many managers won't bother adapting it to your personal needs, etc.

Poor communication, risk aversion and other flaws:

Let's say you have really crappy hardware, it's no longer possible to work in these conditions, and you want to do something about it.

Now you have to go convince your manager. Well, usually you'll have to convince your project manager, who tells your manager, who reports to his boss, and you'll need to make sure that that guy really understands your issues.
This involves communication skills and the technical understanding of the management.

Second step: if you're lucky, the management will think about it. What do they get?

  • You'll work faster, with some uncertainty (they don't directly get money, as you'll try to explain).
  • It'll cost money, now.

That means they'll have to trade money, and their current planning of your work, for a possible opportunity to let you do something else in the future; that's an investment, but also a risk.
Sadly, many managers are risk-averse. Not to mention that the poorer their understanding of your issue, the riskier it appears. Some may also have a hard time admitting that someone did not buy suitable hardware in the first place.

Moreover, management usually has a shorter definition of what "long term" means. If they're asked to do some sort of monthly budget optimization, they may even have direct financial incentives not to buy you new hardware! And they won't care about the two weeks you may save six months later.

Of course you don't always have to wait so long when you can do wonderful stuff in one day!

That works better if you have smart and open-minded managers who listen, understand your issues, are ready to take reasonable risks and trust you enough to let you explore creative ways to use the freed time.

That's not always the case: I waited 3 months to get a graphics card to connect my second screen while being forbidden to buy it myself (30€), lost 3 days for want of an extra 500GB HDD, and regularly had to wait several hours when preparing data for the client because of the slow 100Mbps network. After asking several times for 2GB of RAM, I was told to buy it myself and stop bothering the management with those technical issues. And we were doing scientific computing for a big industrial client who was ready to pay the price...

Maxime R.
  • 1
    Well said, good analysis on the *why*. However, if it's going to bad, you can dissipate some upgrade spray through the dedicated case openings (http://www.globalpackagegallery.com/main.php?g2_view=core.DownloadItem&g2_itemId=46117&g2_serialNumber=2). – peterchen Jul 19 '11 at 15:45
  • Lol, upgrade spray; they would have loved it! Thankfully I no longer work for them :) – Maxime R. Jul 19 '11 at 15:59
3

Math aside, not all of your users are likely to have top-end machines. Developing on a machine that is spec'ed closer to something average in price will acquaint the developer with the experience (and pains!) of their users.

Your QA department may have a min-spec machine, but how often is it used? Developing on a machine that is a realistic target environment exposes issues early (unresponsiveness, poor performance, race conditions exposed by that slow performance, etc.), which drives teams to fix them sooner.

Justin Johnson
  • Of course, this doesn't apply to those of us who don't write desktop apps. – Eric Wilson Jul 19 '11 at 08:25
  • Sure it does. Flash apps and even heavy JS web apps also benefit from being run on lower-spec machines. – Justin Johnson Jul 19 '11 at 08:41
  • Fair enough. I should have said, "This doesn't apply to those of us that don't write desktop apps, or apps with heavy client-side interaction." Which is still a lot of devs, and ironically, these are among the most likely to be on poor hardware. – Eric Wilson Jul 19 '11 at 09:02
  • 8
    I've heard this before, and I think it's a false analogy. If it were true, then cars would be built using hand tools and power drills because that's what drivers have at home. A low-spec machine should be used as part of usability testing, but not for development. – TMN Jul 19 '11 at 12:06
  • I am just optimizing for netbooks because I currently do most of my hobby development on one of those. This includes improving graphics rendering performance as well as UI optimization. This has my up-vote. – phresnel Jul 19 '11 at 12:08
  • 1
    This answer points out an interesting thing. I've seen a game fail badly when it was released: most of the users couldn't read the text in the interface, because the developers all had 21-27 inch screens; scaled down to those 15-inch laptops, characters were rendered at 6px. However, being close to users' specs is needed for tests, which should be done by testers, not developers. – BiAiB Jul 19 '11 at 14:13
  • I think a lot of answers/replies here have many exceptions. In the case here where the developer should also be doing some testing, they need to feel the pain of their target market. Best case, they have two computers: one for coding, one for testing. – Lynn Jul 20 '11 at 20:28
3

One big factor is the kind of bloatware that the IT department in a typical big company tends to put on the laptop. If you have a Windows 7 machine at home with just some antivirus, a standard SSD, 3GB, quad-core system will boot up in less than 10 seconds. Compare that to the bloatware my company puts in, and it takes forever to boot. I have seen some folks zapping the OS completely and installing their own to speed things up. I think that solves the problem to an extent, although it is a huge InfoSec violation. But seriously - 10 minutes?!

  • That counts the time to open Lotus Notes, Eclipse, Firefox, and maybe a few other things. – Eric Wilson Jul 19 '11 at 20:58
  • 10 minutes? My work machine is a Dell E-series laptop. Time from cold boot to having Visual Studio and Lotus Notes open averages 18 minutes. It usually works out to around 5 minutes to reach the Windows login prompt, then another 12 or 13 minutes to reach a usable desktop. – Joshua Smith Jul 20 '11 at 15:11
  • 1
    IT here on loan from Serverfault. 10mins is inexcusable, but unfortunately common. When I start at a new shop I spend the first few weeks turning off all the crap that someone thought would be a good idea to run on startup. Antispyware scan -> Antivirus scan -> 100s of nested GPOs. My new Win 7 desktops boot so fast I had to tune the switches because they were booting faster than the NICs could auto-negotiate. Hell I can **reimage** a station in less than 10minutes. – Ryan Jul 20 '11 at 19:52
3

In large corporate organisations the choice of hardware is pre-defined and locked down, because such organisations have fixed, centrally managed desktop and laptop specifications and configurations. The specifications will have been dictated overwhelmingly by a combination of "procurement" and "support" considerations. The company I am currently working at, for example, has over 100,000 employees and works on the basis that "one size fits all", and that size will have been driven primarily by commercials.

Once such policies are in place, they are locked down, because support services usually invest a considerable amount of time in testing and deploying software to that "standard" machine specification. Arguments about "developer" productivity in such environments simply fall on deaf ears; production services are not going to make an exception for a small group on the basis that they may be more productive. If they did, they would quickly be swamped with requests for deviations, and in any event they (production support) are incentivised to keep the support cost as low as possible; more than one desktop/laptop configuration increases the support cost.

In an organisation whose primary "product" is the result of software engineering, such arguments are invalid, but the reality is that most organisations are not, and the key driver is keeping support costs low.

2

Simply because the best hardware does not make the 'best' developers! That being said, the company is to blame if it is hindering the work of the programmer.

However, if the hardware is sufficient for the developer to work, then he has nothing to complain about.

Also, there's no point in having the 'best' hardware if all you run on it is an IDE; that's a waste of resources.

Sterex
2

"We have met the enemy and he is us." - Pogo

Either way you slice this question, the collective group "programmers" bears direct responsibility for any failure to buy the best tools in the workplace.

  1. Business finance is incredibly complicated with numerous conflicting motivations and levers. Without concrete knowledge of what your finance department is currently tracking (tax avoidance, managing quarterly expenses, driving up future capital expenses, maximizing EBITDA or whatever else is on their radar), any discussion of true costs is irrelevant. How would you react to a marketing person bugging you about compiler optimizations for code you know is about to be transitioned to an interpreted language? If programmers can not demonstrate in specific terms how the tools they have don't contribute directly to the bottom line, the business is correct to spend as little as possible. We also have to learn to listen to business finance so we can understand the realities facing resource allocation.

  2. We as a group vote with our presence in the workplace far louder than by asking for better tools, submitting the most awesome white paper to our managers, or even posting on the internet. There are organizations that have created a culture of ensuring their employees either have the tools they justifiably need or understand why not at the moment. Until competitive pressure requires this from the majority of employers, we can only vote by seeking out employers we believe in.

Each of us has to either make this something that matters to the core, or let it go.

bmike
  • 182
  • 2
  • 10
2

I used to be a developer at a large company and then a startup. Here are my two cents:

  1. 8 GB of DDR3 DIMM (2x4GB) costs $50-$55 today (circa July 2011)
  2. A 21" LCD monitor costs $200 (circa July 2011)

If your company allows you to bring your own equipment, just use your own $ and upgrade the RAM and LCD monitor. Why, you ask?

  • isn't your own productivity something you value?
  • aren't your eyes worth $200?

You can always take the monitor with you when you quit the job (remember to clearly label it as your personal property). I've done the above recipe (upgrading RAM and using my own LCD monitor) in both my previous jobs - and my current job.

Init Fini
  • 1
  • 1
  • I often buy my own machine for work. I spend 8+ hours a day on that computer, it's worth $2k every couple of years to work on something fast. – karoberts Jul 19 '11 at 18:13
2

I don't see how you can group all employers together in one basket. I've worked for a few employers, as an employee and as a consultant, and always got hardware that was more than sufficient for my needs. At my current job I was handed a bright shiny new HP quad core with 4 GB RAM and Win64 on the first day - not top of the line, but quite sufficient (I use Delphi XE and XMLSpy as my main development tools) - in fact so nice I went and bought the same machine for myself at home. (Maybe I'm not all that productive! LOL.)

If you don't get good hardware, try asking for it - and if you feel you can't ask for it, you're probably not working at the right place because they don't view developers as a resource, but as a liability.

So I guess the answer to your question is: those companies that don't and/or refuse to provide sufficient hardware for a developer are companies that consider their developers a liability - jobs they'd rather outsource and not deal with at all.

Vector
  • 3,180
  • 3
  • 22
  • 25
2

CFO side.

The company has a lot of expenses. Every department needs more $ in order to do better and in every department the expense is a must.

When you come to choose the best way to use the available $, you take into account:

  • How much do they need? Smaller sums are easier to approve.
  • Will it increase sales? Better PCs usually don't contribute directly to increased sales.
  • Does the department like to spend $, or do they understand cash flow? Most R&D departments I have seen have an arrogant "we deserve the best" approach. This is understandable, as they earn a lot of $, and when you do, you think you deserve the better things in life. The $ requests of R&D teams usually give the feeling of a spoiled child demanding more toys while his parents are struggling. "A delicate genius".

The 10-minute-a-day waste is not a line of reasoning that would work with most finance departments. Most R&D teams waste a lot more on all the non-programming activities they enjoy during the day. Let's chart all the waste in your department and see what can be done to improve productivity.
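For reference, the break-even arithmetic both sides are arguing over is easy to sketch. All figures below are illustrative assumptions (a $60/hour loaded rate, ~250 working days a year, a 3-year machine life), not data from either side:

```python
# Break-even sketch: hardware savings vs. lost developer time.
# Every figure here is an illustrative assumption, not measured data.

HOURLY_RATE = 60          # loaded cost of a developer, $/hour (assumed)
WORKDAYS_PER_YEAR = 250   # ~50 weeks * 5 days
LIFESPAN_YEARS = 3        # typical machine replacement cycle

def lost_cost(minutes_per_day: float) -> float:
    """Dollar cost of a daily productivity loss over the machine's life."""
    hours = minutes_per_day / 60 * WORKDAYS_PER_YEAR * LIFESPAN_YEARS
    return hours * HOURLY_RATE

for minutes, savings in [(10, 2000), (1, 2000)]:
    print(f"{minutes:>2} min/day lost: ${lost_cost(minutes):,.0f} "
          f"vs. ${savings:,} saved on hardware")
```

With these numbers, 10 minutes a day costs about $7,500 over three years (the question's own figure), while a 1-minute loss costs only $750; the finance-side retort above is that the 10-minute figure itself is the part that needs proving.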

3seconds
  • 1
  • 2
  • Your biggest problem as CFO is that all your departments are bringing you positive-ROI propositions and your only problem is to figure out which is the *most* positive ROI? Sounds awesome. – PeterAllenWebb Jul 20 '11 at 16:44
  • 10 minutes @ $60/hour is $3600/year, or $10,800 over a 3-year period (the computer's life). At $100/hour, $18k. An acceptable machine can be had for $800. There are other costs beyond the simple waste of time as well. Recruitment cost, for one. I am never going to work 10 hours a day in front of a CRT monitor for a company which considers that a $50 saving is worth damaging my eyesight. You are saying that a company which can afford to waste $$$ on paying engineers to stare at a frozen screen & sabotage its own recruitment efforts ($$$) cannot afford tools in proper working condition? – Sylverdrag Jul 22 '11 at 03:25
1

Simply put, purchasing decisions are often made by bean counters (Accountants, and middle managers) rather than by project managers.

Lots of people have given potential reasons, and all of them are a factor in one situation or another, so there isn't any single overriding cause. Buying equipment at large scale may mean they lose some money on programmer productivity, but gain money in other areas.

Still, it often just comes down to a budget. You have to fit in the budget, and that's all there is to it.

Erik Funkenbusch
  • 2,768
  • 3
  • 22
  • 27
  • You bet, the Accountants would **love** big monitors for their spreadsheets, but the IT departments want to give everyone the same kit as they have for the last n years! – Ian Jul 19 '11 at 08:37
  • 1
    That doesn't explain why programmers can't talk to the bean counters and demonstrate why money is being left on the table by the business by failing to get the correct tools. The budget serves the business need - programmers have to demonstrate needed tools to expect budgetary consideration. – bmike Jul 19 '11 at 15:03
  • 1
    @bmike - I don't know about the companies you've been at, but in most cases programmers are not allowed to talk to the bean counters. I mean, nothing stops a programmer from catching one in the hall and having an informal conversation, but they would ordinarily be told to "use the chain of command". – Erik Funkenbusch Jul 19 '11 at 20:42
  • 2
    +1 - to get it back at least to 0 - IMO this is a very well informed and accurate answer, particularly in larger shops. The developer should talk to an accountant about how he needs to spend $1000 more than regular people on his hardware? Hard to imagine... – Vector Jul 20 '11 at 13:41
1

I used to work for a networking company where they upgraded RAM from 512 MB to 1 GB last year. We were working with f**king CRT monitors in 2010. The funniest part was that the managers' hardware was upgraded to 2 GB of RAM. Why on earth anyone would want 2 GB to create damn PPTs, and how someone would develop applications with 1 GB of RAM, I will never know.

1

It comes down to who handles the money. In a larger organization, IT is given a budget of, say, $1M for the year. That includes support salaries, servers, etc. They have to spread it around between all their resources. They cut deals with vendors like Dell or IBM to get x number of the same kind of computer, which they give to everyone from customer support to the programmers. They also get deals on support, etc., when they only have to maintain a limited set of models. They are not programmers either; I have had numerous arguments with non-programmers about computers. When I went over my IT manager's head for some new hard drives one time, the CEO said buy them, and boom, everybody finally had enough disk space to run virtual machines.

I actually blew up and cussed out my boss because IT was going to take away my 19" second monitor because I had a laptop. They stiffed me on that too, giving me a 13" model when others were getting 15". That goes back to politics in IT, which is another problem. It's kind of us-vs.-them thinking sometimes.

Bill Leeper
  • 4,113
  • 15
  • 20
  • When I worked for a very small company with LESS BUDGET than any of the other places, I had the nicest machines. Why? Because there wasn't even ONE IT PERSON'S SALARY coming out of the total annual IT budget. Want a fast rig? Be a sole developer in a non-software-shop that has no IT people except you. Or start your own company, and don't go down the road of IT insanity. :-) We ran non-engineering machines into the ground, but engineers (CAD/R&D/Developer) got top end stuff. Because engineers ran the company. – Warren P Jul 20 '11 at 23:35
1

From the perspective described by the asker, the question makes complete sense. However there are more costs involved with keeping hardware current.

Here are some of the costs that also need to be considered:

  • requisition cost (research and details that goes into purchasing)
  • installation & configuration cost
  • support & maintenance cost
  • software licensing cost
  • disposal / upgrade cost

In some cases, these can be 2-5x greater than the cost of the hardware itself. Even more if there is sophisticated software licensing involved.

In general the scale of these costs depends on the size of the company or the complexity of the organizational structure. Smaller teams with direct access to purchasing power can keep these costs low, whereas in a larger organization these costs can get very high.
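A rough sketch of how these overheads change the picture; every figure below is a made-up placeholder for illustration, to be replaced with real quotes from procurement:

```python
# Total-cost-of-ownership sketch over a machine's life.
# All dollar figures are hypothetical placeholders, not real quotes.

hardware = 3000            # purchase price of the machine itself
overheads = {
    "requisition":        300,  # research, approvals, purchasing process
    "install_config":     400,  # imaging, setup, deployment
    "support_maint":     1500,  # help desk and repairs over the lifespan
    "software_licenses":  800,  # per-seat licensing tied to the machine
    "disposal":           100,  # decommissioning and data wiping
}

overhead_total = sum(overheads.values())
tco = hardware + overhead_total
print(f"Hardware: ${hardware:,}; overheads: ${overhead_total:,}; "
      f"TCO: ${tco:,} ({tco / hardware:.1f}x the sticker price)")
```

As the comments below point out, most of these overheads apply to any machine, cheap or expensive, so they shift the total cost but not necessarily the comparison between configurations.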

Joshua
  • 103
  • 5
  • My premise was that better hardware could be purchased, not that hardware would be purchased more frequently. That eliminates all of the costs you mentioned except possibly additional support & maintenance. – Eric Wilson Jul 19 '11 at 15:24
  • First of all, all of those costs MUST get factored into the total cost over the life of the machine. So instead of it being $3,000 to purchase a PC or Mac, it could be upward of $6,000-$10,000. You can't just look at the initial cost. You have to look at the overall cost from an ACCOUNTING perspective. Second, "Better" is only relevant for a fixed period of time. I have found that most companies will buy "better" hardware for their teams - but then hang onto that hardware for 3-5 years or even more. Not cool, especially for software developers. – Joshua Jul 19 '11 at 15:33
  • 1
    my premise is that the other costs, while relevant, are the same. In other words, acquisition and installation costs don't increase because the devs get more RAM. Also, I've argued for the same schedule of purchases above. – Eric Wilson Jul 19 '11 at 15:37
  • 4
    What software do you run that costs 2-5x more to license if you put it on a faster desktop machine? @Farmboy is right, this is an anti-point. If a crappy computer costs $1000 to buy + $1500 in IT costs over three years, it's half the price of a great computer that costs $3000 up front + $1500 in IT costs. And in fact, the better computer probably costs less to support, because it breaks less often. – RoundTower Jul 19 '11 at 22:20
1

Because a lot of companies outside of typical tech start-ups are not interested in hiring rock stars. They're investing in someone who can just do the work. So if they don't care how you do your work, as long as you do it, why should they care what equipment you use? I've worked at places that still use 15-inch CRTs, and everyone does just fine. Sometimes when I read questions like this I wonder if people realize that not everybody in the world works for a cool start-up.

Sergei
  • 141
  • 5
  • 2
    I don't work for a cool start up, and I don't think that everyone else does. But I do think my employer should care whether I have equipment that works well, whether they want rock-stars, or just effective developers. Primarily, I expect that my company would want to avoid wasting money paying me to watch my machine freeze up again. No-one thinks wasting money is cool. – Eric Wilson Jul 20 '11 at 16:27
1

I've worked for companies that skimped on hardware in the past. It sucks, and if they need convincing the battle is likely to be a never-ending one.

Turns out that companies committed to using the best available tools are rare, but they do exist; I work for one. I've got a quad-core 17" 2011 MBP, 8GB RAM, Vertex 3 SSD, 2 x 24" external monitors, plus a quad-core desktop and a 4GB Xen slice; as well as quiet offices.

Could I get by with lesser hardware? Sure. But I think we'd all rather be bragging than bitching.

1

In my opinion, there are only two defensible objections a company could raise to keeping developers set up with solid workstations. The first is that they are undergoing a cash crisis. That better be short lived, or the company will not be a going concern for long. If you work for a company like that, you should keep your resume up to date.

The other is that their organization is simply not bottle-necked on software development capacity. That is, an increase in the quality or speed of software development output would not improve the bottom line. If the company's main business is selling software, that will be practically impossible. If software isn't their main business, and they aren't bottle-necked on it, they should be trying to reduce their software workforce by transferring or letting go of their weakest team members. Supplying poor equipment will reduce the size of their team from the opposite end, I'm afraid.

PeterAllenWebb
  • 933
  • 8
  • 12
0

I must have missed the author's perspective.

First, Google as one example was founded using cheap, "disposable" hard drives attached to older servers run as a farm. OK that might be hyperbole, yet see: http://en.wikipedia.org/wiki/Google_platform#Original_hardware

Second, it doesn't take much CPU or graphics resource to run gvim. So maybe your choice of development environment is the problem.

Third, there are dozens if not hundreds of ways to reduce CPU load and enhance productivity that have little to do with whether you have 2 GB or 3 GB of RAM. Watch an average programmer over their shoulder to see this. For example:

  • using a lightweight PDF reader rather than the Adobe suite for documentation
  • using a minimal installation of a VM for testing apps rather than a full install
  • removing all those startup daemons bundled with Windows Dell machines (using regedit)
  • using a lightweight browser for webmail instead of keeping Outlook running
  • not opening 50 gazillion tabs in Firefox chasing solutions to MSFT implementation issues on the web

So this point boils down to the following: prove you need more memory and MHz to solve this software design problem faster.

  • I get your point, but some choices are out of my control. We have to use Lotus Notes, and it makes Eclipse look like vim. And it's great to say 'use vim' but vim for Java isn't what I want to do. (I like it for Python and simple text editing, though.) – Eric Wilson Aug 12 '11 at 23:24
0

New machines, newer technologies mean newer problems. Not everyone at every company is a techwiz and not every company has the IT resources to train people and handle problems 24/7.

Yes, perhaps if you're a freelance programmer working on your own personal desktop, it would be worth blowing $1000 on a rig to squeeze out 10 minutes of extra productivity every day. However, when you're deploying hundreds of these machines to people who may lose productivity because of new equipment, the prospect seems a little more grim.

tskuzzy
  • 732
  • 1
  • 7
  • 12
  • Sure, I don't expect SSD's for everyone, but how about giving developers 8GB of RAM, or 4GB, instead of 2GB? And I'm not suggesting buying hardware more frequently, so deployment isn't an additional cost. – Eric Wilson Jul 18 '11 at 20:01
  • @Farmboy - someone has to do costing analysis which parts to buy at a good cost, test the specific RAM modules with the standard IT configuration to ensure supportability and minimize parts replacements, stock up an inventory, and do that for every current configuration, which for a typical large IT department can be anywhere between 3 and 8 at the same time. The alternative is each developer gets their own budget to spend as they wish on hardware; however, support becomes a nightmare. (Not to mention people that spent their money on a USB powered liquid cooler, instead of more RAM or CPU) – Franci Penov Jul 20 '11 at 00:24
0

Once I tried to argue for the company (a largish one) to buy us developers decent consumer-grade systems. The performance specs on them were essentially comparable to the enterprisey versions, but at half the price. My argument was that at these prices they were essentially throwaways, so if one broke you'd just buy a new one (on the assumption that better than 75% would last 24 months). I suggested that in exchange for getting one of these laptops, the developer would have to sign an agreement (or something) that he/she would be responsible for the SW load/configuration, and the help desk would not 'help' fix it.

It didn't fly, but I thought the basic premise of the argument was reasonable, considering we did Windows dev and all of us were local admins.

Ken Henderson
  • 755
  • 6
  • 10
0

Why not? Because it's not accountable. We can't precisely match each hour of work with a profit margin.

A simple solution for this would be refunding whoever pays for his own machine upgrades. If your accounting holds up, it should be easy to prove your own profit from the productivity improvement by comparing the past two periods (week, month, semester, year or whatever) on the exact same job / project.

If developers were able to quantify how much they generate over a period, the issue would disappear. Most developers can't. Nor can their managers, and even less the finance folks, because the job is very subjective.

But if you can somehow show those numbers (I know I can't), then you're all set for your cost-effective-non-self-booting-dream-machine already!

cregox
  • 677
  • 1
  • 7
  • 14
  • 2
    The only problem with this idea is: how do you prove a productivity improvement? How do you measure developer productivity in a way that won't be gamed? – EMP Jul 19 '11 at 12:05
  • Of course it's accountable. Everything ends up on a spreadsheet - the problem is a difference in opinion on the value that is placed on items on the accounting - not that it was overlooked. – bmike Jul 19 '11 at 15:04
  • You both said basically the same thing, which is actually in agreement with what I meant. I didn't give a solution on how to get the right numbers, and neither did you. @EMP yes, that is the whole issue I tried to address. @bmike (hope you get notified, though certainly not by this) if you can *count* but not place the proper value, that's what I meant by "not accountable". – cregox Jul 19 '11 at 21:23
  • 1
    @EMP: You don't have to prove a productivity improvement. You only need to prove downtime. An employer already pays you $XX per hour. Additionally, he also pays taxes and benefits, so the ACTUAL cost per hour is even higher. He knows how much he pays you. All you have to prove is how much of that time was wasted due to faulty equipment. The OP talks about a 20 minutes boot time (no matter how you put it, this is as faulty as it gets). At $150/hour, each reboot costs $50. Show your boss the "power" button and tell him "Every time I press this button, you pay $50". – Sylverdrag Jul 20 '11 at 11:33
  • 1
    @Sylverdrag, good point, however that's only proving the problem - not the solution. Proving "lack of downtime" is harder. – EMP Jul 20 '11 at 12:10
0

New programs run great - on the developer's computer. Buy a developer a 4 GHz 8 core box and the application he creates will run fine - on any 4 GHz 8 core computer. But on a typical customer's computer with 2 GHz and 1 core it runs like a dead snail.

Developers naturally keep adding features and code and levels of indirection until things slow down, on the development machines. If you're only developing for brand new hardware, then buy the latest. But it's a danger if you sell software to people with existing hardware.

A developer's computer should be about the same power level as the target customer's computer, with perhaps a bit extra for the debugger. But no faster.

Andy Canfield
  • 2,083
  • 12
  • 10
  • This is huge. The devs at my place have nice, modern PCs. Unfortunately, we have 40,000 end users scattered all over the place, some of whom have old P4's with 256MB of memory. The app that uses 200MB of memory doesn't run very well. – duffbeer703 Jul 19 '11 at 12:36
  • 3
    I'd say give the developer as powerful a machine as possible, and then simulate user scenarios in virtual machines. QA and VMs should give a clear picture of how the product will perform in the wild, and then you can let your devs work as fast as possible.. – mash Jul 19 '11 at 12:51
  • 2
    Also, add explicit performance requirements. If there's a valid reason for your software to run on Pentium II machines, the developers can deal with this requirement while still using more modern machines. Besides, feature creep was always an issue. You can add pointless features on any machine, regardless of state-of-the-art workstation performance. – Tadeusz A. Kadłubowski Jul 19 '11 at 13:04
  • This doesn't apply to those of us that don't write desktop apps, or apps with heavy client-side interaction. Which is still a lot of devs, and ironically, these are among the most likely to be on poor hardware. – Eric Wilson Jul 19 '11 at 18:36
  • 1
    @Andy Canfield - One would hope there are requirements. Just because the developers have nice machines does not mean they should design the application to run on their machines (that would just be plain foolish). – Ramhound Jul 19 '11 at 19:23
  • 6
    So, by the same token, car manufacturers should have to work outside, because sometimes their cars will be driven in the rain? ;) – Tom Morgan Jul 20 '11 at 08:19
  • Every sane company has a process of satisfying customer requirements. It's often a separate infrastructure group that does the builds and verifies them. Developers can work on any equipment as long as they can satisfy the requirements for builds. – Gene Bushuyev Jul 20 '11 at 22:47
  • Given the question was *"Why don't all companies buy developers the best hardware?"* I think this is a good answer. I can see the manager now, nodding and smiling as s/he pretends to look at your cost-benefit analysis, thinking there is no way in Hades that s/he is going to encourage you to create bloatware. *It is important to justify to them why that would **not** happen!* – Andrew Thompson Jul 28 '11 at 14:26
    Who downvoted this? This may not be the answer most developers want to hear, but it is a very valid point. I would actually argue for the developers to have a slower system - at least to test on - so that inefficiencies in their programs were highlighted before the software was inflicted on the users. I would also argue that faster builds do not equal more productivity. – James Anderson Oct 14 '11 at 03:48
0

2 GB on a developer machine is obviously shameful; however, solving this problem should not cost $3000 - more like $100 (conservatively). Why make the case to upgrade everything all at once? Smart IT departments continuously upgrade machines over their lifetime. Eventually you need an entirely new machine, but your machine is not at Windows 95-era specs; it could be upgraded for $300-$500 into a typical mid-range machine, and these upgrades could happen over several months so there is no cash-flow problem. You probably do not need a new graphics card, sound card, USB ports, DVD writer, etc., so why pay for them now? It's like buying a new car because your AC is broken.
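A back-of-the-envelope comparison of the two approaches; the prices are hypothetical circa-2011 figures (the RAM price echoes the earlier answer), not quotes:

```python
# Incremental-upgrade vs. full-replacement sketch (hypothetical prices).

replace_now = 3000  # buy a new mid/high-end machine outright (assumed)

# Targeted upgrades to the existing machine, staggered over months:
upgrades = {
    "ram_8gb_ddr3": 55,    # circa-2011 price cited in an answer above
    "larger_hdd": 100,     # assumed
    "monitor_21in": 200,   # circa-2011 price cited above
}

upgrade_total = sum(upgrades.values())
print(f"Full replacement:  ${replace_now:,}")
print(f"Targeted upgrades: ${upgrade_total} "
      f"({upgrade_total / replace_now:.0%} of a new machine)")
```

The staggering matters for the cash-flow argument: each line item is a small, easily approved purchase rather than one large capital request.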

Morgan Herlocker
  • 12,722
  • 8
  • 47
  • 78
0

I think the "right" tools are required for the right jobs. If you don't have the "right" tools (hardware, software, or otherwise), I believe it is due to a misunderstanding or miscommunication of expectations between an employee and their bosses. This is both the developer's and the company's responsibility. The higher the expectations, the closer the "requirement" should be looked at.

That being said, I know several developers who "need" 8 GB of RAM for their machine when I've made do with less in more trying scenarios. But again, I think it comes down to understanding requirements.

Steve
  • 1
0

At my current company, developers are pretty high on the totem pole for hardware. I imagine that hardware is put on a normal company budget just like anything else, and the need outweighs the want.

In my opinion, a developer should be responsible for their own hardware, but that depends entirely on the situation. If you are asked to write a simple app for a simple website, then you might not need a sophisticated piece of equipment to sit in a text editor. On the other hand, if you are into contract programming and want to do some side gigs, you may want to consider buying your own hardware and base software, and having the company purchase individual API licenses as needed by that specific company.

Either way, it is all a matter of checks and balances, and if you are concerned with productivity, then your dollars are probably best spent monitoring how much code a developer is putting out for their time. If it takes them 10 hours to do one project and 5 hours to do a similar project, it may be an employee-related issue and not so much a hardware issue.

0

Companies make decisions quite differently from developers. Most have mechanisms in place for providing appropriate hardware for the task: approved purchase channels, and groups responsible for installation, testing, and compliance with security and other measures. So changing hardware specifications can be complicated.

On the other hand, let's say you came to the CEO with a suggestion to spend the equivalent of 1% of salary on upgrading equipment. He will ask the CFO to come up with the hit they would take on margins and income; let's say it's 5%. Now, missing the estimates may have an amplified effect on the company's stock price, say 10%, and upper management loses their million-dollar bonuses. Unless there are good reasons to expect the upgrade to improve the company's bottom line, this suggestion would be DOA. Companies seek to increase expenditures only when doing so improves income. That means that in most cases both low-end and high-end equipment are sub-optimal.

One solution that would satisfy both developers and company management would be to let developers rent their equipment; a typical system would run $20-200/month if rented for 2 years. A company could maintain a range of approved hardware and offer developers either a standard configuration or an upgraded one, deducting the additional rent from the paycheck.

Gene Bushuyev
  • 201
  • 1
  • 4
-3

The question presupposes that good hardware makes a significant difference. I recently switched to a MacBook Air: reduced CPU performance, less fan, less headache. I think a far, far greater factor is the human factor. What coding language? Are you using a dynamic programming language? What is the culture? Are you running a build every two seconds? Long (and unnecessary) test-suite runs? Far better to get the environment sorted out; rarely is a high-spec development machine needed. The good software developers, the masters of the craft - what are they? They are writers. They write code for other coders to read. Actual function, speed, and machine-deployment issues really are secondary. So is an obsession with correctness. I say relax more, and move to a right-on, open-sourced language and toolset. That is where the company $ should be directed.

Dantalion
  • 101
  • Well, all of the things you mentioned are more important than hardware, I agree. And I do suppose that bad hardware makes a difference, though less than bad development practices. But choosing to buying good hardware is easily accomplished, while all of the other changes mentioned require a significant investment. – Eric Wilson Jul 19 '11 at 12:51
  • 4
    Next time you develop an operating system using "dynamic languages" and "running builds every 2 seconds", you let me know. Many problem domains are not amenable to that kind of workflow, and I'd put most commercial software in that bucket. – Billy ONeal Jul 19 '11 at 14:42
    You would put most kinds of commercial software into the operating-system bucket? A bit far-fetched. Most development is corporate development and is web-based. As for a software house that makes and maintains operating systems, I doubt very much the questioner would be encountering a lack of hardware resources. – Dantalion Jul 19 '11 at 22:41
  • I never worked on a project where machine performance and memory have been irrelevant. If for no other reasons than local build, debug, test times all depend on machine performance. – Gene Bushuyev Jul 20 '11 at 22:50