30

I've encountered many people who are dogmatically against anything that can be considered "optimization" in the general English sense of the word, and they very often recite the partial quote "premature optimization is the root of all evil" as justification for their stance, implying that they interpret whatever I'm talking about as "premature optimization". However, these views are sometimes so entrenched that they dismiss pretty much any algorithmic or data-structure deviation from the purest "naive" implementation... or at least any deviation from the way they've done things before. How can one approach people like this in a way that makes them "open their ears" again after they shut down upon hearing "performance" or "optimization"? How do I discuss a design or implementation topic that has an impact on performance without having people instantly think: "This guy wants to spend two weeks on ten lines of code?"

Now, the question of whether "all optimization is premature and therefore evil" has already been covered here, as well as in other corners of the Web, and how to recognize when optimization is premature and therefore evil has already been discussed. Unfortunately, there are still people in the real world who are not quite as open to challenges to their faith in Anti-Optimization.

Previous attempts

A few times, I've tried supplying the complete quote from Donald Knuth in order to explain that "premature optimization is bad" ↛ "all optimization is bad":

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.

However, when I supply the entire quote, these people sometimes become even more convinced that what I'm doing is Premature Optimization™, dig in, and refuse to listen. It's almost as if the word "optimization" scares them: on a couple of occasions, I was able to propose actual performance-improving code changes without them being vetoed, simply by avoiding the word "optimiz(e|ation)" (and "performance" as well; that word is scary too) and instead using some expression like "alternative architecture" or "improved implementation". For this reason, it really seems that this truly is dogmatism, and not them critically evaluating what I say and then dismissing it as unnecessary and/or too costly.

errantlinguist
  • Well, last time you had such a discussion, did you really *measure* that the performance would be bad with the purest, naive implementation? Or at least make a rough estimate of the expected running time? If not, those other people could have been fully correct in their opinion; you have no way to know. – Doc Brown Apr 12 '16 at 16:26
  • I think that if a program is slow as molasses, "optimization" shouldn't be a taboo topic while grooming stories, without needing to necessarily do formal performance testing. – errantlinguist Apr 12 '16 at 17:40
  • @errantlinguist: If the program really is "slow as molasses", then clearly you ought to be able to easily detect Knuth's "critical 3%" and therefore trump any arguments against optimizing it. And if you can't detect that... then you haven't done your homework yet and you aren't ready to optimize yet. So it's not clear where the problem is. – Nicol Bolas Apr 12 '16 at 17:50
  • @NicolBolas: I once found a place in the code which was extremely slow and proposed an alternative which was much faster, and the proposal was turned down... – errantlinguist Apr 12 '16 at 17:53
  • @errantlinguist: If you presented evidence of that section of code being a significant performance problem for the application, and the application as a whole was slower than it needed to be, and they still denied the need to modify the code, then it doesn't matter. You're dealing with people who are impervious to evidence-based reasoning, and thus are unreasonable. – Nicol Bolas Apr 12 '16 at 17:55
  • @errantlinguist: The key question: Were *customers* complaining that the application in that area was slow? – Gort the Robot Apr 12 '16 at 18:01
  • @StevenBurnap: Unfortunately, I'll never know if customers complained or not, because I neither had direct contact with the customers nor did I have access to any sort of comprehensive "complaint list". Still, do website users actually complain about this sort of stuff? Generally, if they are annoyed by a site, do they not just *not visit it*? How does a company know who *doesn't* use their site? – errantlinguist Apr 12 '16 at 18:04
  • You should not fight "formal performance testing", and instead make it an integral part of the software engineering process. Make performance and capacity considerations one of the required talking points of any new software project, and if there is a risk, require complexity to be specified for the risky components during the design review. – jxh Apr 12 '16 at 19:42
  • @errantlinguist If you are talking about websites, keep in mind that the performance you see at your desk while developing will have little or no relation to what customers are seeing in the field. If your metric of success is customer visits, then the proper way to address it is to measure customer visits, do things like A/B testing, etc. – Gort the Robot Apr 12 '16 at 19:44
  • Can you give any more specific examples of the kinds of optimizations you have in mind? E.g., to my thinking, using a set collection instead of a list when the primary purpose is testing whether objects are in the collection isn't an optimization at all; it's just programming common sense. – Apr 12 '16 at 19:45
  • @JonofAllTrades: The (worst of the) sort of people whom I'm talking about consider *everything* that is somehow not what was first proposed to be "premature optimization", hence my liberal usage of scare quotes... and if that list is already there in the code and you want to replace it with a set, well, is that not optimization after all, then? – errantlinguist Apr 12 '16 at 19:49
  • @errantlinguist Since you keep mentioning that this is more about the discussion and not whether you should, maybe this is more appropriate on Workplace instead of Programmers? – Captain Man Apr 12 '16 at 20:04
  • Yeah, I just started browsing the Workplace SE and I'm now not sure which place would be better for it. Still, is it now impossible to somehow migrate it? – errantlinguist Apr 12 '16 at 20:05
  • I'm also going to chime in and say that if you're looking for answers to "how to work with people who stonewall a discussion the minute it has to do with performance", then this isn't the place for such a discussion. – Eric King Apr 12 '16 at 20:16
  • @errantlinguist A moderator should be able to, no sweat. Flag your question as needing moderator intervention and ask to have it migrated there. If you want to talk about whether you should optimize, and how to go about it and record metrics, this site is better; if you want to talk about how to discuss things tactfully, I'd imagine Workplace is better, because it's pretty generic to any job needing to improve the quality of something versus believing the product is good enough as is. – Captain Man Apr 12 '16 at 20:29
  • Okay, flagged. Still, I have a weird feeling in my stomach that this topic is a bit too nerdy for people over there and may scare them off ;) – errantlinguist Apr 12 '16 at 20:30
  • I'm voting to close because OP is clearly only looking for someone to validate an opinion, rather than an answer to an actual question. I don't think this would (or should) stay open on Workplace.SE either. – BlueRaja - Danny Pflughoeft Apr 12 '16 at 20:30
  • @BlueRaja-DannyPflughoeft: What opinion am I looking to validate? I intended this to be a question on how to lead a productive discussion, which is not happening with my current methods (the discussion is simply shut down before I can even explain what I mean). – errantlinguist Apr 12 '16 at 20:45
  • The problem is that "convince my coworkers" is not within Programmers' site scope. – Robert Harvey Apr 12 '16 at 20:50
  • @RobertHarvey You mean this is the case discussed in [How do I explain ${something} to ${someone}?](http://meta.programmers.stackexchange.com/a/6630/31260) – gnat Apr 12 '16 at 21:19
  • @BlueRaja-DannyPflughoeft: Well, it seems my answer does not validate his opinion. And though the question is not ideally formulated and contains some parts which might be interpreted as a rant against his colleagues, it is nevertheless answerable. The fact that a question like this one is closed by "the site police", however, is exactly why some people here complain about the current closing policy on Programmers.SE (see Meta). – Doc Brown Apr 13 '16 at 05:39
  • I'm not sending this to the Workplace. First, it's indeed too "nerdy" for them. Second, I prefer to keep the answers here, even with the question closed. Some of them are excellent (and not really _primarily_ opinion based), and should stay at a place where programmers can easily find them. – yannis Apr 13 '16 at 06:59
  • @Yannis: Although I do respect the decision to keep it here, why does a question which is inherently complicated (as social issues always are, because humans aren't machines) have to be "closed"? Despite the question attracting a bit of noise (which I tried to address), is the fact that [I found a useful, specific solution to my problem only four hours after posting it](https://programmers.stackexchange.com/questions/315520/how-to-deal-with-misconceptions-about-premature-optimization-is-the-root-of-all#comment-667584) not "proof in the pudding" that it's not "opinion-based"? – errantlinguist Apr 13 '16 at 08:00
  • We've found that the general type of question yours fits in doesn't really work on our site (or the Workplace). This one did (sort of), but I'm afraid it's an exception, not the rule. The vast majority of social-issues questions we've seen aren't as interesting as this one, and generated nothing but noise. I've personally deleted more than a couple thousand of them. And even the interesting ones don't always generate much of value; you're lucky you posted this at a time when some of our more experienced (and disciplined) users were active. – yannis Apr 13 '16 at 08:18
  • To answer your question: do not argue with people. Arguments are not won. When I run into people like that, I just express my viewpoint and leave it at that. When we need performance, I knuckle down and fix it, and let the results do the talking. If somebody learns from me, great, but I don't expect it. BTW, talking about opinions: Knuth was just expressing his opinion. – Mike Dunlavey Apr 15 '16 at 00:23
  • @Yannis: While I do realize that this was a "borderline" question due to its inherent fuzziness, why is "close it" always the immediate response to something which doesn't quite have a single verifiable answer? In fact, why is e.g. [another question of mine](http://programmers.stackexchange.com/q/315554/105784) about to be closed even when I explicitly ask for a very specific type of information and give an analogy from another, related data source? Note that e.g. [on the Bicycles SE, these types of open-ended questions are welcomed](http://bicycles.stackexchange.com/q/36948/8685). – errantlinguist Apr 15 '16 at 09:32
  • @errantlinguist There isn't enough space in comments for a proper answer; how about we move it to [Meta](http://meta.programmers.stackexchange.com/)? And let everyone join in the discussion? – yannis Apr 15 '16 at 09:47
  • @Yannis: Sure, I'd be up for some meta-SE. – errantlinguist Apr 15 '16 at 21:07

9 Answers

36

It seems you are looking for shortcuts that let you skip trying the "purest naive implementation" first and directly implement a "more sophisticated solution", because you know beforehand that the naive implementation will not do. Unfortunately, this will seldom work: when you do not have hard facts or technical arguments to prove that the naive implementation is, or will be, too slow, you are most probably wrong, and what you are doing is premature optimization. And trying to argue with Knuth is the opposite of having a hard fact.

In my experience, you will either have to bite the bullet and try the "naive implementation" first (you will probably be astonished how often it is fast enough), or you will at least have to make a rough estimate of the running time, like:

"The naive implementation will be O(n³), and n is bigger than 100,000; that will run for days, while the not-so-naive implementation will run in O(n), which will take only a few minutes."

Only with such arguments at hand can you be sure your optimization is not premature.

There is, IMHO, only one exception to this: when the faster solution is also the simpler and cleaner one, then you should use the faster solution right from the start. A standard example is using a dictionary instead of a list to avoid unnecessary loop code for lookups, or using a good SQL query which gives you exactly the one result record you need, instead of a big result set which has to be filtered afterwards in code. If you have such a case, do not argue about performance: the performance might be an additional, but most probably irrelevant, benefit, and if you mention it, people might be tempted to use Knuth against you. Argue about readability, shorter code, cleaner code, maintainability. There is no need to "mask" anything here, because those (and only those) are the correct arguments here.
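A minimal Java sketch of that exception (the price-lookup task and names are made up for illustration): the map-based version is both faster and shorter than the hand-written scan, so the argument for it is simplicity, not performance.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LookupExample {

    // Naive: scan parallel lists of ids and prices on every lookup, O(n),
    // with index bookkeeping that is easy to get wrong.
    static int priceOfNaive(List<String> ids, List<Integer> prices, String id) {
        for (int i = 0; i < ids.size(); i++) {
            if (ids.get(i).equals(id)) {
                return prices.get(i);
            }
        }
        throw new IllegalArgumentException("unknown id: " + id);
    }

    // Simpler and faster: build one map, then O(1) lookups with no loop code.
    static Map<String, Integer> priceIndex(List<String> ids, List<Integer> prices) {
        Map<String, Integer> index = new HashMap<>();
        for (int i = 0; i < ids.size(); i++) {
            index.put(ids.get(i), prices.get(i));
        }
        return index;
    }

    public static void main(String[] args) {
        List<String> ids = List.of("apple", "pear");
        List<Integer> prices = List.of(3, 5);

        System.out.println(priceOfNaive(ids, prices, "pear"));    // 5
        System.out.println(priceIndex(ids, prices).get("apple")); // 3
    }
}
```

Every call site of the map version is a one-liner, so the readability argument carries itself; the performance gain comes along for free.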

In my experience, the latter case is rare. More typically, one can first implement a simple, naive solution which is easier to understand and less error-prone than a more complicated, but probably faster, one.

And of course, you should know the requirements and the use case well enough to know what performance is acceptable, and when things become "too slow" in the eyes of your users. In an ideal world, you would get a formal performance spec from your customer, but in real-world projects, required performance is often a grey area, something your users will only tell you when they notice the program behaving "too slow" in production. Often that is the only workable way of finding out that something is too slow: user feedback. And then you do not need to cite Knuth to convince your teammates that their "naive implementation" was not sufficient.

Doc Brown
  • While good information, this doesn't actually answer my question on how to work with people who stonewall a discussion the minute it has to do with performance. – errantlinguist Apr 12 '16 at 17:44
  • @errantlinguist: Maybe I did not make myself clear, or maybe it is simply not what you wanted to hear? My advice is: do not try using *philosophical arguments* of Knuth about "3%" or "97%". Keep it factual; otherwise your colleagues are most probably right that your performance arguments are inappropriate. – Doc Brown Apr 12 '16 at 18:34
  • Okay, that *does* make it clearer. However, I fear that I wouldn't even get a chance to "prove" that an alternative to a naive implementation is better because, as I stated, some people seem to kill any discussion about "performance" or "optimization" before they even hear it. – errantlinguist Apr 12 '16 at 18:57
  • @errantlinguist: In the case you described in your comment on Karl Bielefeldt's answer, you seem to have all the arguments on your side without any need for "performance". I would go a step further and say: if you argue about performance in such a case, you make a tremendous mistake, because your colleagues are right: performance typically does not matter there. Argue about simplicity, readability, maintainability, fewer lines of code, but not(!) about performance, not even as a side note. Don't offer them the possibility of using Knuth against you. – Doc Brown Apr 12 '16 at 19:02
  • So, combined with my own experiences regarding avoiding the word "optimization", it seems that basically the way to deal with these people is simply to never mention it and to bury it inside other topics like "more maintainable architecture", "smaller codebase", etc.? – errantlinguist Apr 12 '16 at 19:25
  • @errantlinguist: Not *bury* it: put those aspects into focus when it is correct that those aspects should be in focus, and do not use performance as an argument when you cannot **prove** that it makes an important difference for the end user. – Doc Brown Apr 12 '16 at 20:07
  • @errantlinguist: I'm not sure how you reach that conclusion. Doc Brown's answer seems perfectly clear: you cut through these unproductive arguments that you are having with your colleagues by sticking to factual statements about what is and isn't acceptable performance. – jl6 Apr 12 '16 at 20:22
  • @DocBrown: Thanks for the advice. If you make that into an answer, I'll try to accept it before this question gets closed. – errantlinguist Apr 12 '16 at 20:48
  • @errantlinguist: If I make what into an answer? Do you mean I should add something from my comments into my answer which is missing there? Oh, and I bet chances are not bad your question will be reopened again. – Doc Brown Apr 12 '16 at 20:52
  • Oops, yeah. I meant that the explanation of what to focus on in a discussion really helped me. I'll accept this answer anyway, since the comments explain it. – errantlinguist Apr 12 '16 at 20:54
  • This is good advice for programming-in-the-small, but ignoring performance questions at the level of architectural design can lead a team down a long dead end, because it might get a lot done before it is forced to confront the problem, and there is no guarantee that much of that work is reusable when the problem is architectural (I have seen it kill a product). I know you have an exception in your answer, but to know whether it applies you still have to ask the question, and even asking the question is apparently anathema to errantlinguist's co-workers. – sdenham Apr 13 '16 at 12:42
  • @sdenham: At the level of architectural design, you need to make estimations in order to make the right performance decisions, exactly as I wrote. If you are unsure, it is a good idea to test an architecture by implementing a prototype or spike before picking an unnecessarily complicated solution. Moreover, the question is IMHO not aiming at that level. – Doc Brown Apr 13 '16 at 13:36
  • ... Moreover, on an architectural level, things are not black-and-white; you can often find compromises. For example, let's assume we picked a programming language like Python, and now find out it is too slow for a certain task, but using a C compiler would help. Sure, you cannot easily port the Python program to C afterwards when you have already written 100K LOC, but often it is possible to implement just the one task which bothers you in C. Even in that situation, you start with the Python implementation **first**, try & measure, and then optimize afterwards, not the other way round. – Doc Brown Apr 13 '16 at 13:56
  • I second @sdenham: There is no premature optimization while designing the fundamental architecture. If you analyse the problem and ignore all performance considerations, your design *will* be slow, and it will be impossible to fix it without rewriting the majority of the code. You really only have one shot to get your design right, and if you don't, you waste a lot of implementation effort. Later on in the process, when you are implementing the design, premature optimization exists and needs to be avoided, but the gains/losses of a slow/fast design are so huge that they must not be ignored. – cmaster - reinstate monica Apr 25 '16 at 09:14
  • @cmaster: IMHO, discussing this in such broad generality does not lead us anywhere. There is no "one and only correct way" to find a good architecture. And I did not write that you should ignore performance considerations when picking or building an architecture; quite the opposite, read my comments again. And as a side note: the OP had a rather smaller scenario in mind, as you can see if you read his comments on the other answers. – Doc Brown Apr 26 '16 at 06:20
18

Ask yourself this:

  • Is the software NOT meeting performance specification?
  • Does the software HAVE a performance issue?

These are reasons to optimize. So, if people are opposed, show them the specification and explain that you need to optimize because you are not meeting it. Beyond that, one will have a hard time convincing others that optimization is necessary.

I think the main point of the quote is: if you don't have a problem, don't perform needless optimization, as the time and energy could be spent elsewhere. From a business perspective, this makes perfect sense.

Second, for those who fear optimization, always back up performance findings with metrics. How much faster is the code? How much did performance improve over the previous version? If someone spent two weeks only to improve the code by 2% over the previous version, and I were their boss, I would not be happy. Those two weeks could have been spent implementing a new feature that could attract more customers and make more money.

Finally, most software does not have to be highly optimized. Only in a few specialized industries is speed really important. So, most of the time one can use pre-existing libraries and frameworks to good effect.

Jon Raynor
  • While good information, this doesn't actually answer my question on how to work with people who stonewall a discussion the minute it has to do with performance. – errantlinguist Apr 12 '16 at 17:41
  • I agree with all of this except "Only in a few specialized industries is speed really important." I think you underestimate the amount of software that has performance issues from the customer's perspective. – Gort the Robot Apr 12 '16 at 17:48
  • @StevenBurnap: Yep. Are there web applications in the wild which actually *aren't* slow? I'd like to see one, in the name of science. – errantlinguist Apr 12 '16 at 17:57
  • google.com is pretty fast. :-P – Gort the Robot Apr 12 '16 at 17:58
  • Try using google.com on an EDGE mobile connection. Yes, that's a ridiculous edge case, but it will definitely not be *pretty fast*. ;) (Pun actually not intended, really.) – errantlinguist Apr 12 '16 at 18:07
  • @Steven - Sorry, I didn't mean to dismiss performance problems of any software application. I guess I should have used an example: the speed of adding items to a shopping cart versus a real-time missile-tracking system. If it takes an extra second to add an item to the cart, it's not a big deal, but a second with a missile traveling at near the speed of sound would have much bigger implications. So, speed of processing is vital in that scenario. – Jon Raynor Apr 12 '16 at 18:12
  • @errantlinguist In that case, it's not google.com's problem. It's a connectivity problem: *nothing* works on an EDGE connection. I consider the "E" symbol the same as "no connection" ;) More seriously: it wouldn't be an optimization problem, since nothing they could do at Google to improve their website would change the fact that EDGE doesn't work for anything. – Andres F. Apr 12 '16 at 20:48
  • @AndresF. There are *plenty* of things they could do to improve their website for a weak connection, like further minification, moving scripts and styles to other URLs to be loaded asynchronously, reducing the number of search results per page, etc. Those might not be *good* ideas, but they would certainly improve performance. – Andrew Apr 13 '16 at 00:19
  • Even in those "specialized industries", real-time embedded software for example, the naive approach should typically be implemented first so that you have something to benchmark against, and the tests (you *do* have those, *right*?) will protect you when you implement the more advanced and faster version. – RubberDuck Apr 13 '16 at 01:19
  • @AndrewPiliser Not for EDGE, no. Absolutely nothing works on EDGE, because it effectively means "no connection" :) No amount of optimization can compensate for that fact. – Andres F. Apr 13 '16 at 02:23
  • @RubberDuck In the real-time case, a naive implementation of the functional requirements may not even work, on account of the timing constraints. Also, while a simple implementation is often useful in testing, it does not set the performance benchmarks; those are determined by the problem domain (and should be specified in the requirements). – sdenham Apr 13 '16 at 13:06
9

How to work with people who stonewall a discussion the minute it has to do with performance?

Begin with shared principles that build on the strategic direction of your group.

My Principles:

My personal principle when writing code is to first aim for correctness, then to profile the program and determine whether it needs optimization. I profile my code myself because other programmers are potential consumers of my code, and they will not use it if it is slow; thus, for my code, speed is a feature.

If your consumers are customers, your customers will tell you if you need faster code.

However, there are known, demonstrably better choices one can make in code. I would rather get it right in my first draft, for several reasons:

  1. If I get it right the first time, then I can forget about the implementation (taking advantage of information hiding), and I don't clutter up my code with TODOs.
  2. Others (particularly those who only learn on the job) see it done the right way, and they learn from and use the same style of code going forward. Conversely, if they see me do it the wrong way, they'll do it the wrong way too.

Assuming the need for optimization is correct

Assuming this is a truly important part of your code that needs optimization, you could tell the parable of Schlemiel the Painter, or emphasize the remainder of the quote:

"Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%." - Donald Knuth

Weigh the costs of additional complexity

Sometimes there's a real maintainability cost to the added complexity. In this case, you might keep the secondary implementation in a different function or subclass, and apply the same unit tests to it so that there is no question that it is correct. Later, if you profile your code and find the naïve implementation to be a bottleneck, you can switch in your optimized code and demonstrably improve your program.
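A hedged Java sketch of that pattern (the `Summer` interface and both implementations are hypothetical names): the two variants sit behind one seam, and the same checks run against each, so the optimized one can be swapped in later without any question about correctness.

```java
import java.util.List;

// One seam for both variants: callers depend only on this interface.
interface Summer {
    long sum(List<Integer> values);
}

// Naive implementation: ships first, obviously correct.
class LoopSummer implements Summer {
    public long sum(List<Integer> values) {
        long total = 0;
        for (int v : values) total += v;
        return total;
    }
}

// "Optimized" variant kept alongside; only swapped in once profiling
// shows the naive one is actually a bottleneck.
class StreamSummer implements Summer {
    public long sum(List<Integer> values) {
        return values.parallelStream().mapToLong(Integer::longValue).sum();
    }
}

public class SwappableImpl {
    // The same unit checks run against every implementation.
    static void check(Summer s) {
        if (s.sum(List.of(1, 2, 3)) != 6L) throw new AssertionError("sum");
        if (s.sum(List.of()) != 0L) throw new AssertionError("empty");
    }

    public static void main(String[] args) {
        check(new LoopSummer());
        check(new StreamSummer());
        System.out.println("both implementations pass");
    }
}
```

Because callers only see `Summer`, switching implementations later is a one-line change rather than a rewrite.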

Leadership

Sometimes the problem is ego - some people would rather use suboptimal or buggy code than have someone else be more right than they are. You probably want to avoid working with these people.

Leadership, especially when you do not have positional authority over people, is about making reasonable suggestions and guiding others to a consensus. If you can't guide your team to a meeting of the minds, perhaps the matter is not worth pressing. There's probably bigger fish to fry.

Robert Harvey
Aaron Hall
6

The way forward is to forget about the actual quote and its various interpretations; it is dogmatism either way to focus so much on a specific quote from a guru. Who says Knuth is always right, anyway?

Instead, focus on the project at hand: the piece of software you are developing along with the colleagues you disagree with. What are the requirements for acceptable performance for this piece of software? Is it slower than that? Then optimize.

You don't have to call it "optimization", you can call it "fixing a bug", since it is by definition a bug if the implementation fails to conform to the requirements.

More generally, there are two possibilities regarding optimizations:

  1. The optimized code is also shorter, simpler to understand and easier to maintain.

  2. The optimized code is more complex to understand, takes a longer time to write and test, or would be more complex to change in the future if requirements change in unexpected ways.

In case (1), you don't even have to argue about optimization. But if (2) is the case, then you are making a trade-off decision. This is actually a business-level decision, not a purely technical one. You have to weigh the cost of the optimization against the benefit. For there to even be a benefit, the inefficiency has to be a problem in the first place, either as a bad user experience or as a significantly increased cost of hardware or other resources.

JacquesB
  • Well, I fully agree with your initial sentence. However, I am pretty sure a piece of software can be "annoyingly slow for the intended use case" without having the performance requirements specified explicitly in a formal way. – Doc Brown Apr 12 '16 at 16:35
  • @DocBrown: Of course, but in any case it is the customer who decides if it is too slow or not, not the developer. – JacquesB Apr 12 '16 at 16:51
  • I have never come across business requirements which explicitly state performance requirements. – errantlinguist Apr 12 '16 at 17:43
  • @errantlinguist: In my experience it is pretty common in customer-focused businesses like online shops. But for tools and applications for internal use in a company, user experience is usually not a concern for the project owner. – JacquesB Apr 25 '16 at 10:07
4

I think the full quote in context is instructive. I'll copy from a post I made on Reddit on the topic:

There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified.

-- Donald Knuth, Structured Programming with go to Statements, ACM Computing Surveys, Vol 6, No. 4, Dec. 1974, p.268

The point, and implication, is that there are more important things to worry about than turning your attention to optimization too early. Certainly, you should carefully consider your data structures and algorithms (this is in the 3%), but you shouldn't worry about whether subtraction is faster than modulo (that being in the 97%) until it becomes clear that low-level optimization is necessary.

The former is not necessarily optimization in the sense that your colleagues are thinking, but it is optimization in the sense that poorly-chosen algorithms and data structures are suboptimal.
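A small Java illustration of that split (the duplicate check and the ring-buffer index helpers are made-up examples): choosing the right data structure is a 3%-style decision worth making up front, while the modulo-versus-subtract detail is exactly the kind of "small efficiency" to ignore until profiling says otherwise.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class ThreePercent {

    // The 3%: picking a structure with the right complexity.
    // O(n) with a set, versus an O(n^2) nested-loop scan over a list.
    static boolean hasDuplicate(List<String> items) {
        Set<String> seen = new HashSet<>();
        for (String s : items) {
            if (!seen.add(s)) return true; // add() is false if already present
        }
        return false;
    }

    // The 97%: whether a wrap-around index uses % or a compare-and-subtract
    // is a micro-detail; both are correct, and the difference only matters
    // if profiling ever shows this line to be hot.
    static int nextIndexModulo(int i, int n)   { return (i + 1) % n; }
    static int nextIndexSubtract(int i, int n) { int j = i + 1; return j == n ? 0 : j; }

    public static void main(String[] args) {
        System.out.println(hasDuplicate(List.of("a", "b", "a"))); // true
        System.out.println(nextIndexModulo(3, 4) == nextIndexSubtract(3, 4)); // true
    }
}
```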

greyfade
  • One might add that Knuth clearly does not think that analyzing the time complexity of algorithms, and making design choices on that basis, is premature optimization. – sdenham Apr 13 '16 at 12:50
4

In my experience, if you get this kind of opposition to optimization regularly, people are not really complaining about optimization. They are complaining about what you are sacrificing in the name of optimization. This is usually readability, maintainability, or timeliness. If your code is delivered in the same amount of time, and just as easy to understand, people couldn't care less if you're using more efficient data structures and algorithms. My suggestion in this case is to work on making your code more elegant and maintainable.

If you're getting this kind of opposition with regard to other people's code, it's usually because you're suggesting a significant amount of rework. In those cases you really do need actual measurements to show the effort is worth it, or you should try to get involved earlier in the design phase, before the code is written. In other words, you need to prove it's in the 3%. If we rewrote all the code that wasn't exactly how we liked it, we'd never get anything accomplished.

Karl Bielefeldt
  • Unfortunately, I've actually done the opposite case, where I use e.g. a `Deque` from the Java standard library in order to replace a huge amount of logic built around an `ArrayList` used as a stack while working on the code... and this was marked for change in review. In other words, the reviewer wanted to have more code which is also slower and more prone to bugs because he wasn't familiar with `Deque`. – errantlinguist Apr 12 '16 at 17:49
    Being unwilling to learn something that's been in your language for 10 years is a toxic attitude for a programmer, and a much deeper problem than you originally described. Personally, in that situation I would refuse the advice, and escalate it to management if need be. – Karl Bielefeldt Apr 12 '16 at 18:10
    @errantlinguist: when your reviewer suggested a clearly worse (that means more complicated) solution as a replacement for a clean and simple one, you should argue about that. Do not argue about performance! Seriously, never even use the word "performance" in that discussion. Argue only about readability and maintainability. And if the reviewer insists on his bad code, escalate. – Doc Brown Apr 12 '16 at 18:52
  • +1 Not sure why this answer has downvotes instead of upvotes plus being the accepted answer. It suggests both a way to handle the problem, plus an analysis of what the real underlying problem might be (i.e. that nobody wants to be told their code must be radically re-written). – Andres F. Apr 12 '16 at 20:52
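As a sketch of the kind of substitution errantlinguist describes above (illustrative code, not the actual reviewed code): the `Deque` version replaces hand-rolled index bookkeeping with library calls whose names state the intent.

```java
import java.util.Deque;
import java.util.List;

class StackSketch {
    // Before: an ArrayList used as a stack, with the index bookkeeping
    // written out by hand at every call site.
    static String popTop(List<String> stack) {
        if (stack.isEmpty()) {
            return null;
        }
        return stack.remove(stack.size() - 1);
    }

    // After: an ArrayDeque already is a stack. push()/pollFirst() carry
    // the intent in their names, and the empty case is handled for us.
    static String popTop(Deque<String> stack) {
        return stack.pollFirst(); // null when empty, like the code above
    }
}
```

A reviewer unfamiliar with `Deque` sees "new API"; the code actually has less to get wrong.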

There is indeed a lot of misunderstanding about this quote, so it's best to step back and look at what the actual issue is. The issue isn't so much that you should never "optimize". It's that "optimize" is never a task you should set out to do. You should never wake up in the morning and say to yourself, "Hey, I should optimize the code today!"

This leads to wasted effort. Just looking at code and saying "I can make it faster!" leads to lots of effort spent making something faster that was fast enough in the first place. You might take pride in telling yourself that you made a bit of code four times faster, but if that code was a calculation triggered by a button press that already took only 10 ms before displaying its result to a human user, no one's going to give a damn.

That is the "premature" in "premature optimization". When is it not "premature"? When customers tell you "this is too damn slow, fix it!" That's when you dig in the code and try to make it faster.

This doesn't mean that you should turn off your brain. It doesn't mean that you should keep 10,000 customer records in a singly linked list. You should always keep the performance impacts of what you do in mind and act accordingly. But the idea is that you are not spending extra time deliberately trying to make things faster; you are simply choosing the more performant option out of otherwise equal choices.
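A minimal illustration of picking the more performant of otherwise equal choices (the `Customer` record is hypothetical, and uses Java 16+ record syntax for brevity): both lookups are one obvious call for the caller, but one scans a list while the other hits a map.

```java
import java.util.LinkedList;
import java.util.Map;

class CustomerLookup {
    record Customer(int id, String name) {}

    // A linked list means a linear scan: O(n) per lookup. Unnoticeable
    // for ten records, a real cost for 10,000 on every request.
    static Customer findLinear(LinkedList<Customer> customers, int id) {
        for (Customer c : customers) {
            if (c.id() == id) {
                return c;
            }
        }
        return null;
    }

    // The same one-line call for the caller, but O(1) per lookup.
    static Customer findIndexed(Map<Integer, Customer> byId, int id) {
        return byId.get(id);
    }
}
```

Neither version takes longer to write or read; choosing the second is not "optimization", it's just not choosing the slow one.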

Gort the Robot
  • *This doesn't mean that you should turn off your brain. It doesn't mean that you should keep 10,000 customer records in a singly linked list.* While I agree with you 100% on that, I have actually seen linked lists used in that way, and was told "not to touch it". – errantlinguist Apr 12 '16 at 17:38
  • While good information, this doesn't actually answer my question on how to work with people who stonewall a discussion the minute it has to do with performance. – errantlinguist Apr 12 '16 at 17:43
    Sadly, the "singly linked list" thing was not a random example but something I ran into personally. – Gort the Robot Apr 12 '16 at 18:00

This seems like a communication problem and not a programming problem. Try to understand why people feel the way they do and try to crystallize why you think your way would be better. When you've done that, don't start a confrontational argument where your goal is to tell others why they're wrong and you're right. Explain your thoughts and feelings and just let people react to that. If you can't reach any sort of consensus and you feel like this is a really critical issue, then you probably have some serious issues in the team overall.

More focused on actual programming: don't waste time on long arguments over something you merely have a gut feeling is "faster". But if you see someone writing a method that is called once per request in a web app and has O(n^2) time complexity when you KNOW it's really an O(log n) problem, then sure, if it's such a no-brainer, go ahead and say so.
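As a concrete, hypothetical instance of such a no-brainer gap (shown here as O(n^2) versus O(n) for simplicity): a per-request duplicate check written two ways.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

class DuplicateCheck {
    // O(n^2): compares every element against every other one.
    static boolean hasDuplicateQuadratic(List<String> ids) {
        for (int i = 0; i < ids.size(); i++) {
            for (int j = i + 1; j < ids.size(); j++) {
                if (ids.get(i).equals(ids.get(j))) {
                    return true;
                }
            }
        }
        return false;
    }

    // O(n): a HashSet remembers what it has seen; add() returns false
    // on a repeat, so the first duplicate ends the scan.
    static boolean hasDuplicateLinear(List<String> ids) {
        Set<String> seen = new HashSet<>();
        for (String id : ids) {
            if (!seen.add(id)) {
                return true;
            }
        }
        return false;
    }
}
```

The second version is also shorter and clearer, which is the argument to lead with.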

Be aware, though, that as humans, we programmers are really bad (and I mean AWFUL) at guessing which parts of our applications will be the bottleneck. Eric Lippert writes about this interesting subject in this blog post. Always favor maintainability. Any performance issues that are eventually found can then be fixed relatively easily once you have more information.

sara
  • I edited the answer and fleshed things out a bit more; could the downvoter add some feedback? :) – sara Apr 12 '16 at 18:11
  • Although I'm not the downvoter, your first paragraph is spot-on in addressing the question at hand, but the rest doesn't actually answer my question on how to work with people who stonewall a discussion the minute it has to do with performance (although it is still good advice). – errantlinguist Apr 12 '16 at 19:06
  • Basically, what I want to get across in the last two paragraphs is "those optimizations might not even matter". In that case, it's better to pick your fights. – sara Apr 12 '16 at 19:07

You can either do things the wrong way, or do things the right way.

Often, things are done the wrong way and the code is later refactored so that it's done the right way. If you're going to write new code, and you know that you can do things the right way without a major penalty, I'd err on the side of doing it the right way. (Note that, after performance testing etc., some things might need to change -- but that's okay. Besides, a completely naive implementation is rarely, if ever, right.)

It's not necessarily premature optimization if you a) know that it will help in the future or b) know that the suboptimal path will lead to problems down the road. It's like a chess game, really.

I think that people will tend to want to do things right, rather than do them wrong. Use this when you discuss alternative strategies with your peers.

    There is never "the wrong way" or "the right way". There are generally an infinite number of ways that run in a continuum from "My God, how does this even run!?" to "John Carmack and Donald Knuth could not make this better while pair programming". – Gort the Robot Apr 12 '16 at 17:43
  • @StevenBurnap This is true. However, I think that for individuals, there's generally a few right ways and a lot of wrong ways. (As we become better programmers, that spectrum begins to shift - our old "right ways" may sometimes become our new "wrong ways", while our new right ways are better than the old ones.) I think it's good to do things in the best, most right way possible *for you*. This leads us to becoming better programmers, becoming better teammates (mentoring matters!), and writing better code. –  Apr 13 '16 at 01:32
  • "*I think that people will tend to want to do things right, rather than do them wrong*" -- The problem is, that there are *very* different opinions about what is right or wrong. Some even start holy wars about it (in the literal sense). – JensG Apr 25 '16 at 09:39