52

Usually I just throw my unit tests together using copy and paste and all kinds of other bad practices. The unit tests usually end up looking quite ugly, full of "code smell," but does this really matter? I always tell myself that as long as the "real" code is "good," that's all that matters. Plus, unit testing usually requires various "smelly hacks" like stubbing functions.

How concerned should I be over poorly designed ("smelly") unit tests?

Buttons840
  • 1,856
  • 1
  • 18
  • 28
  • 8
    How hard can it be to fix up the code you're always copying? – JeffO May 18 '11 at 20:39
  • 7
    code smell in the tests could easily be an indicator of hidden smell in the rest of the code - why approach the tests as if they are not "real" code? – HorusKol May 18 '11 at 23:51

12 Answers

75

Are unit test smells important? Yes, definitely. However, they are different from code smells because unit tests serve a different purpose and have a different set of tensions that inform their design. Many smells in code don't apply to tests. Given my TDD mentality, I would actually argue that unit test smells are more important than code smells because the code is just there to satisfy the tests.

Here are some common unit testing smells:

  • Fragility: do your tests fail often and unexpectedly even for seemingly trivial or unrelated code changes?
  • State Leak: do your tests fail differently depending on, for instance, what order they are run? (See the sketch after this list.)
  • Setup/Teardown Bloat: Are your setup/teardown blocks long and growing longer? Do they perform any sort of business logic?
  • Slow Runtime: Do your tests take a long time to run? Do any of your individual unit tests take longer than a tenth of a second to run? (Yes, I'm serious, a tenth of a second.)
  • Friction: Do existing tests make it difficult to write new tests? Do you find yourself struggling with test failures often while refactoring?
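
To make the "state leak" smell concrete, here is a minimal Python sketch (the registry and test names are made up purely for illustration): two tests whose results depend on the order in which they run, because they share module-level state.

    # Hypothetical sketch of the "state leak" smell: both tests touch the
    # same module-level list, so their outcome depends on run order.
    import unittest

    registry = []  # shared mutable state leaking between tests

    class StateLeakExample(unittest.TestCase):
        def test_register_adds_item(self):
            registry.append("widget")
            self.assertEqual(len(registry), 1)  # only true if this runs first

        def test_registry_starts_empty(self):
            self.assertEqual(registry, [])  # fails if the other test ran first

    if __name__ == "__main__":
        unittest.main()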

The importance of smells is that they are useful indicators of design or other more fundamental issues, i.e. "where there's smoke, there's fire". Don't just look for test smells; look for their underlying cause as well.

Here, on the other hand, are some good practices for unit tests:

  • Fast, Focused Feedback: Your tests should isolate the failure quickly and give you useful information as to its cause.
  • Minimize Test-Code Distance: There should be a clear and short path between the test and the code that implements it. Long distances create unnecessarily long feedback loops.
  • Test One Thing At A Time: Unit tests should only test one thing. If you need to test another thing, write another test. (A sketch follows this list.)
  • A Bug Is A Test You Forgot To Write: What can you learn from this failure to write better, more complete tests in the future?
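
As a rough illustration of "test one thing at a time" (the Account class here is invented purely for this sketch), each behaviour gets its own focused test, so a failure points straight at what broke:

    # Hypothetical sketch: one focused test per behaviour, so a failing test
    # immediately tells you which behaviour regressed.
    import unittest

    class Account:
        def __init__(self):
            self.balance = 0

        def deposit(self, amount):
            if amount <= 0:
                raise ValueError("amount must be positive")
            self.balance += amount

    class AccountTests(unittest.TestCase):
        def test_deposit_increases_balance(self):
            account = Account()
            account.deposit(50)
            self.assertEqual(account.balance, 50)

        def test_deposit_rejects_non_positive_amounts(self):
            account = Account()
            with self.assertRaises(ValueError):
                account.deposit(0)
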
Rein Henrichs
  • 13,112
  • 42
  • 66
  • 2
    "Unit tests serve a different purpose and have a different set of tensions that inform their design". For example, they should DAMP, not necessarily DRY. – Jörg W Mittag May 18 '11 at 23:47
  • 3
    @Jörg Agreed, but does DAMP actually stand for anything? :D – Rein Henrichs May 19 '11 at 00:51
  • 5
    @Rein: Descriptive and Meaningful Phrases. Also, not completely DRY. See http://codeshelter.wordpress.com/2011/04/07/dry-and-damp-principles-when-developing-and-unit-testing/ – Robert Harvey May 19 '11 at 01:54
  • +1. I also didn't know the meaning of DAMP. – egarcia May 19 '11 at 10:15
  • I'm not too sure about the slow runtime smell... It very much depends on what you are testing. Many unit tests for the UI will fail this. Also, I have a lot of tests that involve SOAP calls. They will always take longer than a full second. I suppose you could say they are "integration tests", but I still group these with my more conventional unit tests. – Morgan Herlocker Jun 23 '11 at 13:46
  • @ironcode In my experience, long running tests don't get run often enough. There are techniques to improve test runtime. Collaborators should be faked (or it isn't, by definition, a unit test). It's perfectly fine to have a suite of integration tests that take longer to run, but your main unit test suite should be extremely responsive. It's Test *Driven* Development, and the tests can't drive development if they take so long to run that you don't run them while developing. – Rein Henrichs Jun 23 '11 at 17:10
  • @Rein, I see what you mean, but often programming is integration development. This still benefits from TDD, even if it takes a few seconds every time to run certain tests. I spend quite a bit of time integrating disparate systems. If my unit tests were not primarily integration tests, I would not have too many tests at all, but would have no idea if I was being effective. – Morgan Herlocker Jun 23 '11 at 18:56
  • 2
    @ironcode If you can't work on one system in isolation without fear of breaking its integration with other systems, that smells like tight coupling to me. This is precisely the purpose of identifying test smells like long runtime: they inform you of design problems like tight coupling. The response shouldn't be "Oh, then that test smell is invalid", it should be "What does this smell tell me about my design?" Unit tests that specify the external interface of your system should be sufficient to tell you whether or not your changes break integration with its consumers. – Rein Henrichs Jun 23 '11 at 20:04
  • ... Otherwise, anyone developing a public API would have to ask its users to constantly run their clients against it. There would be no other way to guarantee that the API wasn't broken by your changes. Your feedback cycles would take *weeks* rather than *seconds* and all public API development would grind to a screeching halt. – Rein Henrichs Jun 23 '11 at 20:08
  • @Rein - I do not disagree that it is probably a code smell. However, often times a system's sole purpose might be to integrate to disparate systems. Think a soap service that exposes the API of a poorly written 3rd party application and attempts to make it easier to interface with. You cannot fix the 3rd party system (due to time, cost, or your users preferring to use that system), so the best you can do is write a reasonable wrapper, so that it can be safely interacted with. It is not a fun situation to be in, but it does require automated testing (if possible), and it is all too common. – Morgan Herlocker Jun 24 '11 at 13:10
  • @ironcode As I keep saying, you should fake the API your service is interacting with in your tests. There's absolutely no reason to couple tests to a slow, potentially unreliable external service and plenty of good reasons *not* to. – Rein Henrichs Jun 28 '11 at 20:23
  • You don't mention the huge cost of maintenance that copy/paste code will cause in the long run. In my experience keeping tests clean of debt is of fundamental importance to avoid code bankruptcy and having to give up on the test base a few months down the line. – Sklivvz Feb 23 '13 at 23:28
    +1 I suggest adding a smell: tests with a lot of decision logic and calls to subroutines are a big red flag. The point is that tests should be **very** easy to read. If a test fails and you have to spend 5 minutes deciphering test code _before_ you even touch the production code, you have a big problem! Unfortunately, obsession with DRY and a tendency to fall into "old" production coding style tend to lead to very smelly ***test*** code. – Disillusioned Sep 26 '13 at 20:39
  • @Sklivvz Copy/paste code in tests is not necessarily a problem in itself. It's generally far more important for test code to read naturally. However, if you do find yourself with "high maintenance" tests, that's more likely an indicator of other problems... Are you using a decent refactoring editor to make many trivial edits easy? Do you have lots of almost identical tests? (Perhaps "congruent" tests would be a good term, i.e. no significant variation between tests, e.g. 1+1=, 1+2=, 1+3=, 1+4, etc.) Do you have the right split between Test Classes, Test SetUp, and actual test code? – Disillusioned Sep 26 '13 at 20:52
  • "Test One Thing At A Time". This is very important for creating maintainable tests, and is the one rule that I see being violated most frequently. We need more people shouting this one from the rooftops! – Doctor Jones Jun 21 '18 at 14:02
67

Be concerned. You write unit tests to prove your code acts the way you expect. They allow you to refactor quickly with confidence. If your tests are fragile, difficult to understand, or difficult to maintain, you will ignore failing tests or turn them off as your code base evolves, negating many of the benefits of writing tests in the first place.

jayraynet
  • 1,200
  • 9
  • 13
  • 18
    +1 The moment you start ignoring failing tests they don't add any value. –  May 19 '11 at 06:12
19

I just finished reading The Art of Unit Testing a couple of days ago. The author advocates putting as much care into your unit tests as you do your production code.

I've experienced poorly-written, unmaintainable tests firsthand. I've written some of my own. It's virtually guaranteed that if a test is a pain in the ass to maintain, it won't be maintained. Once the tests are out of sync with the code under test, they've become nests of lies and deceit. The whole point of unit tests is to instill confidence that we haven't broken anything (that is, they create trust). If the tests can't be trusted, they're worse than useless.

Joshua Smith
  • 2,050
  • 18
  • 24
7

As long as your unit tests actually test your code in "most" cases, you're fine. (I say "most" on purpose, since it's sometimes hard to find all possible outcomes.) I think "smelly" code is a matter of personal preference. I always write code in a way that I can read and understand within a few seconds, rather than digging through junk and trying to understand what is what, especially when I come back to it after a significant amount of time.

Bottom line: it's testing, it's supposed to be easy. You don't want to confuse yourself with "smelly" code.

Luke
  • 289
  • 2
  • 7
  • 5
    +1: Don't fuss over the unit test code. Some of the dumbest questions revolve around making unit test code more "robust" so that a programming change doesn't break the unit test. Silliness. The unit tests are designed to be brittle. It's okay if they're less robust than the application code they're designed to test. – S.Lott May 18 '11 at 19:57
  • 1
    @S.Lott Both agree and disagree, for reasons better explained in my answer. – Rein Henrichs May 18 '11 at 20:25
  • 1
    @Rein Henrichs: those are far, far more serious smells than described in the question. "Copy and paste" and "looking quite ugly" don't sound as bad as the smells you're describing, where the tests are unreliable. – S.Lott May 18 '11 at 20:30
  • 1
    @S.Lott exactly, I think if we're going to talk about "unit testing smells" that it's crucial to make that distinction. Code smells in unit tests often aren't. They are written for different purposes and the tensions that make them smelly in one context are very different in the other. – Rein Henrichs May 18 '11 at 20:34
  • 2
    @S.Lott btw this seems like a perfect opportunity to use the much neglected chat feature :) – Rein Henrichs May 18 '11 at 20:36
  • @Rein Henrichs: "Code smells in unit tests often aren't" Agreed. Unless, of course, one runs afoul of the list you provided, which are very serious code smells in unit test code. – S.Lott May 18 '11 at 20:38
  • @S.Lott very serious and very common. They also aren't "code smells" per se, they're really "test smells" IMO. They're unique to the tensions of writing good unit tests. – Rein Henrichs May 18 '11 at 20:41
6

Definitely. Some people say "any test is better than no test at all". I strongly disagree - badly written tests bog down your development time, and you end up wasting days fixing "broken" tests because they weren't good unit tests in the first place. For me at the moment, the two things I'm focusing on to make my tests valuable, rather than a burden, are:

Maintainability

You should be testing the outcome (what happens), not the method (how it happens). Your set-up for the test should be as decoupled from the implementation as possible: only set up results for service calls etc that are absolutely necessary.

  • Use a mocking framework to ensure your tests don't depend on anything external (see the sketch after this list)
  • Favour stubs over mocks (if your framework distinguishes between them) wherever possible
  • No logic in tests! Ifs, switches, for-eaches, cases, try-catches etc. are all big no-nos, as they can introduce bugs into the test code itself
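
For example, here is a minimal sketch using Python's unittest.mock (the rate service and convert function are hypothetical): the external call is stubbed, so the test arranges only the one result it needs and never touches the network.

    # Hypothetical sketch: stub the external rate service so the test has no
    # external dependency and only sets up the single call result it needs.
    from unittest.mock import Mock

    def convert(amount, rate_service):
        # production code under test (hypothetical)
        return amount * rate_service.get_rate("USD", "EUR")

    def test_convert_uses_current_rate():
        rate_service = Mock()
        rate_service.get_rate.return_value = 0.5  # stubbed result, no call verification
        assert convert(10, rate_service) == 5.0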

Readability

It's okay to allow a bit more repetition in your tests, which you wouldn't normally allow in your production code, if it makes them more readable. Just balance this out with the maintainability stuff above. Be explicit in what the test is doing!

  • Try to maintain an "arrange, act, assert" style for your tests. This separates out your set-up and expectations for the scenario from the action being performed and the result being asserted. (A sketch follows this list.)
  • Maintain one logical assertion per test (if the name of your test has "and" in it, you may need to break it down into multiple tests)
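
Here is a hedged sketch of the "arrange, act, assert" layout in Python (ShoppingCart is a made-up example class), with a single logical assertion at the end:

    # Hypothetical sketch of the arrange, act, assert layout with one
    # logical assertion per test.
    class ShoppingCart:
        def __init__(self):
            self.items = []

        def add(self, name, price):
            self.items.append((name, price))

        def total(self):
            return sum(price for _, price in self.items)

    def test_total_sums_item_prices():
        # arrange
        cart = ShoppingCart()
        cart.add("apple", 2)
        cart.add("pear", 3)
        # act
        total = cart.total()
        # assert
        assert total == 5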

In conclusion, you should be very concerned with "smelly" tests; they can end up being just a waste of your time, providing no value.

You've said:

Unit testing usually requires various "smelly hacks" like stubbing functions.

Sounds like you could definitely do with reading up on some Unit Testing techniques, such as using a Mocking framework, to make your life a heck of a lot easier. I'd very very strongly recommend The Art of Unit Testing, which covers the above and much much more. I found it enlightening after struggling with poorly-written, unmaintainable, "smelly" tests for a long time. It's one of the best time investments I've made this year!

mjhilton
  • 653
  • 3
  • 9
  • Excellent answer. I have just one small quibble: far more important than "one logical assertion per test" is one action per test. The point is no matter how many steps it takes to arrange a test, there should only be one "production code action under test". If said action should have more than one side-effect, you may well have multiple assertions. – Disillusioned Sep 26 '13 at 21:01
5

Two questions for you:

  • Are you absolutely positive that you are testing what you think you are testing?
  • If someone else looks at the unit test, will they be able to figure out what the code is supposed to do?

There are ways of dealing with repetitive tasks in your unit tests, most commonly setup and teardown code. Essentially, you have a test setup method and a test teardown method; all unit test frameworks have support for this.
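
For instance, a minimal sketch using Python's unittest (the temporary-directory fixture is just an assumed example): the repetitive setup and cleanup live in setUp and tearDown, so each test stays short.

    # Hypothetical sketch: shared fixture code lives in setUp/tearDown so the
    # individual tests stay small and free of copy-and-paste.
    import shutil
    import tempfile
    import unittest
    from pathlib import Path

    class ReportTests(unittest.TestCase):
        def setUp(self):
            # runs before every test
            self.workdir = Path(tempfile.mkdtemp())

        def tearDown(self):
            # runs after every test, even a failing one
            shutil.rmtree(self.workdir)

        def test_report_file_is_created(self):
            report = self.workdir / "report.txt"
            report.write_text("ok")
            self.assertTrue(report.exists())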

Unit tests should be small and easily understood. If they aren't and a test fails, how are you going to fix the problem in a reasonably short period of time? Make it easy on yourself for a couple of months down the road when you have to come back to the code.

Berin Loritsch
  • 45,784
  • 7
  • 87
  • 160
5

When the trash starts smelling, it's time to take it out. Your test code should be as clean as your production code. Would you show it to your mother?

Bill Leeper
  • 4,113
  • 15
  • 20
3

I almost didn't submit my answer, as it took me a while to figure out how to even approach this as a legitimate question.

Programmers don't engage in "best practices" for aesthetic reasons, or because they want "perfect" code. They do it because it saves them time.

  • It saves time because good code is easier to read and understand.
  • It saves time because when you need to find and fix a bug, it's easier and faster to locate.
  • It saves time because when you want to extend the code it's easier and faster to do so.

So the question you're asking (from my point of view) is: should I save myself time, or write code that will waste my time?

To that question I can only say, how important is your time?

FYI: stubbing, mocking, and monkey patching all have legitimate uses. They only "smell" when their use isn't appropriate.
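
For instance, a hedged Python sketch of a legitimate monkey patch (the greeting function is invented for this example): pinning the clock so the test is deterministic.

    # Hypothetical sketch: monkey patching the clock is legitimate here
    # because the test needs a deterministic "now".
    import datetime
    from unittest.mock import patch

    def greeting():
        hour = datetime.datetime.now().hour
        return "good morning" if hour < 12 else "good afternoon"

    def test_greeting_in_the_morning():
        fixed = datetime.datetime(2011, 5, 18, 9, 0)
        with patch("datetime.datetime") as mock_dt:
            mock_dt.now.return_value = fixed
            assert greeting() == "good morning"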

dietbuddha
  • 8,677
  • 24
  • 36
3

In addition to the other answers here.

The harm done by poor-quality code in unit tests is not restricted to your unit test suite.

One of the roles filled by Unit Testing is Documentation.

The Unit Test suite is one of the places one looks to find out how an API is intended to be used.

Callers of your API are quite likely to copy parts of your unit test suite, so bad test-suite code can end up infecting live code elsewhere.
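
As a small sketch of why that matters (parse_duration is a hypothetical API), a clean test reads like a usage example, and that is exactly the part callers tend to copy:

    # Hypothetical sketch: a tidy test doubles as usage documentation for the
    # API, which is why its style tends to spread into callers' code.
    def parse_duration(text):
        # hypothetical API: parse strings like "1h30m" into minutes
        hours, _, minutes = text.partition("h")
        return int(hours) * 60 + int(minutes.rstrip("m") or 0)

    def test_parse_duration_reads_like_documentation():
        assert parse_duration("1h30m") == 90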

Paul Butcher
  • 2,817
  • 1
  • 18
  • 18
2

I try specifically NOT to make my unit tests too robust. I have seen unit tests that start doing error handling in the name of being robust. What you end up with is tests that swallow the very bugs they are trying to catch. Unit tests also have to do some funky things pretty often to make things work. Think about the entire idea of private accessors using reflection... if I saw a bunch of those all over production code, I would be worried 9 times out of 10. I think more time should be put into thinking about what is being tested, rather than code cleanliness. The tests will need to change pretty frequently anyway if you do any major refactoring, so why not just hack them together, feel a little less ownership over them, and be more motivated to rewrite or rework them when the time comes?

Morgan Herlocker
  • 12,722
  • 8
  • 47
  • 78
2

If you are going for heavy, low-level coverage, you will be spending as much time in the test code as in the product code, or more, when making any serious modifications.

Depending on the complexity of the test set-up, the code there may be even more complex (httpcontext.current vs. the horrible monster of trying to build a fake one accurately, for example).

Unless your product is something where you rarely make a breaking change to existing interfaces and your unit-level inputs are very simple to set up, I would be at LEAST as worried about the understandability of the tests as about the actual product.

Bill
  • 8,330
  • 24
  • 52
0

Code smells are only an issue in code that may need to be changed or understood at some point later in time.

So I think you should fix code smells when you have to go "close" to the given unit test, but not before.

Ian
  • 4,594
  • 18
  • 28