6

My team is considering moving to test-driven development. Currently we have almost no unit testing; we have been relying only on user acceptance tests and the developers' own manual testing. Not everyone has agreed with this proposal. No one in the team really has experience with unit testing, so it's hard to convince everyone of the benefits of doing it. One of the objections is the time needed.

My question is, how much extra time is usually needed to add unit tests, relative to the task itself? Say the coding of a task (excluding acceptance testing) takes half a day; how much time will the unit tests take, on average? This estimate is important for us to explain the relative benefit of unit testing and to adjust our planning.

If it matters, we are mostly working on a .NET desktop application (with a GUI and a database), and it is not safety-critical (I mean, it's not for a nuclear power plant or anything like that). Also note that none of us have any experience, so there will be a learning curve.

Louis Rhys
  • Not sure that this is really a duplicate? The linked question talks about adding tests for bugs, *after* the code is written, this question talks about using tests to guide the design *before* the code is written. Those are significantly different, since, e.g. in the first case, code is often not designed to be testable, which makes writing tests a *very* tedious task whereas in TDD, that simply cannot happen, since the code hasn't even been written yet. Plus, there are scientific studies which *precisely* answer the *exact* question the OP asked but which are irrelevant to the linked one. – Jörg W Mittag Mar 17 '12 at 20:43
  • @JörgWMittag Both questions discuss time spent on tests without _clearly_ specifying if tests are written before or after code (at least I don't see it), to me they are duplicates. However it seems you have a good answer to this question, so I don't mind re-opening it, especially if your answer helps clarify why the two questions are not duplicates. – yannis Mar 18 '12 at 06:58
  • Louis please edit your question to clearly tell us why it's not a duplicate of [this one](http://programmers.stackexchange.com/questions/84491/how-long-should-we-generally-spend-writing-unit-tests-for-a-new-feature-or-bug-f), even if it's for the same reasons @JörgWMittag identified in his comment (comments are ephemeral). – yannis Mar 18 '12 at 06:59
  • @YannisRizos I think that the questions are similar in that they both ask how long unit testing should take relative to the task, this particular question is also defining a different context. The linked question comes from a point of view where the OP accepts unit testing as normal, and is asking to compare his long test to short implementation ratio to other developers. This question however asks to be convinced if unit testing is worth the **extra time** perceived by the OP, and essentially how to make a case to sell unit testing to his company. – S.Robins Mar 19 '12 at 06:33
  • @JörgWMittag I think that you've missed the point slightly in your comment. Please see my earlier answer to Yannis in which I argue the case from a perspective that I feel is a little closer to point. Also, it would be helpful if you could direct us to the scientific studies that you've mentioned in your comment, or to perhaps add an answer below if you feel your case offers another perspective to the answers already given. :-) – S.Robins Mar 19 '12 at 06:38
  • Possible duplicate of [How long should we generally spend writing unit tests for a new feature or bug fixing?](http://softwareengineering.stackexchange.com/questions/84491/how-long-should-we-generally-spend-writing-unit-tests-for-a-new-feature-or-bug-f) – gnat Apr 01 '17 at 10:07

2 Answers

6

It's difficult to provide a definitive answer without better understanding the development processes that you usually use, the experience of your team, and more importantly, the complexity of the tasks at hand. To this I should also probably add the way in which you generally write your methods and the design choices you have made.

Unit testing always adds time to development, and if you are testing your code properly, you should expect to spend more time writing tests than implementation code. This sounds quite horrendous at first, and it is probably one of the main reasons many teams avoid unit testing in the first place. Even in my present workplace, it took me more than 4 years to convince everybody that their concerns over testing overheads were simply the result of not fully understanding the longer-term benefits.

In the shorter term, and particularly when you first start unit testing, you will find that your development time slows down considerably. This is a necessary learning period. Most developers will either be selective about where they apply unit tests, or will write their code first and apply tests after the implementation is essentially complete. In doing so, they will find bugs, which will require a change in the implementation code, which in turn requires a change in the tests. This proves to be a very counter-productive process, and it makes most developers wonder why they bother if they are always playing catch-up with the tests. The real epiphany comes when your developers start to write complete tests before writing the implementation code, because the resulting code will most likely be more concise and much quicker to write.

An advantage of the test-first approach is that a test indicates the implementation code works at the exact moment the code is written to satisfy the conditions of the test. Further, if the implementation code has been kept relatively concise, it will likely be easier to modify alongside the tests, giving you the real advantage of testing first: the assurance of a system that will identify the exact moment the code has been modified to include an error. Modifications to code are much easier with supporting tests already in place, which significantly reduces your maintenance overheads and makes debugging efforts much less onerous.
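To make the test-first workflow concrete, here is a minimal sketch of what it might look like in C# with NUnit. The `InvoiceCalculator` class and its behaviour are invented purely for illustration and are not taken from this answer; the point is only the order of work: the test is written first, and the implementation below it is the smallest code that makes it pass.

```csharp
using NUnit.Framework;

// Written first: this test fails to compile (and then fails to pass)
// until InvoiceCalculator is implemented to satisfy it.
[TestFixture]
public class InvoiceCalculatorTests
{
    [Test]
    public void Total_AppliesDiscount_WhenSubtotalExceedsThreshold()
    {
        var calculator = new InvoiceCalculator(discountThreshold: 100m, discountRate: 0.10m);

        decimal total = calculator.Total(subtotal: 150m);

        Assert.AreEqual(135m, total); // 10% discount applied
    }

    [Test]
    public void Total_AppliesNoDiscount_WhenSubtotalIsBelowThreshold()
    {
        var calculator = new InvoiceCalculator(discountThreshold: 100m, discountRate: 0.10m);

        Assert.AreEqual(80m, calculator.Total(subtotal: 80m));
    }
}

// Written second: the smallest implementation that makes the tests pass.
public class InvoiceCalculator
{
    private readonly decimal _discountThreshold;
    private readonly decimal _discountRate;

    public InvoiceCalculator(decimal discountThreshold, decimal discountRate)
    {
        _discountThreshold = discountThreshold;
        _discountRate = discountRate;
    }

    public decimal Total(decimal subtotal)
    {
        return subtotal > _discountThreshold
            ? subtotal * (1 - _discountRate)
            : subtotal;
    }
}
```

Run the tests first and watch them fail, then add the implementation and watch them go green; that tight loop is where the "exact moment" feedback described above comes from.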

When you are very comfortable with unit testing and test-first development, you will find that your estimates for tasks become easier to make and are not thrown off by lengthy, unexpected delays caused by bugs and the subsequent, difficult-to-estimate debugging effort, because the majority of your potential bugs will have been dealt with simply by focusing your development efforts around testing. Where you may previously have found yourself dealing with a lengthy debugging process during the user acceptance testing period, you should in future find the post-implementation debugging phase and the user acceptance tests taking much less time.

So the real issue becomes less about how much additional time testing adds, and more about a trade-off: the longer time to implement versus the shorter post-implementation time spent debugging and maintaining your product, plus how your test suite will benefit you when it comes time to change any part of your code base. More importantly, it is about how all of this translates into a real and measurable value to your business. As a very rough example of what I am talking about, when my team first started seriously writing unit tests, I took some measurements relating to bug issues and time to develop modules for our product, and I compared the amount of time spent on new development versus maintenance. Before test-first unit testing, our maintenance effort was triple our development effort, whereas on the projects where we had moved entirely to unit testing first (and after a few months of experience), the effort expended on new development work was greater than the effort expended on maintenance. This amounts to a measurable saving with a real dollar value attached. It means the team has been able to take on more work without needing to spend more cash or juggle resources.

How exactly this will translate into your own working environment is difficult to predict, and I can only recommend having a couple of people act as method champions and work in a test-first manner for a while to get a real feel for how it will fit within your working environment. One suggestion would be to do a little experimental development: have a couple of people work on a couple of similar tasks (or duplicate the same task, if you can afford to allocate the time) for a short while, see what the difference is in terms of time to implement versus time to debug after completion, and conduct code reviews to examine code quality issues as each task progresses. Give it a little time, and see whether your team's objections diminish or grow stronger.

Personally, I can't think of any real reason why you shouldn't be unit testing; however, I don't work in your particular environment, so I can't really speak for you, and at the end of the day unit testing is merely a practice. It's a tool that, if used well, can provide you with great results, yet if the entire team is unable to use the tool, or if its use results in a pattern of failures within the team, then perhaps that tool isn't right for you. The only way you can know for certain is to try it and see for yourselves.

S.Robins
2

My experience has been that unit tests don't add much time; they save implementation time, in fact. If you write your code to be testable, you tend to write clearer, more modular code that you feel more confident is correct. You also end up with tests that prove correctness in the cases you've tested and demonstrate how you intend your methods to be called.

You stated that no one on your team has much testing experience yet. There will be some ramp-up time before everyone feels confident in the tools and in what makes a good test. On the team I work with, it took about two months before all of us fully understood how to split code down into testable chunks. The biggest challenge we had was learning to leave seams in our classes so that external dependencies, like database managers, could be injected for testing. The results were well worth it, though: we ended up with a clear pattern of keeping database connection details separate from the code requiring them. Should we ever need to change our DB schema, the affected areas are limited to a single layer. Not only that, but all references to the DB can be found quickly using our IDE.
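As a rough sketch of the kind of seam described above, assuming C# with NUnit: the `IOrderRepository` interface, the `CustomerStatistics` class, and the in-memory fake are hypothetical names chosen for illustration, not code from this answer. The production code only ever sees the interface, so the real database manager can be swapped for a fake in tests.

```csharp
using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

// The seam: business code depends on this interface, not on a concrete
// database manager, so a test double can be injected.
public interface IOrderRepository
{
    IEnumerable<decimal> GetOrderTotalsForCustomer(int customerId);
}

public class CustomerStatistics
{
    private readonly IOrderRepository _orders;

    public CustomerStatistics(IOrderRepository orders)
    {
        _orders = orders;
    }

    public decimal LifetimeValue(int customerId)
    {
        return _orders.GetOrderTotalsForCustomer(customerId).Sum();
    }
}

// In tests, the real database layer is replaced by an in-memory fake.
public class FakeOrderRepository : IOrderRepository
{
    private readonly decimal[] _totals;

    public FakeOrderRepository(params decimal[] totals)
    {
        _totals = totals;
    }

    public IEnumerable<decimal> GetOrderTotalsForCustomer(int customerId)
    {
        return _totals;
    }
}

[TestFixture]
public class CustomerStatisticsTests
{
    [Test]
    public void LifetimeValue_SumsAllOrderTotals()
    {
        var statistics = new CustomerStatistics(new FakeOrderRepository(10m, 25m, 5m));

        Assert.AreEqual(40m, statistics.LifetimeValue(customerId: 1));
    }
}
```

In production, the same `CustomerStatistics` class would be constructed with an implementation of `IOrderRepository` that talks to the real database, which is also what keeps connection details confined to a single layer.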

Michael K