20

I'm dealing, again, with a messy C++ application: tons of classes with confusing names, objects with pointers into each other and all over, long-winded Boost and STL data types, etc. (Pause and consider your favorite terror of messy legacy code. We probably have it.) The phrase "code rot" often comes to mind when I work on this project.

Is there a quantitative way to measure code rot? I wouldn't expect anything highly meaningful or scientific, since no other measure of code productivity or quality is especially precise either. I'm not looking for a mere opposite of measures of code quality, but specifically a measure of how many bad things happened after a series of maintenance software "engineers" have had turns hacking at the code.

A general measure applying to any language, or many languages, would be great. If there's no such thing, a measure for C++ alone would do, since it's a better-than-average language for creating messes.

Maybe something involving a measure of the topology of how objects connect at runtime, a count of chunks of commented-out code, how many files a typical variable's usage is scattered across, I don't know... but surely by now, a decade into the 21st century, someone has attempted to define some sort of rot measure.

It would be especially interesting to automate a series of svn checkouts, measure the "rottenosity" of each, and plot the decay over time.

DarenW
  • 38
    The only scientific unit of measure for this is the OMGWTF/sec – Robert Massaioli Feb 23 '11 at 03:39
  • @Robert, clever :) – Marlon Feb 23 '11 at 04:00
  • 1
    This might be better on programmers.stackexchange. –  Feb 23 '11 at 04:00
  • 1
    I've always wanted to have expiry dates on code. When code hasn't had a review in one year, it needs to either stop compiling or give off a sulfur smell like a bad egg. – smithco Feb 23 '11 at 06:00
  • What would you use this measure for? To get more resources from management because your code is more rotten than others? – LennyProgrammers Feb 23 '11 at 09:17
  • 1
    @smithco: I once wrote a macro that did exactly that. `EXPIRE(yyyymmdd)` would typedef an array whose size depended on `__DATE__` and `yyyymmdd`. For instance, `EXPIRE(20120224)` would today typedef an array with size +10000, but in two years time it will typedef an array with size -10000. – MSalters Feb 24 '11 at 00:32
  • This may help: http://stackoverflow.com/questions/1433632/is-there-a-findbugs-and-or-pmd-equivalent-for-c-c – user43552 Dec 22 '11 at 10:32
  • 1
    not an answer, but I call this issue technical debt. Code does not rot by itself over time. Its quality decreases when work is done on it as quickly as possible, without thinking about the person who will end up maintaining the resulting mess. – Simon Bergot Dec 22 '11 at 13:12

3 Answers

9

https://i.stack.imgur.com/ARBSs.jpg

Which door represents your code?...

...Which door represents your team or your company? Why are we in that room? Is this just a normal code review or have we found a stream of horrible problems shortly after going live? Are we debugging in a panic, poring over code that we thought worked? Are customers leaving in droves and managers breathing down our necks? How can we make sure we wind up behind the right door when the going gets tough? (Robert "Uncle Bob" Martin, Clean Code: A Handbook of Agile Software Craftsmanship)


Any quality measure or metric is at risk of being useless unless it is backed by code reviews. In my experience I have yet to see a metric that was impossible to game into 100%-perfect numbers on garbage code. On the other hand, I can't recall a code review where I was able to trick the reviewer (though I have to admit I never tried very hard).

gnat
7

This list has a whole bunch of metrics that can be used to measure one aspect or another of code rot.

McCabe IQ uses a vast number of software metrics to get the most precise assessment of your application's quality, security, and testing. We've defined these metrics below for your reference. Further details on many of these metrics can be found in the NIST document, "Structured Testing: A Testing Methodology Using the Cyclomatic Complexity Metric" by Arthur Watson and Tom McCabe...

gnat
Karl Bielefeldt
  • Consider using Sonar [http://www.sonarsource.org/] with its C/C++ plugin to help measure these – deterb Jan 03 '12 at 22:37
4

Cyclomatic Complexity Metric (one definition is here) is one metric used to measure the ugliness of code. Tools from companies like Coverity can measure the CCM of a codebase. I don't work for Coverity, but I have used the product for defect analysis (we don't pay much attention to the CCM).

The best answer to this question may be the

OMGWTF/sec

given by Robert in the comments above.

Lou