I've seen multiple posts about application rewrites going badly, people's experiences with them here on Programmers, and an article I've read by Joel Spolsky on the subject, but no hard evidence or case studies. Other than the two examples Joel gave and some other posts here, what do you do with a bad codebase, and how do you decide what to do with it based on real studies?
As a case in point, I know of two clients that both have old legacy code. They keep limping along with it because, as one of them found out, a rewrite was a disaster: it was expensive and didn't do much to improve the code. That customer has some very complicated business logic, as the rewriters quickly discovered.
In both cases, these are mission-critical applications that bring in a lot of revenue for the company. The one that attempted the rewrite felt they would hit a brick wall if the legacy software wasn't upgraded at some point. To me, that kind of risk warrants research and analysis to ensure a successful path.
Have there been actual case studies that have investigated this? I wouldn't want to attempt a major rewrite without knowing some best practices, pitfalls, and successes based on actual studies.
Update: after more searching, I did find three interesting articles with case studies:
- Rewrite or Reuse: a study of a COBOL application that was converted to Java.
- Software Reuse: Developers' Experiences and Perceptions.
- Reuse or Rewrite: another study, on the costs of maintenance versus a rewrite.
I recently found another article on the subject: The Great Rewrite. There the author seems to hit on some of the major issues. Along with this was the idea of prototyping with the proposed new technology stack and measuring how quickly the developers picked it up, all as a prelude to a rewrite, which I thought was a great idea!