
We recently added a relatively large feature to our product that involved the entire R&D department (with all the different teams within it).

This feature included UI development, server-side development, and huge migrations of SQL schemas (and other things that I myself was not involved in at all).

The development process for this feature was chaotic - the Front-End and Server teams were not synchronized, SQL migrations broke the DB, and the product specifications were incomplete, which meant that at every step of the way we found new issues with the initial definition, requiring changes to core concepts that the developers had relied upon.

All in all, a feature that was planned to be released within 10-14 days took roughly 24 (intensive) days of development.

I was asked to write a 'What went wrong' report (from my team's side - each team writes such a report from its own point of view).

What are the common methodologies for writing such reports, specifically in the field of software development?
Also - Is there some formal name for such reports?

EDIT & ANSWER:
Apparently, such a report/review process is commonly referred to as a 'Project Post-Mortem' (other names exist as well).
After figuring this out, I found these two resources that outline suggested methodologies for gathering the necessary data, organizing it, analyzing it, and formulating solutions for discovered issues (as well as some general information on these reviews and their purposes):

'A Defined Process For Project Post Mortem' -
http://www.csee.umbc.edu/courses/undergraduate/345/spring12/mitchell/readings/aDefinedProcessForProjectPostMortemReview.pdf

'Post-Mortem Reviews: Purpose and Approaches in Software Engineering' - http://www.uio.no/studier/emner/matnat/ifi/INF5180/v10/undervisningsmateriale/reading-materials/p08/post-mortems.pdf

Tudmotu
  • Sounds like a bigger effort than 14 days. At some point someone needed to step up and say, "this is bigger than we thought." My guess is that management made it impossible to say that. – sea-rob Mar 01 '14 at 21:29
  • Sort of. We didn't really think 14 days would be enough but our sales department promised this feature to a client who expected it within 14 days. – Tudmotu Mar 02 '14 at 09:20
  • 1
    Yes, it's obvious that the external pressure from none-R&D departments had a crucial role in the failure of the feature, but my question is aimed to be generic - That's way I didn't go into details of exactly what was the feature and what was my role in it. I assume that in the future I'll encounter similar scenarios where a feature goes wrong, and I'm looking for general advice on how it is best to approach the writing of such a report. I also view this report as a way to organize my own thoughts on the matter and perform self introspection - not only as a bureaucratic request from mgt. :) – Tudmotu Mar 02 '14 at 10:07
  • Try hard to leave your opinions out of the document. If possible, ensure all statements of fact can be backed up (you do save your emails, right?). – Dan Pichelman Mar 02 '14 at 18:31
  • `[...] but our sales department promised this feature to a client who expected it within 14 days.` - Fine. Then you should ask your sales department how **they** plan to fulfill **their** promises. They're not your promises, they're theirs, even more so if you advised against promising that deadline (and can possibly prove that point). So they had better have a plan. `We didn't really think 14 days would be enough` I hope you didn't only think that, but also expressed that opinion clearly. Otherwise you have a great opportunity to learn something right now. – JensG Mar 03 '14 at 23:44
  • Thanks, but the point of this review is not assigning blame. :) It is possible that with proper preparation on the R&D side it would have been possible to develop such a feature within 14 days. I don't know. That's what we want to find out. We would like to learn from this experience and see what things need changing in our workflow, so such delays won't occur (or will be minimized) in the future. One of these things might be 'BD should not promise schedules', but our review shouldn't stop there. ;) – Tudmotu Mar 04 '14 at 12:44
  • I understand that. But some people learn only the hard way. And if they made the promises, they should be led to the insight that they screwed up, not the other people who have to meet their promises. Especially when the warnings have been ignored. This is a very common pattern in the industry; I have seen it too often. Of course you are absolutely right when you say (1) the review should not stop there and (2) we should focus on the positive next time. – JensG Mar 04 '14 at 19:12

1 Answer


Having done this many times, I can share the format I have used successfully in the past. The order in which these headings are presented will also make a difference in how the report is received by management. Whenever possible, leave them with a good taste in their mouths.

This shouldn't have to be said, but... keep your language professional throughout the report.

What happened?

Describe what happened, both good and bad. Give details about each incident. Do not place blame. Keep opinions out of it. Imagine yourself as a reporter telling the story of what occurred.

What went wrong?

This is the hard part. Admit to what you, your team, or other teams did wrong. Keep blame out of this! Just state the facts. "We (my team) made breaking changes to the database that delayed the entire development group by X days."

What went right?

Part of telling this story should include where your team, or other teams, did the right thing: communicating delays, changes, etc. "We added significantly more columns to tables X, Y, and Z. These columns will allow us to track something."

How do we improve the process?

This is where you start to offer some opinions. You were asked what happened and what could be done to fix it. State your opinions, supported by facts. "When we made the breaking database change, we should have implemented these changes on a separate development database."

Adam Zuckerman
  • Thank you very much for this answer, but I'm looking more for a defined process for writing such a report. How do I start? Do I list everything I remember? Should I group the incidents I remember by some criteria? What is the best way to present the failure of cross-team integration? Again, sorry for the mess in my initial question. I would love to hear your opinion on how to approach the process as a whole, including the process of organizing my thoughts and ideas before actually writing the report. – Tudmotu Mar 02 '14 at 10:17
  • BTW, external sources such as academic articles on the matter, blog posts, and live examples of such reports would be very helpful to me. As I said, I've never written such a report, and I find it difficult to even start. :) – Tudmotu Mar 02 '14 at 10:19