I am learning computer architecture and organization, and I am stuck on the following question. Can someone please help me?
The stage delays in a 5-stage pipeline are 300, 200, 100, 400 and 350 picoseconds. The second and third stages are merged into a single stage with delay 350 picoseconds. The throughput increases/decreases by ………… percent.
This is what I tried:
- Throughput of the 1st case: T1 = 1/(max delay) = 1/200
- Throughput of the 2nd case: T2 = 1/(max delay) = 1/350
- Percentage increase/decrease in throughput: (T2 - T1)/T1 * 100
  = (1/350 - 1/200)/(1/200) * 100
  = -42.8
So the throughput decreases by 42.8%.
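To double-check my arithmetic, here is a quick sketch of the same calculation (using the 200 ps and 350 ps values I took as the limiting stage delays, which may be where I went wrong):

```python
# Percentage change in throughput, using my chosen limiting delays.
# Throughput is taken as 1 / (limiting stage delay), in units of 1/ps.
t1 = 1 / 200  # throughput before merging (my assumption: 200 ps limits it)
t2 = 1 / 350  # throughput after merging (my assumption: 350 ps limits it)

change = (t2 - t1) / t1 * 100
print(f"{change:.1f}")  # -42.9 (I rounded this to -42.8 above)
```

So the number itself follows from my formula; the question is whether I picked the right stage delays.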
But the correct answer is given as 0. I don't understand why. What am I missing?