I've been reading a lot about USB and FireWire recently and stumbled upon the fact that, despite USB 2.0 having a signaling rate of 480 Mbit/s (which should work out to 60 MB/s), actual transfers often fall well short of that. Some sort of overhead is usually mentioned, but I don't really understand what it means, and there doesn't seem to be a comprehensible explanation for someone who isn't very tech-savvy like me.
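Just to spell out the arithmetic I'm assuming when I convert between the two units (please correct me if this is already wrong):

```python
# My understanding of the unit conversion (Mbit/s -> MB/s): 8 bits per byte.
signaling_rate_mbit_s = 480  # USB 2.0 high-speed signaling rate
theoretical_mb_s = signaling_rate_mbit_s / 8
print(f"{signaling_rate_mbit_s} Mbit/s / 8 = {theoretical_mb_s:.0f} MB/s theoretical maximum")
```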
I've also read that FireWire has this overhead too, but it doesn't seem to be as severe as for USB. Why is that?
EDIT: I guess I'm talking about Mbit. Also, would you say it's because of extra data being sent as well, like those overhead bits for error checking, that the data I actually want to transfer moves far below the theoretical rate?
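To check whether I've understood the idea, here's a toy calculation. The 512-byte payload is the USB 2.0 high-speed bulk packet size as far as I know, but the per-packet overhead figure is just a placeholder I made up to illustrate the point, not a measured number:

```python
# Toy model of "overhead": every packet of my file travels with extra
# bookkeeping bytes (tokens, CRC error-check fields, handshakes), so not
# every bit on the wire is my actual data.
payload_per_packet = 512   # bytes of my data per high-speed bulk packet
overhead_per_packet = 55   # placeholder guess: header/CRC/handshake bytes per packet

efficiency = payload_per_packet / (payload_per_packet + overhead_per_packet)
theoretical_mb_s = 480 / 8  # 60 MB/s if every bit were payload

print(f"Fraction of the wire carrying my data: {efficiency:.0%}")
print(f"Payload rate under this assumption: {theoretical_mb_s * efficiency:.1f} MB/s")
```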