First of all, an old cable has only four wires: VBUS, GND, D+ and D-. The full 100 W power draw (20 V at 5 A under USB Power Delivery) is only possible over a USB-C cable with extra conductors, and cables rated to carry 5 A have to identify themselves with a small "e-marker" chip. I have not read the full USB spec in detail for a few years, but I know that the power-delivery negotiation can get quite convoluted; for it to work, both ends of the link make sure everything is correct, including the cable itself, using those extra conductors (the CC line on USB-C).
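To make the idea concrete, here is a toy sketch of that logic. It is not the real USB-PD state machine, and all the names and numbers in it are made up; it just shows why a legacy four-wire cable can never end up holding a 100 W contract: it has no way to advertise a higher rating, so the source falls back to the old 5 V default.

```python
# Toy model of "both ends plus the cable must agree before high power flows".
# Not the real USB-PD protocol; names and values are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cable:
    wires: int                          # 4 on a legacy cable, more on modern ones
    rated_current_a: Optional[float]    # None = cable cannot identify itself

@dataclass
class Contract:
    volts: float
    amps: float

def negotiate(cable: Cable, requested: Contract) -> Contract:
    """Return the power contract the source is willing to grant."""
    default = Contract(volts=5.0, amps=0.5)     # safe legacy fallback
    if cable.rated_current_a is None:
        # A four-wire cable has no way to report a higher rating,
        # so the source never offers more than the default.
        return default
    if requested.amps > cable.rated_current_a:
        return default
    return requested                            # e.g. 20 V / 5 A = 100 W

print(negotiate(Cable(wires=4, rated_current_a=None), Contract(20.0, 5.0)))
print(negotiate(Cable(wires=24, rated_current_a=5.0), Contract(20.0, 5.0)))
```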
Another thing to note is that transferring a lot of power can cause problems for the data transfer (there are notes to this effect on the wiki page you've linked). So you need a high-quality cable to keep the noise down far enough that both ends stay happy to keep the power high. Old cables are unlikely to have been made to the quality level required (though it is possible the stars aligned and you have the best cable ever made in the 2000s).
Couple those two facts with safety margins: an old USB cable was only designed to carry a few hundred milliamps at 5 V, but it will have had a large safety margin on top of that. Even back in the days of dial-up internet there were safety requirements on things like cables: self-extinguishing insulation materials and so on. So even if a cable got hot, it might start melting, which would be unpleasant, but not quite a bursting-into-flames event.
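A rough back-of-the-envelope calculation shows why the jump in current matters so much. Heat dissipated in the copper goes as I²R, so a 10x increase in current means 100x the heating. The 0.2 Ω round-trip resistance below is just an assumed figure for a thin legacy cable, not a measured or spec value.

```python
# Rough illustration: heat wasted in the cable is I^2 * R.
# 0.2 ohm round-trip resistance is an assumed figure for a thin old cable.
round_trip_resistance_ohm = 0.2

for current_a in (0.1, 0.5, 5.0):
    heat_w = current_a ** 2 * round_trip_resistance_ohm
    print(f"{current_a:>4} A -> {heat_w:6.3f} W dissipated in the cable")
```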
As soon as anything starts going wrong with the cable, the USB handshaking will go wrong, and the power supply will very quickly shut the power back down. The idea is that everything fails safe.
Pushing high current through small wires also causes a voltage drop. With a poor cable the drop will be larger than expected; the device being powered will detect this, alert the power supply, and everything will be scaled back down to safe levels.
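As a sketch of what that looks like, compare the voltage at the far end of two cables at 5 A. The resistance figures are illustrative assumptions, not spec values, but they show how a poor cable produces exactly the kind of sag the powered device can detect and complain about.

```python
# V = I * R across the cable; a big sag at the device end is detectable
# and triggers renegotiation down to a safe level.
# Resistances below are illustrative assumptions, not spec values.
supply_v = 20.0
current_a = 5.0

for name, resistance_ohm in (("good PD-rated cable", 0.05), ("poor old cable", 0.4)):
    drop_v = current_a * resistance_ohm
    at_device_v = supply_v - drop_v
    print(f"{name}: {drop_v:.2f} V lost in the cable, {at_device_v:.2f} V at the device")
```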