To my understanding, the new 12-volt-only standard moves the 5V and 3.3V supplies out of the ATX power supply "box" and onto the motherboard. According to PC World (which is cited by Wikipedia), Gamers Nexus, and Linus Tech Tips, the new standard "improves efficiency" by no longer generating the 5V and 3.3V rails in the power supply.
Why?
- Modern PC parts still need those rails. For example, SSDs usually use 1.5V/3.3V for main operation, and USB ports supply 5V to charge cellphones and power personal beverage coolers. So the need is not going away (any time soon) just because Intel has published a new standard.
- Buck converters are cheap, and buck converter controller chips are tiny. Why not throw a few more components into the ATX power supply "box" and down-convert from the high-demand 12V rail there? The new standard calls for the conversion to happen on the motherboard instead. Isn't that just the same thing in a different place?
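To put numbers on the buck conversion mentioned above, here is a quick sketch of the idealized relationship (real converters add switching and conduction losses; the values are illustrative, not from any specific controller datasheet):

```python
# Idealized buck converter: the output voltage is set by the duty cycle,
# D = Vout / Vin, regardless of whether the converter sits in the PSU
# box or on the motherboard.

V_IN = 12.0  # the high-demand 12V rail

def ideal_duty_cycle(v_out, v_in=V_IN):
    """Ideal (lossless) buck converter duty cycle: D = Vout / Vin."""
    return v_out / v_in

for v_out in (5.0, 3.3):
    print(f"{V_IN:.0f}V -> {v_out}V: D = {ideal_duty_cycle(v_out):.3f}")
# 12V -> 5.0V: D = 0.417
# 12V -> 3.3V: D = 0.275
```

The same duty-cycle math applies in either location, which is the heart of the question.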
- Linear regulators, commonly found on cheap adapters such as M.2-to-SATA boards, generate a 3.3V rail by dissipating the remaining 8.7V as heat, essentially acting as space heaters. Aren't those more problematic, and shouldn't they be eliminated first?
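The arithmetic behind the "space heater" point, as a sketch (the 90% buck efficiency and the 1 A load are assumed ballpark figures, not measurements):

```python
# A linear regulator's best-case efficiency is Vout/Vin: everything
# above Vout is burned off as heat. A buck converter, by contrast,
# typically runs at 85-95% efficiency.

V_IN, V_OUT = 12.0, 3.3
I_LOAD = 1.0  # amps, illustrative load

linear_eff = V_OUT / V_IN                # ~27.5% best case
linear_waste = (V_IN - V_OUT) * I_LOAD   # 8.7 W of heat at 1 A

buck_eff = 0.90                          # assumed typical buck efficiency
buck_waste = (V_OUT * I_LOAD) / buck_eff - V_OUT * I_LOAD

print(f"linear: {linear_eff:.1%} efficient, {linear_waste:.1f} W wasted")
print(f"buck:   {buck_eff:.0%} efficient, {buck_waste:.2f} W wasted")
# linear: 27.5% efficient, 8.7 W wasted
# buck:   90% efficient, 0.37 W wasted
```

At any meaningful load, the linear regulator wastes over twenty times more power than a reasonable buck stage, which is why it seems like the more urgent target.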
- I understand that part of the claim is that, traditionally, all rails are converted directly from the mains (110V or 220V). But you can easily design a multi-stage converter that touches the mains only once and derives the minor rails from the 12V stage.
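The multi-stage idea above can be sketched with cascaded efficiencies (the stage efficiencies are assumed round numbers, not measurements of any real PSU):

```python
# Multi-stage conversion: convert the mains once to 12V, then derive
# 3.3V with a buck stage. Each stage's efficiency multiplies.

eta_mains_to_12v = 0.92  # assumed AC-DC stage efficiency
eta_buck = 0.92          # assumed DC-DC buck stage efficiency

eta_two_stage = eta_mains_to_12v * eta_buck
print(f"mains -> 12V -> 3.3V: {eta_two_stage:.1%}")
# mains -> 12V -> 3.3V: 84.6%
```

The cascade math is the same whether the second stage lives in the PSU box or on the motherboard, which again is the crux of the question.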
To summarize: we still need the 5V and 3.3V rails; we generate them either in the power supply "box" or on the motherboard; and we can use buck converters instead of linear regulators anywhere, any time. Why does moving the 5V and 3.3V conversion to a different place -- while still doing the same thing -- improve efficiency?