I keep hearing that CISC architectures consume more power than RISC architectures, and that this is why RISC architectures are used for low-power applications. I am skeptical: I think the supposed low power consumption of RISC could just be confirmation bias after seeing examples of low-power RISC architectures.
I am not sure whether there is a scientific basis for this. The real reason could also be the availability and low licensing cost of RISC architectures for embedded applications. The justification usually given for RISC consuming less power is that it needs a less complex instruction decoder than CISC.
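For concreteness, here is a tiny C example of the contrast I have in mind. The per-ISA instruction sequences in the comments are what compilers typically emit, not something I have measured; treat them as an illustrative assumption:

```c
#include <stdio.h>

/* A read-modify-write on memory: the classic CISC vs. RISC contrast.
 * x86 (CISC) can typically encode `*p += x` as a single instruction, e.g.
 *     add dword ptr [rdi], esi
 * whereas a load-store RISC ISA (e.g. RISC-V) needs three:
 *     lw t0, 0(a0); add t0, t0, a1; sw t0, 0(a0)
 * Decoding that one variable-length x86 instruction (and cracking it
 * into micro-ops) is, as I understand the argument, what makes the
 * CISC front end more complex. */
static void add_in_place(int *p, int x) {
    *p += x;
}

int main(void) {
    int counter = 40;
    add_in_place(&counter, 2);
    printf("%d\n", counter); /* prints 42 */
    return 0;
}
```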
Is this true? Do RISC architectures consume less power than CISC architectures? Or does CISC vs. RISC really not matter for power consumption?