First of all, I'm a hobbyist, so it's OK to roast me provided it comes with good insights!
I am prototyping an electronic apparatus made up of several parts:
- a microcontroller running on 5V (actually 3.3V, but it has its own 5V->3.3V regulator; negligible power draw);
- a strip of 150 RGB 5V LEDs. Maximum current draw per LED at full white brightness appears to be around 60mA, so the total is 150 * 60mA = 9A ==> 45W at 5V;
- a class D amplifier, 40W maximum at 12V (20+20W), so roughly 3.3A maximum; both loads are tallied in the quick sketch after this list.
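To make the numbers concrete, here is the quick tally I did. It's a throwaway Python sketch; the 60mA-per-LED and 40W amp figures are just the worst-case guesses from above, not measured values:

```python
# Quick worst-case tally of my loads (numbers from the parts list above).
LED_COUNT = 150
LED_MAX_CURRENT_A = 0.060      # ~60 mA per LED at full white (worst-case guess)
LED_VOLTAGE = 5.0

AMP_MAX_POWER_W = 40.0         # class D amp, 20 W + 20 W at 12 V
AMP_VOLTAGE = 12.0

led_current = LED_COUNT * LED_MAX_CURRENT_A          # 9.0 A on the 5 V rail
led_power = led_current * LED_VOLTAGE                # 45 W
amp_current = AMP_MAX_POWER_W / AMP_VOLTAGE          # ~3.33 A on the 12 V rail

print(f"LED strip: {led_current:.1f} A @ 5 V = {led_power:.0f} W")
print(f"Amplifier: {amp_current:.2f} A @ 12 V = {AMP_MAX_POWER_W:.0f} W")
print(f"Naive current sum: {led_current + amp_current:.2f} A (mixes 5 V and 12 V amps!)")
print(f"Power sum: {led_power + AMP_MAX_POWER_W:.0f} W")
```

The naive current sum (about 12.3A) is exactly what's confusing me, since it adds currents at two different voltages.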
At the moment it's all a mess of 3 different power supplies, and I want to simplify and streamline it.
I have a 12V 10A (120W) DC supply, whose output would be split between the amp at 12V and a 300W buck converter generating the 5V rail.
I am trying to figure out how much current I can actually safely draw from the PSU, as I don't really know how the buck converter behaves in this arrangement or how to approach it.
If I only had 12V loads, the answer would be easy: I could draw a maximum of 10A. But I have both 12V and 5V loads, and if I sum the maximum power and current requirements of every component, I am well below 120W BUT above those 10 amps.
What is the correct way of looking at it to calculate the maximum current draw / maximum current available to the amp and the LED strip?
Power? i.e., as long as the combined power draw of the amp and of the buck converter (feeding the strip and the MCU) stays under 120W, it won't burn my house down, even though the sum of the nominal load currents exceeds the PSU's 10A rating, because the buck converter trades voltage for current: it can deliver more than 10A at 5V on its output while drawing far less than that from the 12V input?
Or current? i.e., do I need to make sure that the combined current draw of all components does not exceed the PSU's 10A, regardless of voltage?
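For what it's worth, here is my back-of-the-envelope attempt at the power view, with everything referred back to the 12V side of the PSU. The 90% buck efficiency is purely my assumption (I don't have a datasheet figure), so take it as a sketch, not a claim:

```python
# My attempt at the "power" view: reflect the 5 V loads back to the 12 V
# input of the buck converter, then check the total 12 V current against the PSU.
PSU_VOLTAGE = 12.0
PSU_MAX_CURRENT_A = 10.0

BUCK_EFFICIENCY = 0.90         # assumption, not from a datasheet
LOAD_5V_POWER_W = 45.0         # LED strip worst case (MCU draw is negligible)
AMP_POWER_W = 40.0             # amp worst case, fed directly from 12 V

# Power the buck pulls from the 12 V rail to deliver 45 W at 5 V
buck_input_power = LOAD_5V_POWER_W / BUCK_EFFICIENCY       # ~50 W
buck_input_current = buck_input_power / PSU_VOLTAGE        # ~4.2 A at 12 V
amp_current = AMP_POWER_W / PSU_VOLTAGE                    # ~3.3 A at 12 V

total_12v_current = buck_input_current + amp_current
print(f"Buck input:   {buck_input_power:.1f} W -> {buck_input_current:.2f} A @ 12 V")
print(f"Amplifier:    {AMP_POWER_W:.1f} W -> {amp_current:.2f} A @ 12 V")
print(f"Total @ 12 V: {total_12v_current:.2f} A of the PSU's {PSU_MAX_CURRENT_A:.0f} A")
```

If this is the right way to account for the buck converter, I land around 7.5A on the 12V side, comfortably under 10A, but I'm not confident this is the correct way to do the bookkeeping.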
Thanks to anyone who makes sure I don't burn a hole in the floor with mad empirical experimentation.