I've started building some homemade computer kits recently (like the Maximite and the Micromite Companion boards), and they have barrel jacks that accept DC power supplies. Those supplies are easy to find, but they come in a lot of varieties in terms of output voltage and output current.
One computer I built recently recommended a 7.5 V power supply (though anything from 5 V to 9 V was deemed acceptable) with an output current of 1 A (or at the very least 500 mA).
I have a ton of old DC power supplies that can plug into these systems, but none of them exactly matches the designer's recommendations, and I'm not sure what's "safe" to use and what could potentially damage the system or make it unreliable.
The main questions I'm hoping to get help with here:
1. If a DC power supply is rated 7.5 V, what difference does it make how much current it can deliver? I understand each component in the system draws some current from the power supply, and that's why the current rating is clearly labelled on the supply, but then what difference does it make if 1 amp comes from a 5 V supply or a 12 V supply?
2. Is there any danger in plugging a 12 V power supply rated at 2.5 A into a system that is expecting a 7.5 V supply at 1 A?
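To make my confusion concrete, here's the back-of-the-envelope arithmetic I've been doing, just multiplying volts by amps to get watts, with the numbers from my question plugged in (the `power_watts` helper is just my own name for it):

```python
def power_watts(volts, amps):
    """Power in watts is voltage times current (P = V * I)."""
    return volts * amps

# What the designer recommends: 7.5 V at 1 A
recommended = power_watts(7.5, 1.0)   # 7.5 W

# One of my spare supplies: 12 V rated at 2.5 A
spare_supply = power_watts(12.0, 2.5)  # 30.0 W

print(recommended, spare_supply)
```

The spare supply can clearly deliver far more power than the board is specified for, and what I can't work out is whether that extra capability is harmless headroom or a hazard.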
Clearly I'm a newb on this subject, so if my questions don't make sense let me know and I'll try to clarify.
Please note: I'm not asking for a recommended power supply for a specific project. What I'm really after is a basic understanding of how voltage and current relate in a DC circuit. I've looked extensively online and of course there's a ton of information on this, but I'm still left confused and was hoping for some help here on this site.