I'm new to hardware and am coming from a software background. I have an SoC that will be powered by a supply providing 5V at up to 1A. I'm trying to figure out the safe ranges of voltage/current/resistance for the various components I'll be connecting to this SoC. For instance, say I want to add a transceiver to the SoC so that it can communicate with some other device:
- Does the transceiver need to be rated above or below or exactly at 5V?
- Does the transceiver need to be rated above or below or exactly at 1A?
- Are there any resistance (ohm) ranges or rules that I need to consider?
I guess I'm trying to learn how to make a decision like the following:
My power supply is 5V and 1A. I see that Transceiver A is rated for 3V and 500mA. Perhaps it's fine that it's only rated for 500mA when the supply can deliver 1A, but perhaps it's not fine that it's only rated for 3V when it will be receiving 5V. Hence, I would not want to use this particular transceiver.
Even if the logic above is wrong from an electrical standpoint, it underscores what I'm trying to understand: given a known power supply, how do a component's ratings (volts, amps, ohms) affect my decision about whether to use it?
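To make my mental model concrete, here's a rough sketch in Python of the decision rule I'm currently imagining (the `is_safe` helper and its two checks are my own guesses, and I suspect at least part of this is electrically wrong, which is exactly what I'd like corrected):

```python
def is_safe(supply_volts, supply_max_amps, part_rated_volts, part_max_draw_amps):
    """My tentative mental model of component/supply compatibility.

    Guess 1: the supply voltage must not exceed the part's voltage rating.
    Guess 2: the supply must be able to source at least as much current
             as the part can draw (i.e. the 1A is a capability, not a
             forced output -- is that right?).
    """
    voltage_ok = part_rated_volts >= supply_volts
    current_ok = supply_max_amps >= part_max_draw_amps
    return voltage_ok and current_ok

# Transceiver A from my example: rated 3V, draws up to 500mA
print(is_safe(5.0, 1.0, 3.0, 0.5))  # False under my model: 3V part, 5V rail
```

Is this roughly the right shape of reasoning, or am I misunderstanding what the voltage and current figures on a supply and a component actually mean?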