
I have bought a Sound Bomb from Maplin. On the bottom of the device it says:

Voltage (DC):  12v
Voltage Range: 6-15v
Current:       85mA

If I want to wire this up and operate it from, say, a 9 V battery, would I need to put any kind of resistor in the circuit? Also, is there a general rule for safely using components without breaking anything?

Shaun Wild

2 Answers


No need for a resistor. The general rules for these kinds of things are:

  • Don't exceed the rated voltage (in your case 12 V DC, or possibly 15 V DC; it is not entirely clear to me which).
  • Use a power supply that can provide at least the rated current (in your case 85 mA).

Edit: If this is the device in question, it seems it needs 12 V in. Whether it works with 9 V or not, the PDF does not say.
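
To put numbers on the 9 V question, here is a quick sanity-check sketch in Python. The 85 mA figure is taken from the label (the real draw at 9 V may differ), and the PP3 battery capacity is only an assumed ballpark figure, not something from the datasheet:

```python
# Rough check for running the sounder from a 9 V battery.
# Assumptions: the sounder still draws about its rated 85 mA at 9 V,
# and the battery is an alkaline PP3 with roughly 0.5 Ah of capacity.

rated_min_v = 6.0        # V, from the label ("Voltage Range: 6-15v")
rated_max_v = 15.0       # V
rated_current_a = 0.085  # A, from the label

supply_v = 9.0             # V, the proposed battery
battery_capacity_ah = 0.5  # Ah, assumed; varies a lot with chemistry and load

# Rule 1: the supply voltage must sit inside the rated range -> no resistor needed.
voltage_ok = rated_min_v <= supply_v <= rated_max_v
print(f"9 V within the 6-15 V range: {voltage_ok}")

# Rough power draw and battery life (ignores the battery voltage sagging as it drains).
print(f"Approximate power draw: {supply_v * rated_current_a:.2f} W")
print(f"Very rough runtime on one battery: {battery_capacity_ah / rated_current_a:.1f} h")
```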

Dampmaskin
  • If I have a 1A 12V power supply and a 12V 85mA device, is the power supply going to fry the device? – Shaun Wild May 13 '16 at 12:25
  • 85 mA is a very small load for a 1A power supply. If the power supply uses a transformer, you should be OK. If it's a PWM power supply, I guess it will depend on the quality of the power supply. – Dampmaskin May 13 '16 at 12:27
  • I'm quite ignorant in the field of electronics but I'm getting an understanding of how things work now. So does the current rating on a power supply say "can produce" or "will produce", or does it depend on something else? – Shaun Wild May 13 '16 at 12:30
  • @ShaunWild: [Look here for power supply selection info.](http://electronics.stackexchange.com/questions/34745/choosing-power-supply-how-to-get-the-voltage-and-current-ratings) – JRE May 13 '16 at 12:30
  • @ShaunWild: The current rating of a power supply means the maximum current that the power supply "can produce". The voltage rating means the voltage that the power supply "will produce" as long as the device being powered doesn't exceed the current rating. In short: The power supply decides the voltage. The device being powered decides the current (see the sketch just after these comments). – Dampmaskin May 13 '16 at 12:34
  • @JRE that was an extremely useful answer. Thank you very much for linking it! – Shaun Wild May 13 '16 at 12:39
  • @ShaunWild. That's sort of the point of this stackexchange - home for the answers people have found useful and which will continue to be useful. – JRE May 13 '16 at 12:40
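
Dampmaskin's comment about "can produce" vs. "will produce" can be illustrated with a minimal sketch. The sounder is crudely modelled here as a fixed resistor, which a real siren is not; 141 Ω is simply the value that draws 85 mA at 12 V, so treat this only as a cause-and-effect illustration:

```python
# "The power supply decides the voltage. The device decides the current."
# The sounder is modelled as a plain resistor only for illustration;
# ~141 ohm is the value that pulls 85 mA from 12 V.

DEVICE_RESISTANCE_OHM = 12.0 / 0.085  # ~141 ohm, hypothetical equivalent load

def current_drawn(supply_voltage_v, supply_max_current_a):
    """Current the load pulls; the supply's current rating is only a ceiling."""
    i = supply_voltage_v / DEVICE_RESISTANCE_OHM  # Ohm's law: the load sets the draw
    if i > supply_max_current_a:
        raise ValueError("load wants more current than the supply can deliver")
    return i

# A 1 A supply does not push 1 A into the device; the device still takes ~85 mA.
print(f"{current_drawn(12.0, 1.0) * 1000:.0f} mA from a 12 V / 1 A supply")
print(f"{current_drawn(12.0, 0.1) * 1000:.0f} mA from a 12 V / 100 mA supply")
```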

The "power supply vs. powered device" problem can be simplified this way:

The rated voltage is the device's concern: the device must be able to withstand the applied voltage, so the power supply must not exceed it. If it is exceeded, the device can fail (the manufacturer does not guarantee correct operation). The supply voltage should also not drop below the minimum rated voltage; this usually does not fry the device, but the manufacturer does not guarantee correct operation at lower voltages.

The rated current is the power supply's concern. The device draws current from the power supply, so the power supply must be able to provide at least that much current; if the load exceeds what the supply can deliver, the supply can fail. The supply's current rating should therefore be at least equal to the device's. If the ratings are equal, the supply will run at 100% load, so it is better to pick a supply with a higher current rating so that it runs at a lower load and lasts longer.

In your case the answer is no: you do not need a resistor.
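
The two rules can be turned into a small compatibility check. This is only a sketch; the function name and the 80% "headroom" threshold are illustrative choices, not figures from the answer or a datasheet:

```python
# Check a power supply against a device's ratings using the two rules above.
# The 80% load threshold is an arbitrary illustrative figure.

def check_pairing(device_min_v, device_max_v, device_current_a,
                  supply_v, supply_max_current_a):
    """Return a list of problems; an empty list means the pairing looks fine."""
    problems = []
    # Rule 1: the supply sets the voltage; it must stay inside the device's rated range.
    if supply_v > device_max_v:
        problems.append("supply voltage exceeds the device's maximum rating")
    elif supply_v < device_min_v:
        problems.append("supply voltage is below the device's minimum rating")
    # Rule 2: the device sets the current; the supply must be able to source it.
    if device_current_a > supply_max_current_a:
        problems.append("device draws more current than the supply can provide")
    elif device_current_a > 0.8 * supply_max_current_a:
        problems.append("supply would run near full load; a larger supply would last longer")
    return problems

# The sounder (6-15 V, 85 mA) on a 9 V battery assumed able to source a few hundred mA:
print(check_pairing(6.0, 15.0, 0.085, 9.0, 0.3) or "OK, no series resistor needed")
```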

Darko