
I'm designing a heater to run off a 12 V battery at a field weather station. As I have no formal electrical training, I'm having trouble putting together an effective power budget. Note: this is similar to other questions posed on this Stack Exchange, but I haven't found anything that quite answers my question. I also know that I could measure these values once my circuit is assembled, but I'm hoping to avoid the cost of buying the components if I can work out the answer beforehand.

Heat will be supplied by a resistive element designed to keep household pipes thawed. The element is designed to run off mains power, i.e. 120 VAC, and consumes 7 W/ft in that configuration. By my calculations, this means (7 W / 120 V =) 58.3 mA/ft of current. According to the best answer on this thread, there should be no problem switching to DC input, but when I switch to battery power, I don't know how the power/current draw will change. I envision two cases:

Case 1: Element draws same power at lower voltage, requiring 10X higher current.

7 W / 12 V = 583 mA draw

Case 2: Element draws same current at lower voltage, giving 1/10 power output.

12 V * 58.3 mA = 0.7 W

I need to know how this will behave so that I can budget battery power effectively. If the heater draws the same power at a higher current cost, my batteries will need more powerful/frequent recharging. If the heater draws lower power, my batteries will survive longer, but I'm assuming that this will come at the expense of poorer heating ability.
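
For reference, here's a quick Python sketch of both cases, using only the numbers quoted above (nothing here is measured):

```python
# Rated figures for the heat tape (per foot), from the label
rated_power = 7.0       # W/ft at 120 VAC
rated_voltage = 120.0   # V
battery_voltage = 12.0  # V

rated_current = rated_power / rated_voltage    # ~0.0583 A/ft

# Case 1: same power at 12 V, so a higher current
case1_current = rated_power / battery_voltage  # ~0.583 A/ft

# Case 2: same current at 12 V, so a lower power
case2_power = battery_voltage * rated_current  # ~0.7 W/ft

print(f"rated current:  {rated_current * 1000:.1f} mA/ft")
print(f"case 1 current: {case1_current * 1000:.0f} mA/ft")
print(f"case 2 power:   {case2_power:.2f} W/ft")
```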

UPDATE SEPT 27: While searching for a 12 V DC heat tape, I found the following link: http://www.oemheaters.com/t-dc-powered.aspx, which walks through exactly this question and provides a formula for the power a 120 V device will consume when run on 12 V. The formula is as follows:

actual P = rated P * (applied V)^2 / (rated V)^2

When you plug in my numbers:

actual P = 7 W * (12 V)^2 / (120 V)^2 = 0.07 W, or 1/100 of the rated power, as answered below...
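
And a one-line check of that formula in Python, with the same numbers:

```python
rated_power, rated_voltage, applied_voltage = 7.0, 120.0, 12.0

# actual P = rated P * (applied V / rated V)^2
actual_power = rated_power * (applied_voltage / rated_voltage) ** 2
print(f"actual power at 12 V: {actual_power:.2f} W")  # 0.07 W, i.e. 1/100 of rated
```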

  • Do you already have the 120V heating element? It looks like there are some 12V ones available elsewhere. – Justin Sep 23 '16 at 17:20
  • Hi Justin, thanks for the advice! I'll take a look on Google... didn't see any at "Home Depot", but I can return the 120 V version and order a 12 V. As an aside, one of the main issues that I have with these systems in general is that they are built to automatically switch on/off when the temperature is below 0C, but for my application, I need a more sophisticated trigger, so need to build my own sensor/relay. I need to see if I can find JUST the resistive wire, and I guess it must be rated for 12 V. How hard can that be? – Keegan Smith Sep 23 '16 at 18:10

2 Answers


Actually, it will draw 1/10 of the current at 1/10 voltage, producing 1/100 (1%) of the original power and heat!

This is because of Ohm's law: voltage = current * resistance (E = IR). Rearranging, I = E/R. Because the voltage E is 1/10 of the original while the resistance R is constant, I is 1/10 of the original.

Power = voltage * current. Because both voltage and current are 1/10 of the original value, power is 1/100 of the original value.

Your heater will produce 1% of the heat at 12V. Sorry.
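
To put numbers on it, a minimal sketch using the figures from your question:

```python
rated_voltage, rated_power = 120.0, 7.0
applied_voltage = 12.0

voltage_ratio = applied_voltage / rated_voltage  # 1/10
current_ratio = voltage_ratio                    # I = E/R with R fixed, so current scales with voltage
power_ratio = voltage_ratio * current_ratio      # P = E*I, so power scales with the square

print(f"power at 12 V: {rated_power * power_ratio:.2f} W ({power_ratio:.0%} of rated)")
```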

DoxyLover

When running from 120 V RMS (AC), if it consumes 7 watts then the current drawn is 58.3 mA, which implies the element has a resistance of 120 / 0.0583 ≈ 2058 ohms.

If you attached this to 12 V, the current flow would be 12 / 2058 ≈ 5.83 mA and the power taken would be 0.07 watts.
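
The same arithmetic as a small Python sketch (assuming the resistance stays at its 120 V value):

```python
rated_voltage, rated_power = 120.0, 7.0
applied_voltage = 12.0

resistance = rated_voltage ** 2 / rated_power  # ~2058 ohms (same as 120 / 0.0583)
current = applied_voltage / resistance         # ~5.83 mA
power = applied_voltage * current              # ~0.07 W

print(f"R = {resistance:.0f} ohm, I = {current * 1000:.2f} mA, P = {power * 1000:.0f} mW")
```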

No, running it from a lower supply voltage will come nowhere near producing the intended power. One slight salvation is that at this much lower power delivery the element stays cooler, so its resistance may be significantly less; that might get you a couple of hundred milliwatts, but still miles from 7 watts.
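
To illustrate the point about cold resistance (the 30% reduction below is a made-up placeholder, not a measured value for your tape):

```python
hot_resistance = 120.0 ** 2 / 7.0       # ~2057 ohms, inferred from the 120 V / 7 W rating
cold_resistance = hot_resistance * 0.7  # hypothetical: suppose 30% lower when barely warm
applied_voltage = 12.0

power_cold = applied_voltage ** 2 / cold_resistance
print(f"power at 12 V with a lower (cold) resistance: {power_cold * 1000:.0f} mW")  # ~100 mW, still << 7 W
```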

Andy aka
  • This answer and @DoxyLover's were both great, thank you. I now understand - resistance is the variable that is unchanged, so that reducing your voltage will also reduce your current, thus reducing your power. I'm thinking that Justin's solution (buy a 12V heater) makes the most sense, but if I have trouble sourcing one that I can adapt to my purposes, then I suppose that I could also run the heater off of an inverter. – Keegan Smith Sep 23 '16 at 18:20
  • When it comes to heating elements, the cold resistance is usually much lower than the hot resistance, so in fact the resistance does change, as I tried to point out in my answer. – Andy aka Sep 23 '16 at 18:23
  • Thanks for clarifying this, Andy. If I understand you correctly, you're saying that, because resistance increases with conductor temperature, and because of the reduced heating at lower power, the wire's resistance will not increase as much at 12V as it would have at 120V, giving me better efficiency? But *if holding temperature constant*, when changing the voltage, the current must change by the same factor. And then, as a follow-on, because P=(I^2)R, the power is reduced dramatically. – Keegan Smith Sep 23 '16 at 18:47
  • The temperature won't hold constant because the element can't deliver anything like 7 watts. The 70 mW figure assumes the resistance is constant, but at lower temperatures the resistance is likely to be lower, so the element would take a few hundred mW.