1

UPDATED (more clarification): Essentially, what I am trying to do could be called a "SuperCap-based UPS". In other words, I want to replace the battery in a DC-to-AC inverter circuit with a supercapacitor to provide temporary 120 V AC power to an appliance that draws about 0.08 A, or roughly 10 W, at 120 VAC.

Will I get more runtime by charging up a large 1 farad cap at 12 V, or would it be better to store the charge in a cap rated for 120 V and instead have the inverter convert from 120 VDC to 120 VAC?

I'm trying to provide as much run time as possible to the end device (which obviously uses 120 VAC).

So what's the most efficient way to get the most runtime at 120 VAC from a capacitor-powered inverter solution?

LCS
  • 1
    Wait... You are trying to back up AC with a capacitor? Not going to happen. You need an inverter. – R Drast Nov 10 '15 at 16:31
  • Also, 10000 ms = 10 s, not 1 s. – Fizz Nov 10 '15 at 16:33
  • I should have mentioned that what I am trying to do is avoid the use of a battery and instead place some supercaps into an inverter circuit. I know that there are supercap-based UPS solutions out there. For example: http://www.electroschematics.com/7032/12v-to-120v-voltage-inverter/ – LCS Nov 10 '15 at 17:10
  • I think it will be much easier to design the inverter if you have a high-voltage repository available. I don't normally work with these voltage or power levels, so that is all I will say at the moment. – user57037 Nov 10 '15 at 17:37
  • Why do you want to use a cap instead of a battery? – jippie Nov 10 '15 at 19:06
  • The solution is intended to provide power for a few seconds at most, and adding a battery means adding unnecessary bulk as well as complexity in the form of a battery-charging circuit. While it is understood that batteries have far higher energy density than caps, they are also more expensive. – LCS Nov 10 '15 at 19:24

3 Answers

1

Let's say your "inverter" works just like a battery-backed UPS, and that it has a pretty wide margin on its storage voltage, going from 14 V down to 7 V before it taps out.

That means that, in the case of the low-voltage option, your main capacitor can only drop 7 V.

The voltage drop across a capacitor is the integral over time of the current drawn from it, which becomes a fiddly bit of differential-equation magic once you add in a 10 W constant load and a variable converter efficiency. So I'm going to roughly ball-park it, because I'm lazy, it's evening, and I have a million things to do.

(Ref: the Wikipedia page on capacitors, at the section giving the current-voltage relation.)
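Concretely, the relation being ball-parked is the standard capacitor law (nothing project-specific here):

\$\Delta V = \frac{1}{C}\int_0^{t} i(\tau)\,\mathrm{d}\tau\$

Once the current is assumed constant, this collapses to the linear form used in step 3 below.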

How will I do this?

  1. Coarsely take 80% efficiency for a converter going from 12 VDC to 120 VAC (which may be seriously overestimating it in a DIY scenario, to be honest).
  2. Estimate the current draw to be constant, calculated at a capacitor voltage of 9.5 V rather than the exact average (a voltage I drew from my large hat of "that'll probably do"). For other use cases, you can take the actual average, since the constant-current assumption will be off anyway.
  3. Simplify the integral for constant current, which then becomes a simple linear equation: ΔV = (I · t) / C.

So, the current from the capacitor will be:

I = (10 W / 0.8 [= efficiency]) / 9.5 V ≈ 1.32 A

Which can then be put into the simplified linear equation under the assumption of constant current (be aware, this is a very broad and lazy assumption):

ΔV ≈ (1.32 A · t) / C

Let's say you want only ten seconds of power; with the known allowable voltage drop of 7 V across the capacitor, that becomes:

7 V ≈ (1.32 A · 10 s) / C

Which becomes:

C ≈ 13.2 As / 7 V ≈ 1.88 F
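A quick sketch of that arithmetic in Python, for anyone who wants to tweak the numbers. Everything in it (the 10 W load, 80% efficiency, 9.5 V estimation point, 7 V allowable drop and 10 s target) is just the set of assumptions listed above, not measured data:

```python
# Constant-current ball-park for the 12 V supercap option.
# All figures are the assumptions from the text above, not measurements.

P_LOAD = 10.0    # W, appliance draw at 120 VAC
EFF = 0.8        # assumed converter efficiency
V_EST = 9.5      # V, capacitor voltage at which current is treated as constant
V_DROP = 7.0     # V, allowable drop (14 V down to 7 V)
T_RUN = 10.0     # s, desired hold-up time

i_cap = P_LOAD / EFF / V_EST       # current drawn from the capacitor, ~1.32 A
c_needed = i_cap * T_RUN / V_DROP  # from dV = I*t/C, so C = I*t/dV

print(f"capacitor current  ~ {i_cap:.2f} A")
print(f"capacitance needed ~ {c_needed:.2f} F")   # ~1.88 F
```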


Let's quickly do that for 120VDC as well:

Same assumptions, but the voltage range will probably be 80 V to 120 V, so a drop of 40 V is allowable, estimating the current as constant at the 90 V point:

I = 10 W / 0.8 / 90 V ≈ 139 mA

With t = 10 s:

40 V ≈ 1.39 As / C

C ≈ 1.39 As / 40 V ≈ 35 mF → charged up to 120 V = very, very lethal.
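The same ball-park in Python for the 120 VDC option; the 80 V to 120 V window and the 90 V estimation point are the guesses above:

```python
# Constant-current ball-park for the 120 V bus option, same method as above.

P_LOAD = 10.0    # W
EFF = 0.8        # assumed converter efficiency
V_EST = 90.0     # V, capacitor voltage at which current is treated as constant
V_DROP = 40.0    # V, allowable drop (120 V down to 80 V)
T_RUN = 10.0     # s

i_cap = P_LOAD / EFF / V_EST       # ~0.139 A = 139 mA
c_needed = i_cap * T_RUN / V_DROP  # ~0.035 F = 35 mF

print(f"capacitor current  ~ {i_cap * 1000:.0f} mA")
print(f"capacitance needed ~ {c_needed * 1000:.0f} mF")
```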

So, you see, I've already made a lot of assumptions about all the stuff you're not telling us about your project, and about how you will personally be able to complete the electronics, and it's still a fair amount of calculation, even though I made a very bad and broad assumption of constant current.

The final choice will depend on pinning down all the parameters, and some of them will intertwine. There's no one-size-fits-all solution to that, and that's what makes electronics design a difficult field.

This is just your very first, very broad ball-park. But to be honest, regarding "very, very lethal": if you are asking this question, I don't really think you should be considering anything above 30 VDC for storing this kind of energy.

Asmyldof
  • Brilliant, thank you, that is quite instructive! A 1.88 F cap is too large in terms of physical dimensions for my project. The question then becomes: what voltage rating of cap is best for the most efficient inverter design to get to 120 VAC? Is it a 120 V cap, or something else, given that 120 VAC is an RMS measurement and the peak is 170 V? – LCS Nov 10 '15 at 19:51
  • I asked the same question as in my comment above, in a different way, [here](http://electronics.stackexchange.com/questions/200186/inverter-design-what-dc-input-voltage-for-most-efficient-dc-to-120vac-inverter) – LCS Nov 11 '15 at 15:45
0

As others have said, a capacitor alone will not work. What may work is a ferro-resonant transformer or other AC voltage stabilization device. "Ferros" are nice because they are simple; however, they can be rather lossy in terms of power.

Another option is an AC motor/generator with a flywheel attached, which normally runs at no load (converting electrical energy into rotational energy). When the supply power is removed, the rotating mass generates similar AC for a short period.

A third solution would be a battery-backed UPS. A very small one would suffice for 10 W.

rdtsc
  • Thanks for your feedback; the use of a flywheel to store energy is certainly a creative approach. A battery-backed UPS is great, but I am trying to make my own cap-based solution to keep the size very small and integrate the capability into an existing circuit. Thanks again for your help. – LCS Nov 10 '15 at 17:46
  • 1
    If you are trying to integrate into an existing circuit, the 120VAC input is probably the wrong place to be applying this backup. Design it for the actual load, instead. – Chris Stratton Nov 10 '15 at 18:03
0

You're going to have a couple of problems with this approach. The biggest is that your inverter stage can only run down to a certain minimum input voltage. It (or the capacitor) will also have a maximum voltage it can run at. That voltage range, along with the capacitance, defines your available energy:

\$E = \frac{1}{2}C\left(V_{max}^2 - V_{min}^2\right)\$

Throw in maybe 10% efficiency losses, and that available energy divided by your power requirements should give you your possible run time.
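As a rough numeric illustration in Python (the 1.88 F and the 7 V to 14 V window are borrowed from the other answer's example, and the 90% efficiency is just the "maybe 10% losses" above):

```python
# Run-time estimate from the usable energy window of the capacitor.
# Example numbers only, borrowed from the discussion above.

C = 1.88        # F, capacitance from the other answer's example
V_MAX = 14.0    # V, fully charged
V_MIN = 7.0     # V, inverter drop-out voltage
P_LOAD = 10.0   # W, load power
EFF = 0.9       # assumed converter efficiency ("maybe 10% losses")

usable_energy = 0.5 * C * (V_MAX**2 - V_MIN**2)  # joules available between V_MAX and V_MIN
run_time = usable_energy * EFF / P_LOAD          # seconds of hold-up

print(f"usable energy ~ {usable_energy:.0f} J, run time ~ {run_time:.1f} s")
```

For a constant-power load this energy-window figure is exact (up to the efficiency guess), which is why it comes out a little higher than the constant-current ball-park in the other answer.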

Stephen Collings
  • Thank you. How does the formula you provided relate to the formula in [this solution](http://electronics.stackexchange.com/a/18696/91381), which I found recently, specifically in regards to exponential discharge (and decreased current draw at decreasing voltage)? – LCS Nov 10 '15 at 17:48
  • @user3416730 Exponential decay only applies to capacitors with resistive loads. You have a constant power load, so your current draw will increase at reduced voltage. The first half of that answer is more applicable to you than the second half. – Stephen Collings Nov 10 '15 at 19:13