I want to build a simple V-I (curve) tester. Based on this article (or this), I built this simple circuit:
R1 = 560Ω
R2 = 100Ω
R3 = 1kΩ
where R1 and R2 form a voltage divider to set \$V_{DUT}+V_{R_3}=1\,\text{VAC}\$, and R3 limits the current through the Device Under Test (DUT).
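As a quick sanity check of the divider values above (a sketch assuming an ideal 6.3 VAC secondary and neglecting the transformer's own output impedance):

```python
# Sanity check of the R1/R2 divider (assumed ideal 6.3 VAC RMS source).
V_IN = 6.3   # transformer secondary, VAC RMS
R1 = 560.0   # ohms, top of divider
R2 = 100.0   # ohms, bottom of divider

# Unloaded divider output, which drives the R3 + DUT branch
v_out = V_IN * R2 / (R1 + R2)

# Thevenin (output) impedance seen by the R3 + DUT branch: R1 || R2
r_th = R1 * R2 / (R1 + R2)

print(f"V_out = {v_out:.2f} VAC, R_th = {r_th:.1f} ohm")
# -> V_out = 0.95 VAC, R_th = 84.8 ohm
```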
I used this transformer, which has a 6.3 V, 10 VA output (it has dual secondaries, but I use only one).
My question:
- If I change the voltage divider resistors to \$R_1=56\,Ω\$ and \$R_2=10\,Ω\$ (the latter especially), isn't that better? With 10 Ω I reduce the output impedance of the circuit that feeds the DUT, so the load doesn't pull the divider down: R2 is in parallel with \$R_3+R_{DUT}\ge 1\,kΩ\$, and therefore \$R_3+R_{DUT}\gg R_2\$, isn't it?
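A minimal sketch of the loading comparison this question is about, assuming the worst case of a shorted DUT so the load across R2 is just \$R_3 = 1\,kΩ\$:

```python
def loaded_vout(v_in, r1, r2, r_load):
    """Divider output with r_load (= R3 + R_DUT) in parallel with R2."""
    r2_eff = r2 * r_load / (r2 + r_load)   # R2 || load
    return v_in * r2_eff / (r1 + r2_eff)

V_IN = 6.3       # VAC RMS, assumed ideal secondary
R_LOAD = 1000.0  # worst case: DUT shorted, load = R3 = 1 kohm

for r1, r2 in [(560.0, 100.0), (56.0, 10.0)]:
    v = loaded_vout(V_IN, r1, r2, R_LOAD)
    print(f"R1={r1:.0f}, R2={r2:.0f} -> loaded V_out = {v:.3f} V")
# -> R1=560, R2=100 -> loaded V_out = 0.880 V
# -> R1=56, R2=10 -> loaded V_out = 0.947 V
```

Both dividers give about 0.95 VAC unloaded, but the 56 Ω / 10 Ω version sags only about 1% under load, versus about 8% for 560 Ω / 100 Ω.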
BTW, the current I draw is \$I=\frac{6.3}{66}\approx 100\,mA\$, and the apparent power is at the "safe" value of \$6.3\,V\cdot 100\,mA = 0.63\,VA\$. With this voltage divider I have a voltage of less than 1 VAC across the DUT. The article says that I can also test diodes, Zeners, etc. How is that possible with such a small voltage?
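The current and power figures above can be sketched like this (assuming the proposed 56 Ω + 10 Ω divider and neglecting the DUT branch, since \$R_3+R_{DUT}\ge 1\,kΩ\$):

```python
# Transformer loading check for the proposed divider (56 ohm + 10 ohm).
# Assumption: the R3 + DUT branch draws comparatively little, so the
# divider string dominates the secondary current.
V_SEC = 6.3          # secondary voltage, VAC RMS
R1, R2 = 56.0, 10.0  # proposed divider

i_divider = V_SEC / (R1 + R2)   # current through the divider string
s_drawn = V_SEC * i_divider     # apparent power drawn from the secondary

print(f"I = {i_divider * 1000:.0f} mA, S = {s_drawn:.2f} VA (rated 10 VA)")
# -> I = 95 mA, S = 0.60 VA (rated 10 VA)
```

So the draw is roughly 95 mA (about 100 mA rounded) and well under the transformer's 10 VA rating.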