So, I just blew the fuse in my 17" monitor using a universal adapter with these settings:
Input: 100-240 V 1.5 A (1.5 A) 50/60 Hz
Output: 15/16/18.5/19.5/20/22/24 V (max 70 W)
I would like some clarification on exactly how to compute the wattage for a device such as a monitor when using a universal adapter with the above settings/selections. On the back of the monitor, the rating is 12 V 4.16 A.
What I typically do to calculate power is multiply the voltage and current:
**12 V × 4.16 A => 49.92 W**
Now, to set the adapter for the monitor, I chose the 24 V setting and, assuming the current to be 1.5 A as written on the back of the adapter, computed:
**24 V × 1.5 A => 36 W**
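The two multiplications above can be checked with a quick script (a minimal sketch of the P = V × I arithmetic; the variable names are my own):

```python
# Power = Voltage x Current, using the values from the question
monitor_v, monitor_a = 12, 4.16      # rating on the back of the monitor
monitor_w = monitor_v * monitor_a    # power the monitor is rated to draw

adapter_v, adapter_a = 24, 1.5       # chosen output setting, labeled input current
adapter_w = adapter_v * adapter_a    # power estimate for that setting

print(f"monitor needs {monitor_w:.2f} W, adapter setting gives {adapter_w:.2f} W")
```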
Since that is lower than the wattage I computed for the monitor, I decided to try it anyway. But, as I said, I heard a pop and some smoke came out of my monitor. I would appreciate some help in understanding my error here, so I can be more careful next time I use a universal adapter like this one.