I'm aware that my phone might only demand a maximum of 1 A. Let's suppose it does. Still, the following is strange to me:
How is it possible, with two 20 AWG cables each capable of 2 A, for one to give my phone 0.4 A and the other 0.7-0.9 A?
I have two USB cables, from different companies. Both are 20 AWG.
I tested them with my 2 A capable charger and a USB load tester (one that displays volts and amps and lets me adjust the current it draws), and both cables can carry 2 A.
Putting the load tester aside and using my phone instead, I take my 2 A capable charger, my USB volt/amp meter, and my phone.
I plug one cable into my phone and the USB volt/amp meter reads 0.4 A.
Then I change the cable.
With the other cable plugged into my phone, the USB volt/amp meter reads 0.7 A.
I've tested charging my phone in two locations (one location has one charger and USB monitor, the other has a different charger and USB volt/amp monitor), so I've tried two different USB volt/amp meters and two different chargers. I've also seen my phone charge much faster with the cable that shows 0.7 A, so I believe the readings.
How is this possible? What could it be about the cables that causes this, and what device could measure and show the difference?
When buying cables I already checked that they're 20 AWG (not 28 AWG), and I have the load tester. I'm happy to get other equipment to get to the bottom of this. What else could explain the difference, and what other devices might I need to confirm that that is the cause?
Also, the cable that shows 0.4 A is actually shorter than the one that shows 0.7 A, so it's not a length issue.
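For what it's worth, here's my rough arithmetic on why I don't think conductor resistance alone should matter. This is only an estimate, assuming both cables are roughly 1 m long (I haven't stated their exact lengths) and using the commonly quoted ~33 mΩ/m for 20 AWG copper:

```latex
% Rough Ohm's-law estimate, not a measurement.
% Assumptions: ~1 m cable, ~33 mOhm per metre for 20 AWG copper,
% and a factor of 2 for the round trip through the VBUS and GND conductors.
R_{\text{cable}} \approx 2 \times 0.033\,\tfrac{\Omega}{\text{m}} \times 1\,\text{m} \approx 0.066\,\Omega,
\qquad
\Delta V \approx R_{\text{cable}} \times 1\,\text{A} \approx 66\,\text{mV}
```

A drop of a few tens of millivolts out of 5 V doesn't seem anywhere near enough to explain a 0.4 A versus 0.7-0.9 A difference, which is exactly what puzzles me.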