
It was my understanding that USB was standardized at 5 volts (V) and 1 ampere (A), but I've seen wall adapters rated 5 V at various currents, including 1 A, 1.1 A, 1.5 A, 2 A, 2.1 A, and as much as 3.3 A. Some of them had extra USB ports, but not all of them did. Does a higher current rating increase charge speed? And if so, is there a standard tolerance for devices that charge via USB? For example, is 5V@2.1A "OK" for charging most tablets or phones, but >3.0A@5V could cause damage?

In addition, if current is set by voltage and load, don't many computers pull more depending on what is being done? I don't mean just no-operations (NOPs) versus addressing; whole sections of the machine might be off or in standby when not needed. Would this mean the wall adapter is rated by an average, not by what it actually puts out?

user2068060

1 Answer


Some USB devices can use more current to charge more quickly.

Some USB devices require more current than the USB 2.0 standard provides.

A standard USB 2.0 port supplies at most 0.5 A.

You won't damage your device with a higher-rated charger; the charger only supplies as much current as the device draws. See this question for more detail.
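As a rough illustration of why the charger's rating is only a ceiling, here is a minimal sketch (the 5-ohm load and the ratings are made-up example values, not measurements of any real device):

```python
# Minimal sketch: the device's effective load resistance sets the current
# it draws (Ohm's law, I = V / R); the charger rating is only a ceiling.
# All values below are illustrative.

BUS_VOLTAGE = 5.0  # USB bus voltage in volts

def current_drawn(load_resistance_ohms: float) -> float:
    """Current the device pulls at 5 V for a given effective load."""
    return BUS_VOLTAGE / load_resistance_ohms

def charger_can_supply(charger_rating_amps: float, load_resistance_ohms: float) -> bool:
    """True if the charger's rating covers what the device asks for."""
    return current_drawn(load_resistance_ohms) <= charger_rating_amps

# A phone presenting an effective 5-ohm load draws 1 A,
# whether the charger is rated 1 A, 2.1 A, or 3.3 A.
print(current_drawn(5.0))               # 1.0 (amps)
print(charger_can_supply(2.1, 5.0))     # True  - a 2.1 A charger easily covers 1 A
print(charger_can_supply(0.5, 5.0))     # False - a 0.5 A port can't keep up
```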

Brian Carlton
    If it only draws 1A, but the charger is 2.1A, does the rest go to heat? – user2068060 Jun 18 '13 at 19:13
  • The wall adapter is rated by how much current it is capable of providing, but it can also provide less. If the device only draws 1A, then it will only draw 1A from the wall adapter - the wall adapter won't force the device to use more power than it wants to consume. Similarly, the wall outlet you plugged the adapter into is probably capable of supplying 15 - 20 amps of current @ 120VAC, but the wall adapter likely draws less than 0.1 amp even though the outlet is capable of supplying much more current. – Johnny Jun 18 '13 at 19:37
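To make the numbers in that comment concrete, a rough back-of-the-envelope sketch (the 80% conversion efficiency is an assumed figure, not from the thread):

```python
# Rough sanity check of the wall-adapter numbers in the comment above.
# The 0.8 conversion efficiency is an assumption for illustration.

output_watts = 5.0 * 1.0              # 5 V at 1 A on the USB side = 5 W
efficiency = 0.8                      # assumed adapter efficiency
input_watts = output_watts / efficiency
mains_current = input_watts / 120.0   # current drawn from a 120 VAC outlet

print(round(mains_current, 3))        # ~0.052 A, well under the outlet's 15-20 A capacity
```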