So it was my understanding that USB was standardized at 5 volts (V) and 1 amp (A), but I've seen wall adaptors rated 5 V at various currents, including 1 A, 1.1 A, 1.5 A, 2 A, 2.1 A, and as much as 3.3 A. Some of them had extra USB ports, but not all of them did. Does a higher current rating increase charging speed? And if so, is there a standard tolerance for devices that charge via USB? For example, is 5 V @ 2.1 A "OK" for charging most tablets or phones, while >3.0 A @ 5 V could cause damage?
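To put rough numbers on what I mean (my own back-of-the-envelope arithmetic, assuming a device actually draws the full rated current): power is P = V × I, so 5 V × 1 A = 5 W, while 5 V × 2.1 A = 10.5 W, which would be roughly twice the power available for charging, if that's how it works.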
In addition, if the current is set by the voltage and the load, don't many computers pull more or less depending on what they're doing? I don't just mean no-operation instructions (NOPs) versus memory addressing; whole sections of the machine might be off or on standby when not needed. Would this mean the wall adaptor is rated for an average, rather than what it actually puts out at any given moment?
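My mental model here, which may be wrong, is Ohm's law with the device treated as an effective resistance R: I = V / R, so a device that looks like a 5 Ω load at 5 V would draw 5 V / 5 Ω = 1 A regardless of whether the adaptor is rated 1 A or 3 A, and that draw would change as parts of the machine power up or down. I'd appreciate being corrected if that picture doesn't apply to USB charging.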