
I understand that many of my USB-charged consumer devices can charge at rates higher than 0.5A. However, in testing, I've found they don't draw more than this.

Test Setup

  • I have a power supply that outputs between 0 and 30 volts, and 0 to 20 amps.
  • I set the voltage to 5 volts, then wired up a female USB connector to the supply leads correctly.
  • When I plug in a Samsung Galaxy S4 Mini, it pulls around 0.44 amps. It is the same with a Samsung Galaxy Tab Pro.
  • However, when I connect an iPad Air with an official lightning cable, it only pulls around 0.11 amps.
  • Also, I tried connecting a ZGPAX S28 smartwatch and it was still about 0.44-0.45 amps.

The power supply is more than capable of outputting the maximum 2.1 amps, so why doesn't it supply that much, at least to the tablets?

What do I need to do in my test setup to convince the devices to consume their maximum charging current?

joshglen
  • [Read this.](http://electronics.stackexchange.com/q/34745/7036) But first ask yourself: why do you want to force-feed 20A to a tablet? – Nick Alexeev Jun 28 '15 at 19:53
  • The max charge current is negotiated by resistors on the USB data lines and limited by the charge-controller IC in the device. – pjc50 Jun 28 '15 at 20:00
  • I'm not trying to force-feed 20 amps! I'm trying to feed 2.1 amps, but the tablet is not accepting it from my power supply even though it would from a real Apple or Samsung power supply. – joshglen Jun 28 '15 at 20:18
  • In addition to the charge-rate negotiation that needs to occur, you also need wires capable of supporting it - most USB cables are 28 gauge and, depending on the length, will have a considerable voltage drop across them. – user2813274 Jun 29 '15 at 00:00
  • This question involves the use of a lab power supply, testing the current consumption of a device, and asking about the underlying reasons concerning USB power supply draw and the USB specification. This is on topic of this site, and should be re-opened. Once that's done I can add a fairly comprehensive answer that will help future electronics engineers designing USB power supplies for consumer devices, though there are many others here who could do the same and may beat me to it. The existing answers aren't bad, but there's more to it. – Adam Davis Jun 29 '15 at 12:42
  • 1. Short together USB's D+ and D- and try again; 2. On my smartphones I [can directly override](http://android.stackexchange.com/questions/92413/how-do-i-override-charging-current-on-huawei-honor-6) charging current by root command. – Vi0 Jun 30 '15 at 20:30

7 Answers


The reason your Apple/Samsung devices do not draw more current is simple: there is additional communication going on between the Apple/Samsung device and its dedicated power supply. The charger sets certain voltages on the USB data lines; the phone or tablet recognises these, and the two agree on a higher current.

Your 20 A supply does not put these voltages on the data lines, so it does not "talk" to your Apple/Samsung device. The device therefore assumes it is a normal "dumb" charger and does not draw more current than the USB standard allows, which is usually only 100 or 500 mA.

To charge an Apple device, put these voltages on the data lines:

  • Desired current: 2,000 mA
  • D-: 2.0 V
  • D+: 2.75 V

Like this circuit, for Apple devices at 2 A.
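As a sanity check on the divider arithmetic, assuming a 5 V rail: the resistor values below are illustrative picks chosen to hit the two targets exactly, not values taken from any real charger schematic.

```python
def divider_out(v_in, r_top, r_bottom):
    """Output of a resistive divider from v_in to ground, tapped in the middle."""
    return v_in * r_bottom / (r_top + r_bottom)

V_BUS = 5.0
# Illustrative resistor picks (not from a real schematic):
v_dplus = divider_out(V_BUS, r_top=36e3, r_bottom=44e3)   # target: 2.75 V on D+
v_dminus = divider_out(V_BUS, r_top=30e3, r_bottom=20e3)  # target: 2.0 V on D-
print(f"D+ = {v_dplus:.2f} V, D- = {v_dminus:.2f} V")     # D+ = 2.75 V, D- = 2.00 V
```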

Also, the charge current is limited depending on how full the battery is. It will only be at its maximum when the battery is 30 - 70 % charged (these numbers are just my guess). Charging with a high current is bad for the battery when it is nearly empty or almost full.

Sources:

Adafruit : The mysteries of Apple device charging

Voltaic : Choosing USB Pin Voltages

masterleo
Bimpelrekkie
  • Little addition here: even if the device would talk to the supply (which is usually done with simple config resistors across the data pins; you can google that one, they are simple to set up), I know not a single device which would draw more than 0.5A. 1A is pretty rare, actually, so you're actually pretty much in spec there. – Alzurana Jun 29 '15 at 00:05
  • Perhaps you meant 0.1 A instead of 1 A? I believe 0.1A is the maximum current allowed through USB (in the spec) without enumeration. – W5VO Jun 29 '15 at 02:32
  • @W5VO indeed, my number was too high, changed it. But according to wikipedia the maximum is 0.5 A. There was no mention of 0.1A but perhaps I overlooked that. – Bimpelrekkie Jun 29 '15 at 06:08
  • @Alzurana, I have a tablet that will draw 0.8A from USB but only if the datalines are shorted. I also have a smartphone that will draw more than 0.5A when the datalines are shorted. When the datalines are unconnected all draw less than 0.5A. – Bimpelrekkie Jun 29 '15 at 06:11
  • A non-enumerated device is only allowed to draw 100mA. Once enumerated on a regular port, it's limited to how much it requested, up to 500mA. Charging device ports can have higher limits over and above that. – Nick Johnson Jun 29 '15 at 06:29
  • Samsung chargers are not “intelligent”. They conform to the USB spec and short the data lines, unlike Apple chargers – kinokijuf Jun 29 '15 at 06:47
  • @kinokijuf: if [this MAX14667 datasheet is to be believed](http://datasheets.maximintegrated.com/en/ds/MAX14667.pdf) Samsung also uses resistor dividers for Galaxy 2A chargers, possibly different than Apple's. That datasheet doesn't give values though, because the aforementioned IC has the dividers built in. – Fizz Sep 24 '15 at 20:05
  • Here's a forum link for DIY 2A Samsung chargers: http://forum.xda-developers.com/showthread.php?t=1675042 It also involves putting some resistors on the data lines, as suspected. – Fizz Sep 24 '15 at 20:17
  • How do Samsung devices differentiate between 1A chargers and 2A chargers? – David Balažic Jan 11 '20 at 18:13

Here's a more complete article that lists the known proprietary resistor-divider D+/D- identification schemes for high-powered chargers. The gist is:

  • 2.0V/2.0V – low power (500mA)
  • 2.0V/2.7V – Apple iPhone (1000mA/5-watt)
  • 2.7V/2.0V – Apple iPad (2100mA/10-watt)
  • 2.7V/2.7V – 12-watt (2400mA, possibly used by Blackberry)
  • D+/D- shorted together – USB-IF BC 1.2 standard
  • 1.2V/1.2V – Samsung devices

The Samsung values coincide with what's indicated on this schematic [original source]: a 10k/33k resistor divider for 2A Galaxy tablet chargers.
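The 10k/33k figure is easy to sanity-check. Assuming 33 kΩ from VBUS and 10 kΩ to ground (the orientation is my assumption), the divider lands close to the ~1.2 V Samsung level:

```python
def divider_out(v_in, r_top, r_bottom):
    # Standard resistive divider: tap between r_top (to VBUS) and r_bottom (to GND)
    return v_in * r_bottom / (r_top + r_bottom)

v_bias = divider_out(5.0, 33e3, 10e3)
print(f"{v_bias:.2f} V")  # 1.16 V, within tolerance of the ~1.2 V Samsung level
```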

Like I said in my other comments, there are also off-the-shelf chips that implement some of these, e.g. MAX14667, TPS2513, Microchip's USB2534 or CYUSB3324. The datasheet of the latter also provides some more details on Samsung devices:

Samsung devices follow multiple charging methods. Some Samsung devices (Samsung Galaxy Tablets) use a proprietary charging method in which the D+ and D- pins are biased to the same potential (~1.2 V). The Samsung Galaxy S series (S3, S4) devices follow the USB-IF BC v1.2 charging standard for DCP, CDP, and SDP mode of operations.

The USB-IF BC 1.2 has these requirements for DCP mode:

  • The D+ and D– data lines are shorted together with a maximum series impedance of 200 Ω.

  • The supply must not cut off until the voltage drops below 2 V or the current exceeds 1.5 A.

  • The absolute maximum current draw allowed (but not required) is the limit of the USB 2.0 connector, up to 5 A.

Also, the Cypress note says that the 2.7V/2.7V -> 2.4A scheme is [also] used by Apple. Discussion on TI forums in which TI employees chimed in (for their TPS2513A) indicates the same. A TI employee said:

Begin from iPad3, 42Whr battery is used, long charging time starts to become an issue. Compare charging time, from battery 0% to 100%, 2.1A charger takes 6hrs, 2.4A charger takes 5hr40mins. So 20mins quicker, but not much.

We believe the key reason Apple release 2.4A charger is for better user experience when charging and playing at same time. When play high quality video games, like Infinity II and charging at same time, my iPad3's battery percentage increase is very very slow, e.g, 30mins later, only increase 2percent which drive me crazy.

While using 2.4A charger, the battery status increase faster, at least I feel normal, ok with it.

endolith
Fizz

The power supply is more than capable of outputting the maximum 2.1 amps, so why doesn't it supply that much, at least to the tablets?

The USB standard does not allow more than 500mA to be drawn from a standard USB 1 port. Until the device establishes communication with the USB host device it has no way to know how much current is available.

The USB standard actually requires devices to draw no more than 100mA before communicating with the host and requesting more power. This is important because a standard unpowered 4-port USB hub consumes the full 500mA by itself: 100mA for the hub and 100mA for each of its ports. This means that an unpowered hub cannot, and should not, attempt to supply 500mA to a single USB device.
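That power budget works out exactly; here is the arithmetic as a quick sketch for a 4-port unpowered hub:

```python
UNIT_LOAD_MA = 100  # one USB 2.0 "unit load"

hub_self = UNIT_LOAD_MA      # the hub itself is a device on the upstream port
ports = 4
budget = hub_self + ports * UNIT_LOAD_MA
print(budget)  # 500 -- the entire upstream allowance, with nothing to spare
```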

The standard was designed this way to support a variety of usage.

Obviously only Apple follows the standard, and consumes only 100mA prior to requesting more power.

The reality is that few USB ports are unable to supply 500mA without being asked. Many don't even bother to monitor current consumption or shut off non-compliant USB devices. It's almost always safe to draw 500mA from a USB port without asking the host for maximum power.

The newer USB specifications allow for higher power ports. Again, though, this must be requested to be compliant with the specification.

USB chargers are typically not intelligent, and don't implement a full USB host port. They use some shortcuts - generally resistors on the D+ and D- lines to signal to the USB device that the charger is capable of more power without an official request.

Further, some devices, such as the Apple iOS line, will also monitor the voltage provided, and scale back the current consumption based on voltage drop. For instance, if a charger reports that it can supply 2A, but the voltage doesn't stay at 5V, the iOS device will consume less than the maximum current. It will not charge below 4.5V, nor above 5.5V. So not only does the charger have to present the correct signals to indicate full current is available, it has to maintain good regulation at the maximum current draw.

Keep in mind that this is a safety feature. Not only does the charging device need to be able to supply the current, but the USB cable used needs to be able to carry it. It might not seem like a lot of current, but there are many very cheap thin USB cables on the market that will noticeably warm up with 2A flowing through their undersized conductors. Put that under a flammable pillow and let the heat build up, and you might find more than melted insulation.
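To see how quickly a thin cable eats into the 4.5 V floor described above, here is a rough estimate; the 28 AWG resistance figure is an approximation, and the one-metre length is an assumption:

```python
R_PER_M_28AWG = 0.213  # approx. resistance of 28 AWG copper, ohms per metre

def vbus_at_device(v_supply, cable_len_m, current_a, r_per_m=R_PER_M_28AWG):
    # Current flows out on VBUS and back on GND, so the resistive loop
    # is twice the cable length.
    return v_supply - current_a * (2 * cable_len_m) * r_per_m

v = vbus_at_device(5.0, 1.0, 2.0)
print(f"{v:.2f} V")  # 4.15 V -- already below a 4.5 V cutoff at only 2 A
```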

Apple not only verifies the charger, but also the cable (using their proprietary chips inside the cable connector) so they can avoid liability for possible losses associated with dangerous chargers and wiring.

As long as you are using the cable that came with the device, though, you should have no issue with this aspect of it, and can focus on the charger signalling.

What do I need to do in my test setup to convince the devices to consume their maximum charging current?

The Apple standard has been loosely adopted, or at least accepted, by others. It consists of placing specific voltage levels on the D- and D+ lines at a low current: approximately 2.0V on the D- line and 2.75V on the D+ line signals that 2A (10W) is available for charging. This can be done with simple resistors:

(schematic: resistor dividers from the 5 V rail placing approximately 2.75 V on D+ and 2.0 V on D- – created using CircuitLab)

If you follow this circuit in your setup, you should find that at least the Apple devices charge at 2 or more amps, and you may find your other devices also charge at this rate.

Adam Davis

You can try using a dedicated charging-port controller.

According to the spec, it can detect the following charging schemes:

  1. Divider 1 DCP, required to apply 2 V and 2.7 V on the D+ and D– Lines respectively (TPS2513, TPS2514)
  2. Divider 2 DCP, required to apply 2.7 V and 2 V on the D+ and D– Lines respectively (TPS2513, TPS2514)
  3. Divider 3 DCP, required to apply 2.7 V and 2.7 V on the D+ and D- Lines respectively (TPS2513A, TPS2514A)
  4. BC1.2 DCP, required to short the D+ Line to the D– Line
  5. Chinese Telecom Standard YD/T 1591-2009 Shorted Mode, required to short the D+ Line to the D– Line
  6. 1.2 V on both D+ and D– Lines
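As a rough illustration of the divider-based schemes in the list above, here is a hypothetical classifier; the function name and the tolerance are my own choices, not anything from the TPS2513 datasheet, and the shorted-line modes (4 and 5) can't be identified from a static voltage reading at all:

```python
def classify_dcp(d_plus, d_minus, tol=0.3):
    """Map a measured D+/D- voltage pair to one of the divider schemes above."""
    near = lambda v, t: abs(v - t) <= tol
    if near(d_plus, 2.0) and near(d_minus, 2.7):
        return "Divider 1 DCP"
    if near(d_plus, 2.7) and near(d_minus, 2.0):
        return "Divider 2 DCP"
    if near(d_plus, 2.7) and near(d_minus, 2.7):
        return "Divider 3 DCP"
    if near(d_plus, 1.2) and near(d_minus, 1.2):
        return "1.2 V / 1.2 V mode"
    # BC1.2 and YD/T 1591-2009 short the lines together; telling those apart
    # needs a continuity/impedance check, not a voltage read.
    return "unknown"

print(classify_dcp(2.75, 2.0))  # Divider 2 DCP (the Apple iPad levels)
```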
bcelary
  • Are you sure you have the description in point 1 correct? You seem to have `2 V` twice, along with `2.7 V`. – user2943160 Jul 19 '16 at 03:10
  • @user2943160 Thanks. Copy/paste error. Fixed it. – bcelary Jul 20 '16 at 09:58
  • Point 4, BC1.2 DCP, requires a *resistance* between D+ and D-. This may be a short (0 ohm resistance), but may be higher as well -- up to 200 ohm. Not sure why this answer isn't upvoted more, because it's the best advice. – florisla Nov 08 '16 at 08:44

20 amperes is what the supply is capable of delivering, and whatever you connect to it takes what it needs from that 20 amperes.

For example, if you have the supply set up for a 5 volt output and you put a 5 ohm resistor across its output, it'll supply 1 ampere to the resistor because that's all the resistor wants with 5 volts across it.

You could also put twenty 5 ohm resistors across the supply, in parallel, and it'll supply 1 ampere to each of them for a total of 20 amperes.
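The same arithmetic, as a sketch:

```python
V = 5.0            # supply voltage, volts
R = 5.0            # one load resistor, ohms

i_single = V / R          # Ohm's law: 1 A through a single 5-ohm resistor
i_total = 20 * i_single   # twenty of them in parallel: 20 A combined

print(i_single, i_total)  # 1.0 20.0
```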

EM Fields
  • I don't need 20 amps! I just want the 2.1 that it is more than capable of delivering. I am wondering what is limiting that current from my power supply, but not from a real Apple or Samsung charger. – joshglen Jun 28 '15 at 20:18
  • As others have so aptly commented, it's the data interchange between the object and its charger that allows the object to ask for lots of current in order to charge up quickly. Big however, however: When you unplug the phone from its mommy and plug it into USB, the rules all change and no matter what the phone asks for, USB will give it what it damn well pleases. – EM Fields Jun 29 '15 at 11:29

A USB device will only draw the amount of current that it knows it can definitely draw without causing damage to the charger. The Apple and Samsung devices have no idea how much is safe to draw from your charger, therefore they fall back on the USB Standard. USB Standard says "0.5 Ampere is safe". (If the charger identifies itself as a USB charger but cannot handle 0.5 Ampere, that's not the phone's fault but the charger's problem).

The only way to convince the device to draw more current is to implement the communication in the charger that would happen between an Apple device and an Apple charger, or a Samsung device and a Samsung charger. It's not very difficult, and you can buy decent quality chargers at reasonable prices that do this. Your charger doesn't do it, so 0.5 A is the limit.

gnasher729

All USB devices have to behave when connected to a USB data port, which will supply 100mA by default, 500mA when "asked nicely". That's what you're getting.

If you connect the Data+ and Data- signals together through a small resistor (for protection; something around 100 ohms), you'll be in compliance with the USB Battery Charging specification, which guarantees that at least 1.5A is available to the device.
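A trivial check of that resistor choice against the BC 1.2 limit; the 200 Ω maximum series impedance comes from the spec's DCP requirements quoted in an answer above:

```python
BC12_MAX_BRIDGE_OHMS = 200  # BC 1.2 DCP: D+/D- bridged by at most 200 ohms

def is_valid_dcp_bridge(r_ohms):
    """True if a D+/D- bridge resistance satisfies the BC 1.2 DCP limit."""
    return 0 <= r_ohms <= BC12_MAX_BRIDGE_OHMS

print(is_valid_dcp_bridge(100))  # True -- the suggested ~100 ohm resistor is in spec
```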