Take a look at the USB Battery Charging specification (BC 1.2).
There are numerous ways manufacturers implement USB current negotiation today. The standard USB way is, as was mentioned, to request the current from the USB host controller by message exchange during enumeration: the device declares its maximum draw in the bMaxPower field of its configuration descriptor, and the host either accepts or rejects that configuration.
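For illustration, here is a minimal sketch of what that declaration looks like on the device side. The struct follows the standard USB 2.0 configuration descriptor layout; the concrete values (one interface, bus-powered, 500mA) are just example assumptions:

```c
#include <stdint.h>

/* USB 2.0 standard configuration descriptor (9 bytes).
 * The host reads this during enumeration; bMaxPower declares
 * how much bus current the device may draw once configured. */
struct usb_config_descriptor {
    uint8_t  bLength;             /* 9 */
    uint8_t  bDescriptorType;     /* 0x02 = CONFIGURATION */
    uint16_t wTotalLength;        /* config + all sub-descriptors */
    uint8_t  bNumInterfaces;
    uint8_t  bConfigurationValue;
    uint8_t  iConfiguration;
    uint8_t  bmAttributes;        /* bit 7 always set; bit 6 = self-powered */
    uint8_t  bMaxPower;           /* max current in units of 2 mA */
} __attribute__((packed));

static const struct usb_config_descriptor cfg = {
    .bLength             = 9,
    .bDescriptorType     = 0x02,
    .wTotalLength        = 9,      /* placeholder; real value includes interface descriptors */
    .bNumInterfaces      = 1,
    .bConfigurationValue = 1,
    .iConfiguration      = 0,
    .bmAttributes        = 0x80,   /* bus-powered */
    .bMaxPower           = 250,    /* 250 * 2 mA = 500 mA, the USB 2.0 maximum */
};
```

Until the host accepts the configuration, a USB 2.0 device is only entitled to 100mA; the bMaxPower value is what the message exchange actually negotiates.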
However, many devices don't bother with the complex USB protocol and just draw as much current as they require - until the supply voltage drops.
That said, I guess that it's kind of safe-ish to ignore the USB specs without ruining (too many) PCs. Usually, the output voltage will just drop when too much current is drawn, until it balances out. This can of course affect other devices connected to the same controller/hub/..., which will see the unanticipated voltage drop.
Yet, you will have to consider worst-case scenarios, like a user plugging, say, four of your devices into a cheap USB hub at the same time. Something like this may well destroy the hub, its power supply, or both, and thus poses a risk of fire or other damage. Keep that in mind.
Recently, I discovered that one of my Android devices has an interesting detection mechanism for its high-power charger:
When idle, the charger outputs something like 5.05V, just like any other USB port. Once the device is connected, the charger raises the voltage to about 5.2V (still within the USB voltage tolerance), which lets the device know that it may draw the higher "fast charge" current (~1.2A).
When connected to a normal USB port, the voltage stays at or below ~5V, and hence the device draws "only" 440mA.
Maybe you can implement a similar technique.
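If you control the firmware on your device, detecting such a voltage signature boils down to an ADC read plus a threshold comparison. Here is a minimal sketch; read_vbus_millivolts() and set_charge_current_ma() are hypothetical helpers standing in for your actual ADC and charger-control code, and the 5.15V threshold is just an assumed midpoint between the two voltage levels described above:

```c
#include <stdint.h>

/* Hypothetical helper: returns VBUS in millivolts,
 * e.g. from an ADC channel behind a resistor divider. */
extern uint16_t read_vbus_millivolts(void);

/* Hypothetical helper: sets the charging circuit's current limit. */
extern void set_charge_current_ma(uint16_t ma);

#define FAST_CHARGE_THRESHOLD_MV 5150  /* above ~5.15 V: assume the high-power charger */
#define NORMAL_CURRENT_MA         440  /* draw on an ordinary USB port */
#define FAST_CURRENT_MA          1200  /* "fast charge" draw on the dedicated charger */

/* Pick the charge current based on the VBUS voltage signature.
 * Averaging a few samples guards against a single noisy reading. */
void select_charge_current(void)
{
    uint32_t sum = 0;
    for (int i = 0; i < 8; i++)
        sum += read_vbus_millivolts();
    uint16_t vbus_mv = (uint16_t)(sum / 8);

    if (vbus_mv >= FAST_CHARGE_THRESHOLD_MV)
        set_charge_current_ma(FAST_CURRENT_MA);
    else
        set_charge_current_ma(NORMAL_CURRENT_MA);
}
```

On the charger side you would do the mirror image: regulate to ~5.05V while unloaded and raise the setpoint to ~5.2V once load current is detected.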