That was me having a "senior moment".
The USB 2.0 specs are 5V/0.5A (500mA). I have slapped myself on the back of the head several times.
All computer USB ports conform to this, as do all generic (and properly designed) chargers. We'll ignore the ultra-cheap Far East knock-offs as nobody who values their smartphone should be buying one of those.
However, the important figure is the voltage: it should never exceed 5V. Amperage is secondary.
Given the above restriction, lithium-ion batteries incorporate (or should - see the "ultra-cheap Far East knock-offs" above) sophisticated protection circuitry which prevents over-charging. The battery itself says "enough" when it's near full capacity. One cannot overcharge a Li-Ion (or Li-Poly) battery pack... provided it's manufactured correctly.
Modern devices also limit the current draw as required. A smartphone with a near-depleted battery will draw the maximum current (amperage) the charger can provide until it reaches near-full capacity, at which point it says "enough!" and prevents further charging. If the device is powered on, the battery will then gradually deplete until a threshold is reached, at which point charging re-commences and tops it back up. There's no "trickle charging". A device may ask for 2.0A, but if the charger can only provide 1.0A then 1.0A is all it will receive. Note that the device is in control, not the charger.
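To make the "device is in control" point concrete, here's a minimal sketch in plain Python of the behaviour described above. The function names, the 95% resume threshold, and the amperages are all made-up illustrations, not any real phone's firmware:

```python
# Toy model of the charging behaviour described above.
# All names and numbers are illustrative, not taken from any real device.

def effective_current(device_request_a: float, charger_limit_a: float) -> float:
    """The device draws what it asks for, capped by what the charger can supply."""
    return min(device_request_a, charger_limit_a)

def should_charge(battery_percent: float, charging: bool,
                  resume_threshold: float = 95.0) -> bool:
    """Charging stops at 100% and only re-commences once the level
    drops below a threshold -- there is no trickle charging."""
    if battery_percent >= 100.0:
        return False                # battery says "enough!"
    if charging:
        return True                 # keep charging until full
    return battery_percent < resume_threshold  # top up after some depletion

# A device asking for 2.0A on a 1.0A charger only ever gets 1.0A:
print(effective_current(2.0, 1.0))  # -> 1.0
# The same device on a 2.1A charger gets its full 2.0A request:
print(effective_current(2.0, 2.1))  # -> 2.0
```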
TL;DR version:
Amps don't matter, voltage does.
If it's 5V or less... good. More? Bin it!
Less amps than stock = slower charging time
More amps than stock = same charging time
Real World(tm) example:
My Sony Xperia Z takes ~2hrs to charge from "low battery" (20%) to 100% with the stock 1.5A charger.
It takes the same time to charge using a 2.1A car charger, as the handset itself limits the current draw to 1.5A.
It requires over 3hrs to reach 100% using the HTC 1A charger on the other bedside table.
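Those numbers line up with simple arithmetic: below the handset's own limit, charge time scales roughly inversely with the supplied current. Here's a rough back-of-envelope calculation in the same vein (real charge curves aren't perfectly linear, as charging slows near the top, so treat this as an approximation):

```python
# Back-of-envelope: charge time scales inversely with actual current,
# which is capped by the handset's own limit (1.5A for the Xperia Z).
def charge_time_hours(baseline_hours: float, baseline_amps: float,
                      charger_amps: float, device_limit_amps: float) -> float:
    actual = min(charger_amps, device_limit_amps)  # the device is in control
    return baseline_hours * baseline_amps / actual

print(charge_time_hours(2.0, 1.5, 2.1, 1.5))  # 2.1A car charger -> 2.0 hrs
print(charge_time_hours(2.0, 1.5, 1.0, 1.5))  # 1.0A HTC charger -> 3.0 hrs
```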
I hope this clears up any confusion.