I'm just going to expand on what was said here.
Phones can have either a hardware limitation, a software limitation, or both when it comes to charging. A perfect example of this is the Galaxy S2 GT-I9100 model. It comes with a Samsung charger rated at 0.7 A (700 mA). The kernel is limited to a charging rate of 600 mA, only a bit better than a PC USB port. So, a few of us dicked around with the kernel, and by removing the limitation, we found that the S2 also has a hardware limitation of around 650 mA, meaning we could barely speed it up.
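To make the "kernel limit vs. hardware limit" idea concrete, here's a minimal sketch of how a charging driver might clamp the current. The names and numbers are just illustrative placeholders based on the figures above, not Samsung's actual S2 driver code:

```c
#include <stdio.h>

#define KERNEL_LIMIT_MA   600   /* software cap compiled into the kernel      */
#define HARDWARE_LIMIT_MA 650   /* ceiling imposed by the charging circuitry  */

/* Hypothetical: pick the charge current given what the charger can supply. */
static int effective_charge_current(int charger_rating_ma)
{
    int current = charger_rating_ma;

    if (current > KERNEL_LIMIT_MA)
        current = KERNEL_LIMIT_MA;    /* removing this cap only helps...      */
    if (current > HARDWARE_LIMIT_MA)
        current = HARDWARE_LIMIT_MA;  /* ...until the hardware cap kicks in   */

    return current;
}

int main(void)
{
    /* Stock 700 mA charger: kernel cap -> 600 mA; with the kernel cap
     * patched out, the hardware cap still holds it to ~650 mA. */
    printf("%d mA\n", effective_charge_current(700));
    return 0;
}
```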
Now, the S4 has a similar, but more complex issue. It ships with a 2 A (2,000 mA) charger, but it has never been observed to charge at that rate. If you use the cable it shipped with and the charger it shipped with, it will charge at 1.3 A. If you use any other charger and/or cable (other than replacement OEM), it will charge at 500 mA. It may charge at a rate somewhere in between, but that depends on what it detects from the charger and the cable.
The reason for this, without getting too technical, is that the MicroUSB charging standard has dedicated (wall) chargers short the two data contacts (D+ and D-) together to signal "AC". The phone looks for that short to decide whether to charge at the USB rate (500 mA) or the AC rate (the max allowed by hardware/kernel). Apple uses a different detection scheme, putting specific voltages on the data lines instead of a short, so an iPhone will fall back to 500 mA on any charger that doesn't mimic Apple's signalling. Samsung has added a third scheme, and it's more complex than Apple's (though it still works with standard USB/MicroUSB). As a result, you need perfect conditions for a Galaxy S4 to pull 1.3 A from that 2 A charger.
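Here's a rough sketch of that decision logic, in the same spirit as the snippet above. The enum names and current figures are illustrative guesses that follow the behaviour described in this post, not the phone's actual firmware:

```c
#include <stdio.h>

enum charger_type {
    CHARGER_USB_HOST,     /* plain PC USB port, no signalling                 */
    CHARGER_DCP_SHORTED,  /* D+ and D- shorted together: generic wall charger */
    CHARGER_APPLE_SIG,    /* Apple-style voltages on the data lines           */
    CHARGER_SAMSUNG_SIG,  /* Samsung's own signalling                         */
};

/* Hypothetical: map what the phone detects to a charge current (mA). */
static int charge_current_ma(enum charger_type type, int oem_cable_detected)
{
    switch (type) {
    case CHARGER_SAMSUNG_SIG:
        /* Full rate only with the OEM charger *and* the OEM cable. */
        return oem_cable_detected ? 1300 : 500;
    case CHARGER_DCP_SHORTED:
        return 900;   /* "somewhere in between", purely illustrative          */
    case CHARGER_APPLE_SIG:   /* a Samsung phone ignores Apple's signalling   */
    case CHARGER_USB_HOST:
    default:
        return 500;   /* fall back to the plain USB rate                      */
    }
}

int main(void)
{
    printf("OEM charger + OEM cable: %d mA\n",
           charge_current_ma(CHARGER_SAMSUNG_SIG, 1));
    printf("Generic charger:         %d mA\n",
           charge_current_ma(CHARGER_USB_HOST, 0));
    return 0;
}
```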
Long story short, using a higher-amperage charger may or may not charge your phone any faster. The best answer is to use OEM equipment. Generic equipment may actually charge your phone more slowly than its rating suggests, especially if you're using an Apple or Samsung device. And if your OEM didn't put a limit on charging speed? You can very well damage your battery or shorten its service life.
EDIT: Repeatable test (partial source)