2amp usb chargers?


  1. peachpuff

    peachpuff Well-Known Member

    Has anyone used 2 amp chargers on the Captivate? These are normally found for the iPad since it needs more juice to charge.
    The stock charger is rated at 0.7 A while the USB port, I think, gives out only 0.5 A. Any harm in feeding the Captivate 2 amps?

  2. sremick

    sremick Well-Known Member

    You don't "feed" a device amperage. It draws whatever it needs, up to the maximum capabilities of the adapter/charger.

    In other words, if the device can only draw 1A, and you have a 2A charger... only 1A will be flowing. If you have a 700mA charger, then 700mA will be flowing and it'll just charge slower.

    Now, as far as what the maximum charge rate of the Captivate is... I'm not sure.
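    To put that in rough numbers, here is a simplified model (illustrative only; the figures below are examples, not Captivate specs):

        # Simplified model: the current that actually flows is whatever the
        # phone asks for, capped by what the charger can supply.
        def charging_current_ma(device_draw_ma, charger_max_ma):
            return min(device_draw_ma, charger_max_ma)

        print(charging_current_ma(1000, 2000))  # 1 A phone on a 2 A charger -> 1000
        print(charging_current_ma(1000, 700))   # same phone on a 0.7 A charger -> 700 (charges slower)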
  3. ShannonPricePhoto

    ShannonPricePhoto Active Member

    not true....complete opposite...

    the only way batts charge is by amperage...the charger dictates how much current it puts to the device... more is not always better, there is a limit to the batt and the delicate device as well.
    If you have a 2A charger its gonna cram that 2A's into that batt... I suppose there may be a current limiter in line somewhere...to safe guard some devices... but the ONLY time a device draws anything is when its using the batt as a source only.

    I think USB is .5A if i rem... not very fast.. 1A is nice... made a HUGE diff in charge time on my old iphone vs usb from computer. But slow charging should eek out more run time as it saturates the cell better. seemed to be true even with the LiION batts.

    the way chargers get more amperage is a higher voltage... ( the bigger the diff in voltage, the more amperage the batt gets... ) batt voltage is 3.7 on these.... so lets say usb is 5v... that 1A is 6v and the 2A is 7volts... ( im guessing here...) it is possible to cause damage and burn circuits in the phone... now if you had a charging dock and the batt only was charging then maybe.



    the more amps the faster it charges...

    now if a batt is rated at a high mah...then the phone will only draw what it needs, irregardless of what the bat can potentially "put out"
  4. Andrewdroid

    Andrewdroid Well-Known Member

    Please give a link to what you are talking about. I'm curious.
  5. sremick

    sremick Well-Known Member

    I'm sorry, but this is not correct, as anyone who has studied electronics (myself included) can tell you. A little research on the internet will also explain it to you.

    Volts are the "force" ("push") of electricity, while amps (current) are the flow. It is not true that "the only way batts charge is by amperage", as voltage plays a big role here too. And no, the charger doesn't "dictate how much current it puts to the device" beyond the fact that it has a maximum. The resulting current is a combination of all factors in the electrical circuit/loop. A device that only needs 500 mA to charge is only going to draw 500 mA from a charger, regardless of whether that charger is rated for 700 mA, 1 A, or 2 A.

    There is absolutely no harm in using a charger that can handle a higher current with a device that won't draw all of it. Voltage, however, is another matter... that is "push" from the electrical source and it can cause damage.

    Absolutely incorrect.

    I'm sorry, but you're demonstrating a huge lack of knowledge about electricity here.

    Again, you show that you really don't understand the technology. "mAh" is entirely different than "mA". "mAh" is "milli-amp hours": the amount of electric charge transferred by a steady one-milliamp current flowing for one hour. It's a unit of electrical charge. "mA" is just "milli-amps", which is a unit of current.
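    A quick back-of-the-envelope example of the difference (numbers made up for illustration, and the taper at the end of a lithium-ion charge is ignored):

        # mAh is capacity (charge), mA is current. Capacity divided by
        # current gives a rough charge time in hours.
        capacity_mah = 1500       # example battery capacity
        charge_current_ma = 500   # example charging current
        print(capacity_mah / charge_current_ma)  # -> 3.0 hours, very roughly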

    Basically, a charger will "push" a certain number of volts and can handle a certain number of amps.

    Voltage and Current
    How voltage, current, and resistance relate : OHM's LAW
    HowStuffWorks "What are amps, watts, volts and ohms?"
    Understanding Volts, Amps, and Ohms in Physics: Units of Electric Current, Resistance, and Potential Difference
  6. mnemonicj

    mnemonicj Well-Known Member

    I have to agree with sremick; connecting a charger that has a higher maximum current rating to a device will not harm the device. The device will charge the battery with whatever current it needs, or at the charger's maximum if the device could draw more than that.

    Connecting a charger with a higher output voltage could cause damage to the device, unless the device has a way of lowering the voltage, like laptop power supplies, televisions, or other electronics that can operate on outlets in both the US and Europe.
    sremick likes this.
  7. ShannonPricePhoto

    ShannonPricePhoto Active Member

    lol that is exactly what i said....

    amperage is the difference as i stated.... these batts are 3.7 volts... charging at 3.8 volts is very slow... as the diff of voltage is almost nothing...

    the only way they will charge faster is a HIGHER voltage... thus will be a HIGHER diff of voltage....= more amps..

    so your gonna tell me, I have a 20A charger sitting here...hook up a batt pack that the bats are only going to draw what they need??? I dont think so. batteries are dumb, they do what they are told. If they cant handle the current, then they will pop...

    lets look at rechargeable AA's there are 14 hour, 8hr, 2hr, 30min, 15 min.. all the same damn dumb battery right? 2400mah... the difference between the charge times is the voltage, the shorter the charge time, the higher the voltage, the higher the voltage= higher Diff of voltage = higher Amps.

    ok i admit when i said diff of potential... that is a voltage only equation... my bad. been a while..

    I have been racing R/C professionally for 10+ years, went to college for Electronics, and was a nextel technician repairing the boards on them and sprint.

    In rc, bats are huge, in electric racing, we would cram 10A in the packs versus 6A.. we got to choose, the battery didnt tell me NO!... Its dumb remember? the charger would put out a higher voltage on the leads to create the higher Amperage...

    Like I said before there was a big difference on my iphone between using USB and The stock 1A wall plug.. but your telling me It dont matter... maybe you need to go back to school...
  8. sremick

    sremick Well-Known Member

    You may have been doing RC for a while, but unfortunately your understanding of electricity is severely flawed.

    As I'm not interested in getting into an "is not!" "is too!" debate with you, since you seem unwilling to follow even the few links I gave you, I'll leave it there. Another person has already agreed with me that your understanding is wrong... we'll see if anyone else has the energy to also tell you the same thing despite your refusal to listen.

    Cheers...
    drmdmcgwn64 likes this.
  9. james27007

    james27007 Look into my Eye VIP Member

    Some more information for those reading who want to learn more. This reply is not a response to ANY of the previous posts except the original post, so no response is needed. Just some additional information from an engineer with master's degrees in Electrical and Mechanical Engineering.

  10. Slug

    Slug Check six! VIP Member

    Now that the basic principles of electronic engineering have been thrashed out, can anyone actually answer the OP with practical advice rather than theory? ;)
    james27007 likes this.
  11. ShannonPricePhoto

    ShannonPricePhoto Active Member

    i actually would like a good car charger one.... found out today, that if my screen is on, and just the screen not using gps or anything... an hour trip it only went up 1% on charge. the other day i was using gps and iheart radio and it drained the phone on a two hour trip while plugged in.

    I "thought" it was a 1A since it was for the iphone (griffin) ( and did great with the iphone and gps )
    leaving the screen on at home, stock charger, the %'s go up pretty good...
    have a link to that 2a you found?
  12. zion168

    zion168 Active Member

  13. NorCal Einstein

    NorCal Einstein Well-Known Member

    Well while the cigarette plug to USB adapter has the 2A capabilities, plugging in a standard USB to MicroUSB cable is probably not going to get you there. You're going to need to use a charge-only USB cable.
  14. ShannonPricePhoto

    ShannonPricePhoto Active Member

    You might be onto something...
    been testing diff chargers...
    so far the STOCK one seems BY FAR to charge the fastest.
    most 2A chargers are dual, and it means 1A for each slot. fail.
    even using my iPad charger is not the fastest (you would think that would be the fastest)

    now all but the stock charger & one from wal-mart do not ask me how i want to connect to USB (mass storage, Kies, ...)

    I think the phone will kick in a current limiter if it senses USB and limit it to 0.5 A
  15. NorCal Einstein

    NorCal Einstein Well-Known Member

  16. rubiconjp

    rubiconjp Well-Known Member

    Yes I did try several times, using the Apple charger from my wife's iPad. The Captivate acts weird while under charge from the iPad charger. Things hang, very jittery operations. Even the unlock screen was acting up, did not allow for me to swipe my pattern several times.

    The phone will charge while on, but acts weird.
  17. plnelson

    plnelson Well-Known Member

    And I'll add my voice to that. It's the LOAD that determines how much current is drawn, not the source. The source only sets an upper limit on it when it reaches the maximum current it can provide. So, for example, if your cellphone draws 450 mA while it's charging then it will get 450 mA whether it's connected to a 1A source or a 2A source.

    The more interesting question is this: if I connect my tablet, which requires a 2A source, to a 1A charger, will I overheat or damage the charger? I'm an electrical engineer, and depending on how they designed it there are many different possible outcomes:
    1. it might automatically shut down and produce NO output
    2. there could be a drop in voltage.
    3. along with the drop in voltage there could be a loss of filtering - the nice smooth 5V DC could turn into 60Hz pulsed DC, which could cause all sorts of weird consequences.
    4. The charger might overheat and fail, potentially even causing a fire or damage to the tablet.
  18. Drivespinners

    Drivespinners Active Member

    .
  19. RobBlakemore

    RobBlakemore New Member

    Thanks for all the useful information here.
    I'm interested in the same question from a Galaxy Nexus point of view... checking it's okay to use a 2amp car charger with it.

    Some good links to provide references for the above would be really useful, if anyone can post them...
  20. Giuseppe Verdi

    Giuseppe Verdi New Member

    This thread has a lot of nonsense mixed in with the sense. Trust the engineers, who generally understand this stuff. Theory aside, here is some data: I charged my Motorola Razr Maxx HD with two different chargers, the one that came with the phone (rated at 5.1 volts, 850 mA) and one from an iPad (rated at 5.1 volts, 2.1 A). In both cases I used the USB cable that came with the phone (it plugs into the phone with a microUSB connector). The battery in the phone is rated at 3300 mAh.

    Using the stock charger from 43% full to 70% full, I got 811.5 mA, which is 95% of the nominal 850 mA.

    Using the iPad charger from 51% full to 87% full, I got 1114 mA, which is 131% of the nominal 850 mA, but only 53% of the nominal 2.1 A for the iPad charger. The phone did not get warm during this process.

    So there is definitely some current limiting both in the phone and in the charger. And if you want to charge a bit faster than the stock charger allows, you can do it with a higher-power charger (rated at the same voltage).
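    For anyone who wants to check those percentages, the arithmetic is just the ratios of the figures above:

        # Measured average currents vs. the chargers' nameplate ratings.
        stock_rating_ma = 850    # stock charger rating
        ipad_rating_ma = 2100    # iPad charger rating
        print(811.5 / stock_rating_ma)  # ~0.95 -> 95% of the stock rating
        print(1114 / stock_rating_ma)   # ~1.31 -> 131% of the stock rating
        print(1114 / ipad_rating_ma)    # ~0.53 -> 53% of the iPad charger rating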
  21. JusStrollinAlong

    JusStrollinAlong New Member

    Folks, it IS OK to use a 2.1A charger with a lesser device such as a mobile phone, smartphone, iPhone, etc.

    And thanks should go out to sremick for the concise and logical line-by-line rebuttal of ShannonPricePhoto's misinformed submission. No offense intended toward ShannonPricePhoto. Your willingness to submit an answer is appreciated!

    Most mobile phone chargers are not really chargers, only power adapters that provide a power source for the charging circuitry which is almost always contained within the mobile phone.*

    With specific regards to devices using USB to charge: Chargers only OFFER power up to the limit they can produce. Devices only DRAW power as they need.

    The USB standard conforms to the following**:
    Signal: 5 volt DC
    Max. voltage: 5.00 V
  22. DocDizzy

    DocDizzy New Member

    I think we are missing some information here. There are three main concepts you must refer to in this situation: wattage, voltage, and amps. Ohms are also important, but harder to pin down. Ohms measure resistance, which determines how much current can flow for a given voltage. A good analogy is a river: the pressure pushing the water along is the voltage, the amount of water flowing is the current, and the rocks that impede the flow are the resistance (ohms). With this said, you can see different resistances based on the type of cable you use. A cable that is properly insulated can help resist interference from other devices, such as running the cable over a high-voltage cable that is producing EMI (electromagnetic interference). You want to use a decent cable; often these are the cables provided by the manufacturer, but it isn't a rule.

    Amps are determined by a simple formula: wattage divided by voltage. Wattage is determined by voltage multiplied by amperage.
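    In code form, using the standard 5 V, 0.5 A USB port mentioned below:

        # watts = volts * amps, and therefore amps = watts / volts
        volts = 5.0
        amps = 0.5
        watts = volts * amps
        print(watts)          # -> 2.5 W for a 5 V, 0.5 A USB port
        print(watts / volts)  # -> back to 0.5 A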

    Something to remember is that the watt-hour rating of your battery only shows how long the battery will run before it is drained. For example, a battery rated at 9 Wh can supply about 1 W continuously for nine hours, or up to 9 W for a single hour, depending on the draw of the device.

    My cell phone, for example, runs at 3.8 volts and has a max capacity of 9.88 Wh. 9.88 Wh / 3.8 V is about 2.6 Ah, so the battery holds roughly 2.6 amp-hours of charge. A standard USB port will supply 0.5 to 0.8 amps at a nominal 5 V, which at 0.5 A works out to 2.5 watts. In the instance of my phone this is more than enough to charge the device while running, as long as it isn't drawing at its maximum rate. However, if my device is drawing power faster than the charger can supply it, then even with it plugged in the battery will still drain.

    The voltage is set by the power supply, no matter if it's a wall charger, a USB port on a PC, or an external charger; the current is whatever the device draws, up to that supply's limit.

    The device itself also has the ability to regulate how much voltage and current actually reach the battery, via its internal charging circuitry, which presents its own resistance to the incoming supply. (Remember the idea of the water flowing over the rocks.)

    Now, the question of whether too many amps can hurt the device. The answer is that they can, if charge is actually forced into the battery at too high a rate: that puts unneeded stress on the battery, which will get hot and can melt or even catch fire. If too much voltage is provided, you will burn out the regulating circuitry and your device will pretty much be dead, as integrated circuits make it nearly impossible to replace these parts of the device.

    I have an external battery charger that runs at 5 V and 2.1 A. That works out to 10.5 watts of available power, which is more than the phone's charging circuit will actually draw, so there is no extra stress on the battery.

    What you must pay attention to is Voltage * Amps, and the max Wh your battery can store.

    You are certainly safe.
  23. mehinu

    mehinu New Member

    Hello everyone,

    I have read several posts about car chargers available, OEM chargers etc...

    I would like to install a Permanent USB female socket into my car so I can charge anything that can charge via USB cable. Smartphones, tablets, GPS, Camera etc.....

    My smartphone, a Sony Xperia Z1, has an OEM charger output of 5 V - 1.5 A, and the most common chargers I am able to find online are 5 V 3 A max. I have read in several posts that the device will use the current it needs and therefore the extra current available will not fry my devices - as long as it is 5 V of course :)

    Does anyone think it's a good idea?? My girlfriend's phone's OEM charger is rated at 370 mA, so she will be using this charger too. I will wire this permanent USB socket to a switch and directly to the car battery, so I can switch on the charger via the switch and even lock the car and the device will still charge.

    Any help is appreciated! Thanks a lot!!
  24. micnolmad

    micnolmad New Member

    To do ShannonPricePhoto a bit of justice, some of what he said is true coming from the RC world.

    See, sremick is talking about circuit-protected devices where Shannon is not. In the RC world batteries are raw devices, aka no circuitry. So in that regard it IS the charger that dictates how the battery is charged.

    So I think sremick, being so knowledgeable, should have acknowledged that Shannon was right from his POV, only it was the wrong POV, as the OP was asking about circuit-controlled charging.
  25. dzimmerm

    dzimmerm New Member

    I read through the replies and wanted to add my comments. USB charging of android and apple phones is more complicated than charging a battery.

    If all you want to do is charge a battery, you have to supply more voltage than the battery has and have sufficient current capacity from the voltage source. How fast the battery charges is determined by the difference between the supplied voltage and the battery's voltage, assuming your power supply can provide whatever current you need. The drawback of doing that is you can exceed the battery's ability to cool itself and cause the battery to fail quite spectacularly if you give it too much voltage.

    Since cell phone manufacturers are very much against spectacular explosions of devices that are carried on one's person, they have developed ways to limit overcharging.

    Android phones check how much current they can pull from a USB charger by looking at the two data pins. If the data pins have less than 200 ohms of resistance across them, the phone tries to pull up to 1.5 amps (1500 mA) of charging current. If the data pins have a higher resistance, it will only draw up to half an amp (500 mA). The phone will limit the amount of current it pulls even at the 1.5 amp setting in order to keep the battery and phone from overheating.
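    A rough sketch of that decision, using only the figures described above (the 200 ohm threshold and the two current limits); this is a simplified illustration, not anybody's actual firmware:

        # If the data pins look shorted (low resistance), treat the supply as
        # a dedicated charger; otherwise assume an ordinary USB host port.
        def max_charge_current_ma(data_pin_resistance_ohms):
            if data_pin_resistance_ohms < 200:
                return 1500   # dedicated charger: allow up to 1.5 A
            return 500        # standard USB port: limit to 0.5 A

        print(max_charge_current_ma(50))     # -> 1500
        print(max_charge_current_ma(10000))  # -> 500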

    Apple uses a different standard than android and I don't have apple so if you do then be aware it is different.

    The relationship between current, voltage, and resistance is defined by Ohm's law (Ohm was a guy with the last name of Ohm). The plain-words definition is: "It takes 1 volt to drive 1 ampere of current through a resistance of 1 ohm."

    Watt, another guy's last name, has his own different law about power. Watt's law says that 1 amp at 1 volt is 1 watt of power.

    The reason I mention watts is that the heat generated by charging is determined by how many watts (power) are being dissipated in a resistance or load.

    That is why resistors have both resistance and wattage ratings. You need bigger components to handle higher power, because bigger components can dissipate more heat without failing spectacularly.
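    Putting Ohm's law and Watt's law together for the heat question (illustrative numbers only):

        # volts = amps * ohms (Ohm's law); watts = volts * amps (Watt's law).
        # So the heat dissipated in a resistance grows with the square of the current.
        def power_dissipated_w(current_a, resistance_ohms):
            volts = current_a * resistance_ohms
            return volts * current_a          # = I^2 * R

        print(power_dissipated_w(0.5, 1.0))   # 0.25 W at 0.5 A through 1 ohm
        print(power_dissipated_w(1.5, 1.0))   # 2.25 W at 1.5 A -- nine times the heat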
