[Q] What is my actual mAh on my new battery? - General Questions and Answers

Hi.
I recently bought a new fat battery for my Desire. It promises to deliver 2600 mAh, but I have my doubts...
First of all, only the voltage is printed on the battery itself.
Second, I experience worse uptime than on my original 1400 mAh battery.
Third, I guess I am just skeptical by nature.
So my question: is there an app out there that will display max capacity?
Or do I really have to measure it with a multimeter?

I wouldn't know how to measure it other than just using it and gauging the results, but I'm very skeptical about the efficacy of any after-market battery, tbh.

You can't easily test for capacity, but you can estimate it.
Run the battery down until the phone won't turn on.
Plug in the power adaptor and start timing.
Don't turn on the phone.
Note how long it took to finish charging.
Convert that number into hours, so 1 hour 20 minutes would be 1.33, then divide it by 1.15 to factor in energy loss through the charger. We'll call this number Fred. No reason, Fred's just an easy name to remember.
Look on your charger for the output current, something like 500mA, 700mA, or 1000mA (1A).
Multiply the output current by Fred.
So, say it takes 1.5 hours to charge, the Fred number is about 1.3.
If your output current is 700mA, then the capacity is roughly 913mAh.
Repeat the process a few times and take the average capacity.
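The steps above can be sketched as a short script (the 1.15 loss factor is the poster's own rough guess for charger inefficiency, not a measured value):

```python
def estimate_capacity_mah(charge_time_hours, charger_output_ma, loss_factor=1.15):
    """Rough capacity estimate from a full charge of a switched-off phone.

    Divides the charge time by a loss factor (1.15 here, the thread's
    guess for energy lost in charging) and multiplies by the charger's
    rated output current. This is a ballpark figure only.
    """
    fred = charge_time_hours / loss_factor  # the "Fred" number
    return charger_output_ma * fred

# Worked example from the post: 1.5 h on a 700 mA charger
print(round(estimate_capacity_mah(1.5, 700)))  # 913 mAh
```

Averaging the result over several charge cycles, as the post suggests, smooths out timing errors.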

Related

High powered USB car charger?

Hi all,
Can anyone recommend a high mAh output usb car charger? It has to be one with a detachable USB lead.
The one I have currently takes forever just to charge the device by a 1% increment. It also doesn't seem to provide enough power when, for example, I have sat-nav/GPS running (the device still drops in battery power).
Thanks.
Don't even think....
Don't even think about it.... I got a charger that does 2 amps instead of 1 amp, and guess what, my battery blew up!
So what's optimal/maximum amp rating that I can use?
The one I have is, I would say, pretty much useless with battery-hungry applications/services.
Just tried to check my existing charger but there is no rating on it.
Would I be right in saying the following:
A charger with a 1000 mAh rating would charge my battery by 1000 mA in an hour?
I believe HTC official chargers have a rating of 1000 mAh too, right? Mine may well be 500, I would guess.
How quickly do other people's car chargers charge their Diamonds?
sh500 said:
So what's optimal/maximum amp rating that I can use?
Click to expand...
Click to collapse
A charger's specification would never indicate an mAh (milliamp-hour) rating; it indicates the maximum current it can supply while maintaining its operating voltage (for USB, that's 5 volts).
In answer to your question: yes, your charger needs to supply more current when you have your Diamond operating and charging at the same time. Not all chargers are made equal. Some may max out at 500mA, in which case your Diamond won't charge at all if it's on. As far as I know, most chargers are rated to supply up to 2A (2000mA).
Another thing: your Diamond uses its own charging circuitry to recharge and maintain its battery. Even if a charging adapter says it can supply 1000mA, I doubt it would actually recharge your battery from 0% to 100% in an hour (it just doesn't work that way, and if it did, your battery could blow up).
As for my own Diamond, it seems to take around 3-4 hours to get from 0% to full when it is off, using my stock 950mAh battery.
Doing a little math here: 950mAh / 4 hours = ~240mA.
Therefore, in order to recharge your battery, the charging adapter needs to supply about 240mA.
But if your Diamond is ON and you want to recharge, then your charging adapter needs to supply that 240mA AND an additional amount of current to maintain your Diamond's power.
If you're still able to follow what I'm saying here, you may conclude that you just have a dud charger and you should buy another one.
As for the other guy who said a 2A charger blew his battery up: I'm a bit skeptical. I think the charging circuit in your Diamond is more likely to fry before the battery blows up (and if a lithium battery blew up, it would have taken out his entire Diamond).
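The back-of-the-envelope maths in this post can be written out as follows (the 950 mAh capacity and 4 hour charge time are the poster's own figures; the 150 mA phone-on draw is a hypothetical extra for illustration):

```python
def required_charger_ma(capacity_mah, charge_hours, device_draw_ma=0):
    """Average current needed to fill the battery in the given time,
    plus whatever the running phone draws on top of that."""
    return capacity_mah / charge_hours + device_draw_ma

print(required_charger_ma(950, 4))        # 237.5 mA with the phone off
print(required_charger_ma(950, 4, 150))   # 387.5 mA with a (hypothetical) 150 mA phone-on draw
```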
Yep, that all makes sense.
By chance, my battery (1800mAh) totally died last night. I put it on the car charger and after almost exactly an hour's worth of charging, the battery indicated 1% (!). Mind you, TomTom was running for about 30 minutes of that.
Ok time to buy a new higher rated charger I think. Any recommendations for one with a USB port on it?
Thanks.
I've been looking for one like that on eBay as well, but I cannot seem to find one. Having a detachable USB cord would be nice, but now that I think about it, maybe I am better off finding one with a non-detachable cable in the event that I don't have a USB cable around.
Yeah, I wouldn't normally mind one with an attached cord, but the setup in my car is such that I already have a semi-hard-wired USB from a 12V supply, with the [USB] cable hidden so it pops out near my car holder.
bingo
http://cgi.ebay.ca/USB-Cable-Car-Ch...|66:2|65:12|39:1|240:1318|301:0|293:12|294:50
Check out Avantek. This charger works so much faster than any other charger I have. My Note goes from zero to hero in no time flat.

Is it true? Higher amperes = faster charging time?

Please help, because my charger is rated at 0.1A, which means 100mA only? (wth) I googled and found that 700mA is the OEM battery charger rating for the Xperia X1; some use 1A, others 1.5A. Please help, thanks. I'm buying a new charger rated at 1A so it will charge faster.
Yes, the higher the mA the more juice can be drawn from it.
Many people think that it means it will only put out that current, but current is drawn not pushed, and an electrical device will draw as much as it needs.
With a NiMH-type rechargeable battery, there's a simple formula to work out the charging time.
C is the capacity of the battery.
t = 1.4C / charge current (mA)
So a 1000mAh NiMH battery charged at 1000mA would take 1.4 hours to charge.
However, lithium batteries are not simple to charge without blowing them up, hence the need for a charging circuit.
The charging circuit should take only as much current as it needs to charge the battery safely, so a 2A charger would probably be overkill, although it would most likely enable you to run TomTom, Opera, and watch a movie while charging in the quickest possible time.
If you're charging while using the device heavily (GPS/Wi-Fi/3G browsing) then a 1A charger would be better, but if you normally just leave the phone charging without using it, then 600mA normally does the job.
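The NiMH rule of thumb above is simple enough to sketch directly (it applies to NiMH only, as the post says; lithium chemistries charge differently):

```python
def nimh_charge_hours(capacity_mah, charge_current_ma):
    """Classic NiMH rule of thumb: t = 1.4 * C / I, where the factor
    1.4 accounts for charging inefficiency in NiMH cells."""
    return 1.4 * capacity_mah / charge_current_ma

print(nimh_charge_hours(1000, 1000))  # 1.4 hours, as in the post
print(nimh_charge_hours(700, 350))    # 2.8 hours at a gentler C/2 rate
```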
I think a 2A charger would kill the battery. Can you suggest a 1A OEM HTC charger for my Xperia X1? Thanks.
henryfranz2005 said:
i think 2A charger would kill the battery. Can you suggest a 1A charger OEM htc for my xperia x1? thanks
Click to expand...
Click to collapse
You've fallen for what I mentioned in my second sentence.
A 2A power supply does not only provide 2A, it provides anything up to 2A.
So if your phone only draws 1A, it will only provide 1A.
The phone is the charger, the thing we think of as the charger is actually just a power supply.
Unless someone has the spec sheets for the charging circuit in the phone, we don't know the maximum rate at which it will charge the battery.
One way to find out would be to discharge the battery to a level where the phone won't turn on, then without turning the phone on, set it charging.
Time how long it takes for the LED to turn green.
Divide 1230 by the time in hours that it took and you've got roughly the current drawn to charge it.
Say it takes 90 minutes with a 1A power supply, so that's a maximum charge rate of 800mA, so even if you connected it up to a 5A power supply, it will still only charge at 800mA.
So, you connect it up to your 1A power supply, that means with the phone on you've got a "spare" 200mA to play with.
If the phone isn't using more than 200mA to just "run" itself, then you'll charge a battery in the 90 minutes.
However, say you start your sat-nav app and it draws 400mA (a guess); the charging circuit drops to using only 600mA, taking longer to charge but allowing you to find where you're driving to.
While you're navigating to a restaurant, you want to phone ahead to confirm the reservation, so you open up Opera and search Google over a 3G connection; that takes another 400mA (a guess). The charging circuit now has only 200mA to use. Your battery isn't getting much charge.
Imagine using a 600mA power supply instead and you can see how you could get to the situation where, despite being plugged in, your battery is running down.
I've used 400mA to demonstrate the impact; of course the real values are lower, otherwise you'd only get an hour's use out of having GPS and 3G enabled. Hmm, then again...
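The headroom bookkeeping in this post can be sketched as follows (the 1 A supply and 800 mA maximum charge rate come from the post's own 90-minute estimate; the 400 mA app draws are the post's admitted guesses):

```python
def charge_budget_ma(supply_ma, max_charge_rate_ma, app_draws_ma):
    """Current left over for charging once running apps take their share.

    The phone's charging circuit never exceeds its own maximum rate,
    and never more than what the supply provides after the apps' draw.
    """
    leftover = supply_ma - sum(app_draws_ma)
    return max(0, min(max_charge_rate_ma, leftover))

# 1 A supply, 800 mA max charge rate (estimated from a 90-minute charge)
print(charge_budget_ma(1000, 800, []))          # 800 mA: full-rate charging
print(charge_budget_ma(1000, 800, [400]))       # 600 mA with sat-nav running
print(charge_budget_ma(1000, 800, [400, 400]))  # 200 mA with sat-nav + browser
```

Swap in a 600 mA supply and the same function shows the budget going to zero, i.e. the battery running down while plugged in.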
Thanks for helping me here. I've decided to buy a new charger (1A), because my current one can't supply enough current (it's rated at 0.1A, believe me; I thought I'd read the specs wrong, but it takes roughly 18 hours to fully charge my battery).
I input my battery specs here http://www.csgnetwork.com/batterychgcalc.html
and yeah I think the computation is correct. Thanks SIR XACCERS
xaccers said:
You've fallen for what I mentioned in my second sentance.
Click to expand...
Click to collapse
You're correct, sir. It would take 90 minutes to charge my phone using 1A.
You're so cool, sir.
henryfranz2005 said:
you're correct sir. It would take 90MINUTES to charge my phone using 1A
you're so cool sir
Click to expand...
Click to collapse
Happy to have enlightened!
I just wish I could have been more helpful in suggesting which one to buy.
Just be careful using a charger with a high amp rating. I have two chargers: one charges my phone in more or less an hour, the other in almost half a day (so I don't use it).
One time, my battery got drained, so, no problem, I plugged it in to charge. To my horror, it wasn't charging (no blinking light; normally I get a blinking red light and the power button emits a red light). I wasn't all that bothered; I thought it might need some more time to charge, so I left it alone. But the day was fast ending without anything happening, and I needed my phone the next day for work. So I went to have it checked; the tech said it was a battery problem, so I just bought a replacement battery.
After 2 days, the same thing happened. Not charging, Red light blinking and annoying me to death. I went and had the battery replaced again.
A few days passed, so far so good, nothing happened. I just made sure that I don't let my battery drain and charge it as soon as it falls below half.
One night, I attended a party and wasn't paying too much attention to my phone. You can guess what happened, the battery went dead. I couldn't find my (fast-charging) charger, so I used the other one while I looked for it- still wasn't charging.
I couldn't find it, so I got ready to go have the battery replaced again. But then it blinked. I thought my mind might be playing tricks with me. It blinked again. (actually it wasn't blinking, it was kind of like that slow color-changing when you open the phone). I pressed the power button. It's alive!
This happened several times already, so to make a long story short: it's the charger's fault. Now it's the charger with the low ampere rating that I bring with me, even if it does charge slowly. I only use the other one when I'm pressed for time.
Sorry for the long post. Just wanted to share my story.
Sounds like it's the fault of the battery monitor in the phone letting the voltage of the battery drop too low, damaging the cell.
A low-current charge can often bring such a damaged cell back to life, whereas a full-current charge is likely to expose the damage and kill off the battery.
There are several things which damage lithium cells.
Heat is one of them, which is why, if you're using a laptop that allows it, it's better to run off the mains where possible with the battery out. Of course this isn't always practical, and if someone knocks the power lead, off goes your laptop. With our phones it's not an option. Charging also produces heat; the higher the current, the hotter it gets, so short top-up charges are better than long charges.
Discharging them too low damages them. The phone should prevent this by stopping you being able to power on the phone if the voltage is too low, however it could be misreading the voltage. Sometimes they can be revived if the voltage hasn't dropped too far below the minimum, with a low current charge, but the damage would have been done so the battery wouldn't last as long as an undamaged one treated the same way and of the same age.
Time. It's a killer. From the moment of manufacture the battery's internal contacts start losing efficiency, giving the result of lower capacity over time. Heat increases this. There's nothing you can really do about it, just remember there's no point buying a spare battery to use in the future when your original one finally stops holding enough charge, by then the spare would have degraded too, so buy replacement batteries when you need them, not before.
xaccers said:
You've fallen for what I mentioned in my second sentance.
Click to expand...
Click to collapse
Spot on! You really hit it. Look at it again as in this analogy: you have a 2mm-diameter water pipe being fed from a 10mm-diameter pipe; you can't get more into the 2mm pipe than it can take. Reversing the scenario, the same inference can be drawn!
bR

Charging Nexus S Faster

Is it dangerous to charge a Nexus S with a Nexus One charger?
The Nexus S charger is just 700 mA, but the Nexus One charger is 1000 mA.
So it can charge the Nexus S faster, but can this be harmful for the Nexus S?!
Thanks
It likely is no problem since the charging is controlled by the phone and the charger only supplies the voltage/current. Just make sure that the voltage is the same.
Yes, the voltage is the same, 5 volts.
But the current is different; you mean that just the voltage is important?
Also, I had an HTC Nexus One car charger (because I have an N1 too); its voltage is again 5 volts, but its current is 2000 mA!!
After I saw that current I bought a Samsung car charger; it's exactly 5 volts and 700 mA.
I'm not sure about the Nexus S; can it handle 2000 mA?
If it is possible, I think it will charge the Nexus S 2 or 3 times faster!
Your charger is a voltage source. That means that it will try to keep the voltage at the specified 5V. How much current comes out depends on the resistance of the circuit.
The 700mA is the maximum current you can take out of it before the voltage begins to drop.
There are quite a few people who use stronger chargers, and as far as the battery is concerned there should not be any problems up to at least 1500mA (which would be a 1C charging rate for this battery).
Considering the charging characteristics of LiIon batteries I don't know if it will reach full charge any sooner, but some people reported faster charging.
cgi said:
Your charger is a voltage source. That means that it will try to keep the voltage at the specified 5V. How much current comes out depends on the resistance of the circuit.
Click to expand...
Click to collapse
Thanks! I didn't know about this.
Shouldn't be a problem unless the voltages are different; I'm charging mine with a Nokia 5800 cable connected to a USB port in my keyboard. Slow as heck, but it charges.
The formula is:
W = V x I x T
W refers to the energy created or used.
V refers to the voltage; it should be constant at 5 volts DC (for AC it would be 110 V or 220 V).
I refers to the current flowing in and out.
T refers to time or duration.
As you can see, a bigger current will increase the result, because it's multiplied by time.
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts. And it will charge faster.
But remember one thing: don't charge too long, because more time means more energy, which your device can't stand. For example, don't charge overnight.
CMIIW
rejanmanis said:
The formula was :
W = V x I x T
Click to expand...
Click to collapse
You are oversimplifying. What you are saying is true only for an ideal ohmic load resistor, and your phone is anything but that. In our case the current I depends on time (among other things) in a non-trivial way.
So actually the energy used/transmitted is
W = ∫ V · I(t) dt
assuming that V is constant. I depends on the charging circuit, the charger, and how full the battery is at the moment. If you hit the current limit of the charger, then even V is not constant anymore, but an (unknown) function of I.
So on my opinion, it will not damage ur device as long the voltage stays at 5 volt.
Click to expand...
Click to collapse
This is likely true, assuming that the charging circuit in the phone limits the current to a level safe for the whole power train: not only the battery itself, but also the cable, the connector, the traces on the circuit board, the charging transistor, the inductor (assuming switched-mode charging; I'm not sure whether it is), and so on.
And it will charge faster.
Click to expand...
Click to collapse
That depends on the charging circuit and the battery protection circuit.
But remember one thing. Don't charge too long, coz more time will create more power which ur device can't stand. For ex. Dont Charge overnight.
Click to expand...
Click to collapse
You are still stuck in the age of constant-current, dumb lead-acid battery chargers. Nowadays we have "intelligent" chargers that monitor the state of the battery to ensure it doesn't go up in smoke from overcharging or too-high charge rates.
Any sane design is current-limited by the on-board charging circuit. I don't see why Samsung would have built anything else.
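The integral W = ∫ V · I(t) dt above can be approximated numerically from sampled current draws; the taper profile below is purely illustrative, since a real phone's profile depends on its charging circuit:

```python
def delivered_energy_wh(voltage_v, current_profile_ma, step_hours):
    """Numerically integrate W = ∫ V * I(t) dt over sampled current draws.

    current_profile_ma is a list of current samples (in mA), one taken
    every step_hours. Assumes the supply voltage stays constant.
    """
    mah = sum(current_profile_ma) * step_hours     # milliamp-hours drawn
    return voltage_v * mah / 1000.0                # convert to watt-hours

# Hypothetical CC/CV-style profile: constant 1000 mA, then tapering off,
# sampled every 15 minutes over 3 hours
profile = [1000] * 8 + [800, 600, 400, 200]
print(delivered_energy_wh(5.0, profile, 0.25))  # 12.5 Wh
```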

[Q] Fast charge battery

I usually let it charge overnight. However, I found today that it charged very fast: less than 2 hours from 30% to 100%. I used a Galaxy Note 2 USB charger. Do you have the same charging experience, or does my battery have problems?
Yes, same with me.
When your battery was at 30%, the charger had to supply 1610mAh (30% of 2300mAh is 690mAh). The Galaxy Note 2 charger has an output of 2000mA. So you can imagine it won't take very long.
[update] Hm, I misread Note for Tab. I have a Tab 2 with a 2A charger. Not sure what the Note 2 charger can output, but I'm guessing it will be above average.
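The percentage arithmetic above, plus a naive charge-time estimate, can be sketched as follows (the 2300 mAh capacity and 2 A charger rating come from the posts; the linear assumption is knowingly naive, as later replies explain):

```python
def naive_charge_time_hours(capacity_mah, start_percent, charger_ma):
    """Naive linear estimate: remaining mAh divided by charger current.
    Real charging tapers off near full, so this underestimates somewhat."""
    remaining_mah = capacity_mah * (100 - start_percent) / 100
    return remaining_mah / charger_ma

# 2300 mAh battery at 30%, 2 A charger
print(naive_charge_time_hours(2300, 30, 2000))  # 0.805 hours
```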
Petrovski80 said:
When your battery was at 30%, it means the charger had to charge 1610mA (30% of 2300mA is 690mA). The Galaxy Note 2 charger has an output amperage of 2000mA. So you can imagine it won't take very long.
[update] Hm I misread note for tab. I have a tab 2 with a 2A charger. Not sure what the note 2 charger can output, but I'm guessing it will be above average.
Click to expand...
Click to collapse
Thanks. I will check the output of note 2 usb charger and do the math.
Stock Nexus 5 charger also charges it from 0 to 100% in less than 2 hours.
Don't forget that the device's kernel determines how much current is drawn from a charger, not how much the charger is rated for.
Not all milliamps are the same
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. It does not automatically follow that a 2000mAh battery will take 2 hours to charge from a 1000mA charger, or that the charge current will be 1000mA. Charge current can easily - and safely - be higher than the mA rating of the charger. Or lower.
The N5 battery is rated at 3.8V 2300mAh (typical) and, crucially, 8.74 watt hours. A 5V 1000mA charger can supply a maximum of 5 watts (5 volts x 1 amp). Voltage converters within the N5 change this 5 watts of power from 5V to 3.8V to suit the battery - and this could be at about 1250mA (assuming a not-unreasonable 95% conversion efficiency).
The battery voltage varies with the state of charge, reaching about 4.2V when fully charged. Even then, the charge current could be as high as 1130mA without drawing more than 1000mA from the 5V charger.
An earlier poster pointed out that charging is under control of the CPU (I suspect instead a dedicated charging circuit but that's irrelevant) and it is very likely that a) the charging current varies significantly during the charging cycle and b) it is unlikely that the charging circuit demands precisely the maximum that the charger can supply. But it is quite likely that the actual current being put into the battery is numerically higher than that being drawn from the source. It's the power in watts that counts, not the number of milliamps.
Batteries are not perfect, meaning you don't get out all you put in. If the battery was completely flat you would have to put in more than 8.74Wh to bring it up to full charge (although a totally flat Li-ion battery is dead beyond redemption; the battery level shown on the screen is the usable capacity, not the ultimate power capacity).
Sometimes the charger rating, battery capacity and charge time seem to line up, but that's more due to a happy accident than anything else. A 40,000mA charger won't juice your phone from flat in four minutes!
Batteries, and charging, are complex...
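The watts-based reasoning in this post can be written out directly (the 95% converter efficiency is the post's own assumption, and the battery voltages are the N5 figures it quotes):

```python
def battery_side_current_ma(charger_v, charger_ma, battery_v, efficiency=0.95):
    """Convert a charger's output into current at the battery's voltage.

    Power in watts is what's conserved (less converter losses), so the
    battery-side current can exceed the charger's mA rating whenever
    the battery voltage is below the charger voltage.
    """
    watts_in = charger_v * charger_ma / 1000.0   # e.g. 5 V * 1 A = 5 W
    watts_out = watts_in * efficiency            # after conversion losses
    return watts_out * 1000.0 / battery_v        # back to mA at battery voltage

print(round(battery_side_current_ma(5.0, 1000, 3.8)))  # 1250 mA into a 3.8 V battery
print(round(battery_side_current_ma(5.0, 1000, 4.2)))  # 1131 mA when nearly full
```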
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship.They don't. It does not automatically follow that a 2000mAh battery will take 2 hours to charge from a 1000mA charger, or that the charge current will be 1000mA. Charge current can easily - and safely - be higher than the mA rating of the charger. Or lower.
Click to expand...
Click to collapse
This. Well said.
Your suspicions are correct, it does have a dedicated charging circuit; a dedicated chip is responsible for charging. Input current appears to be capped at 1200mA. I measured with my DMM last night and never saw the phone draw more than 960mA while charging with the screen off. It stayed like that until the battery was around 95% charged, then gradually tapered off as the battery reached 100%.
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship.They don't. It does not automatically follow that a 2000mAh battery will take 2 hours to charge from a 1000mA charger, or that the charge current will be 1000mA. Charge current can easily - and safely - be higher than the mA rating of the charger. Or lower.
Click to expand...
Click to collapse
Thanks a lot. It did look complicated. As long as the fast charging is normal, I don't worry too much.
Can anyone recommend an app that shows real time current draw? It would also be cool if the app showed how much power the phone is using in real time.
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship.They don't. It does not automatically follow that a 2000mAh battery will take 2 hours to charge from a 1000mA charger, or that the charge current will be 1000mA. Charge current can easily - and safely - be higher than the mA rating of the charger. Or lower.
The N5 battery is rated at 3.8V 2300mAh (typical) and, crucially, 8.74 watt hours. A 5V 1000mA charger can supply a maximum of 5 watts (5 volts x 1 amp). Voltage converters within the N5 change this 5 watts of power from 5V to 3.8V to suit the battery - and this could be at about 1250mA (assuming a not-unreasonable 95% conversion efficiency).
The battery voltage varies with the state of charge, reaching about 4.2V when fully charged. Even then, the charge current could be as high as 1130mA without drawing more than 1000mA from the 5V charger.
An earlier poster pointed out that charging is under control of the CPU (I suspect instead a dedicated charging circuit but that's irrelevant) and it is very likely that a) the charging current varies significantly during the charging cycle and b) it is unlikely that the charging circuit demands precisely the maximum that the charger can supply. But it is quite likely that the actual current being put into the battery is numerically higher than that being drawn from the source. It's the power in watts that counts, not the number of milliamps.
Batteries are not perfect, meaning you don't get out all you put in. If the battery was completely flat you would have to put in more than 8.74Wh to bring it up to full charge (although a totally flat Li-ion battery is dead beyond redemption; the battery life shown on the screen is the usable life, not ultimate power capacity).
Sometimes the charger rating, battery capacity and charge time seem to line up, but that's more due to a happy accident than anything else. A 40,000mA charger won't juice your phone from flat in four minutes!
Batteries, and charging, are complex...
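The arithmetic in that post can be checked in a few lines. A minimal sketch; the 95% converter efficiency is the post's working assumption, real figures vary:

```python
# Power math from the post: a 5V 1A charger supplies 5W, which the
# phone's converter turns into a higher current at the battery's
# lower voltage. Efficiency of 95% is an assumption.
CHARGER_V, CHARGER_A = 5.0, 1.0   # 5V 1000mA charger
EFFICIENCY = 0.95                 # assumed converter efficiency

charger_power_w = CHARGER_V * CHARGER_A        # 5.0 W available

# At the nominal 3.8V battery voltage:
current_nominal_a = charger_power_w * EFFICIENCY / 3.8
# At the ~4.2V full-charge voltage:
current_full_a = charger_power_w * EFFICIENCY / 4.2

print(f"{current_nominal_a * 1000:.0f} mA at 3.8V")  # ~1250 mA
print(f"{current_full_a * 1000:.0f} mA at 4.2V")     # ~1131 mA
```

Which matches the figures above: more milliamps can flow into the battery than are drawn from the 5V charger, because it's the watts that are conserved, not the milliamps.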
True. I never said there was a fixed relationship, though. They do have a loose relationship: charging with a 500mA charger will take longer than charging with a 2000mA one, since nearly every modern phone accepts a charging current higher than 500mA.
Another aspect not addressed in my reply is that the charge process isn't linear. But without going into too much electronics, I just wanted to explain to the OP that he shouldn't have to worry if he notices differences in charging times when using chargers of different amperage output.
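As a back-of-the-envelope illustration of that loose relationship (an idealized sketch: it assumes constant current and a guessed 90% charge efficiency, and ignores the nonlinear taper near full, so real times run longer):

```python
# Rough charge-time estimate. The 90% efficiency figure and the
# constant-current assumption are illustrative, not measured values.
def rough_charge_hours(capacity_mah, charger_ma, efficiency=0.9):
    """Idealized hours to fill the battery at a constant current."""
    return capacity_mah / (charger_ma * efficiency)

# A 2300mAh battery (N5-sized) on chargers of different ratings:
for charger_ma in (500, 1000, 2000):
    hours = rough_charge_hours(2300, charger_ma)
    print(f"{charger_ma} mA charger: ~{hours:.1f} h")
```

The point being: halving the charger's output roughly doubles the estimate, but the real curve flattens near full, so actual times don't scale exactly.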
Today's batteries are much improved
wolfca said:
Thanks a lot. It did look complicated. As long as the fast charging is normal, I don't worry too much.
That's the ticket. When used with the correct charger, a modern phone battery takes a couple of hours to charge fully, a bit longer with a lower-rated charger. Or you can top up a bit if you have a few minutes spare. It's much better than the early mobiles with Ni-Cd batteries that took overnight to charge. And required weightlifting training before you could even pick them up!

How does charging a phone battery work?

1. For example, if a phone comes with a charger rated at 5V and 0.7A, when it's plugged in to charge, what dictates the current the phone draws? Is it the resistance of the phone?
2. If I = V/R, do phones typically present little resistance so that the current is the maximum the charger can provide? I.e. in the above example, if the phone was off, would it constantly be drawing 0.7A? And if the charger was swapped for one rated at 5V and 2A, would the phone draw more than 0.7A? Could it reach 2A?
3. Bit of a side question, but when the phone is done charging, how does it stop drawing current? Again, if I = V/R, does the phone have to alter the amount of resistance it presents? How does it do that?
I'm only looking for fairly simple answers to be honest as this is just a general query and not something I need to go in depth with.
Thanks.
bubu1 said:
I'm only looking for fairly simple answers to be honest as this is just a general query and not something I need to go in depth with.
Thanks.
Of course, I'm not privy to the secrets of the companies that build phones, but, as an electronics engineer, I can answer general questions.
First, you have to know that the charger itself has no idea of the size of the battery it has to charge. It is just a regulated voltage source with a limit on the maximum current. The charger holds the voltage at 5V, and when more amps are asked of it than it is designed to supply, it will reduce the voltage in order to limit the current to that maximum value.
It is the phone itself which regulates the way it charges its battery. It follows complex curves, depending on how it is programmed, fast charge and so on. Of course, if the source limits the current, the phone will do its best with that maximum current. For the phone, a 5V source with unlimited current is ideal. Fast charging consists of sending the maximum current the battery can tolerate at the beginning of the charge (when empty), then progressively limiting this current as the battery approaches the end of the charge. This applies to lithium-ion batteries. Old nickel-cadmium batteries, for example, used the "delta peak" method: the battery was fed a constant current (1A for a 1Ah battery) and the voltage was monitored. This voltage increased with the charge, then, after a maximum, began to decrease once the charge reached about 90%. At that point, the current was reduced to 1/20 so as never to overcharge the battery.
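The fast-charge behaviour described above (full current while empty, tapering off near full) can be sketched as a toy simulation. All the numbers and the linear taper are illustrative; a real charge controller uses measured cell voltage, not a simple fraction:

```python
# Toy model of the Li-ion charge profile described above: constant
# current for most of the charge, then a taper as the cell nears full,
# terminating at a small cutoff current. Purely illustrative numbers.
def simulate_charge(capacity_mah=2300, cc_ma=1000, taper_start=0.8,
                    cutoff_ma=100, step_h=0.01):
    """Return approximate hours to charge under a simple CC + taper model."""
    charge_mah, hours = 0.0, 0.0
    while True:
        frac = charge_mah / capacity_mah
        if frac < taper_start:
            current = cc_ma                      # constant-current phase
        else:
            # linear taper toward the cutoff as the cell approaches full
            remaining = (1 - frac) / (1 - taper_start)
            current = max(cutoff_ma, cc_ma * remaining)
        if frac >= 1.0 or current <= cutoff_ma:
            break                                # charge terminated
        charge_mah += current * step_h
        hours += step_h
    return hours

print(f"~{simulate_charge():.1f} h with a 1A limit")
print(f"~{simulate_charge(cc_ma=500):.1f} h with a 0.5A limit")
```

Note how the last stretch takes disproportionately long: that taper is why the final few percent of a real charge crawls compared to the first 80%.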
