Charging Nexus S Faster - Nexus S Q&A, Help & Troubleshooting

Is it dangerous to charge the Nexus S with the Nexus One charger?
The Nexus S charger is rated at just 700 mA, but the Nexus One charger is rated at 1000 mA,
so it could charge the Nexus S faster. But could this be harmful for the Nexus S?
Thanks

It likely is no problem since the charging is controlled by the phone and the charger only supplies the voltage/current. Just make sure that the voltage is the same.

Yes, the voltage is the same, 5 volts,
but the current is different. You mean that only the voltage is important?
Also, I have an HTC Nexus One car charger (because I have an N1 too); its voltage is again 5 volts, but its current rating is 2000 mA!
After I saw that rating I bought a Samsung car charger that is exactly 5 volts and 700 mA.
I'm not sure about the Nexus S: can it handle 2000 mA?
If it can, I think it would charge the Nexus S two or three times faster!

Your charger is a voltage source. That means it will try to keep the voltage at the specified 5 V; how much current flows depends on the load presented by the circuit.
The 700 mA is the maximum current you can draw from it before the voltage begins to drop.
Quite a few people use stronger chargers, and as far as the battery is concerned there should not be any problem up to at least 1500 mA (which would be a 1C charging rate for this battery).
Given the charging characteristics of Li-ion batteries I don't know if it will reach full charge any sooner, but some people have reported faster charging.
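For illustration, a quick back-of-envelope sketch of those numbers (assuming a ~1500 mAh pack, per the 1C = 1500 mA figure above; the adapter ratings are only upper bounds on what the phone may actually draw):
```python
# Rough numbers from the discussion above. The ~1500 mAh pack size is assumed.
BATTERY_MAH = 1500

def c_rate(charge_current_ma, capacity_mah=BATTERY_MAH):
    """Charge rate as a multiple of capacity: 1C would fill the pack in one hour."""
    return charge_current_ma / capacity_mah

for adapter_ma in (700, 1000, 2000):
    # Worst case only: the phone's charging circuit decides the real current;
    # the adapter rating is just an upper bound on what it may draw.
    print(f"{adapter_ma} mA adapter -> at most {c_rate(adapter_ma):.2f} C")
```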

cgi said:
Your charger is a voltage source. That means it will try to keep the voltage at the specified 5 V; how much current flows depends on the load presented by the circuit.
The 700 mA is the maximum current you can draw from it before the voltage begins to drop.
Quite a few people use stronger chargers, and as far as the battery is concerned there should not be any problem up to at least 1500 mA (which would be a 1C charging rate for this battery).
Given the charging characteristics of Li-ion batteries I don't know if it will reach full charge any sooner, but some people have reported faster charging.
Click to expand...
Click to collapse
Thanks! I didn't know about this.

Shouldn't be a problem unless the voltages are different. I'm charging mine with a Nokia 5800 cable connected to a USB port on my keyboard. Slow as heck, but it charges.

The formula is:
W = V x I x T
W refers to the energy created or used.
V refers to the voltage, which should be constant at 5 volts DC; for AC it would be 110 V or 220 V.
I refers to the current flowing in and out.
T refers to the time or duration.
As you can see, a bigger current will increase the result, because it is multiplied by time.
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts. And it will charge faster.
But remember one thing: don't charge too long, because more time means more energy, which your device can't stand. For example, don't charge overnight.
CMIIW
Sent from my Nexus S I9020T

rejanmanis said:
The formula is:
W = V x I x T
W refers to the energy created or used.
V refers to the voltage, which should be constant at 5 volts DC; for AC it would be 110 V or 220 V.
I refers to the current flowing in and out.
T refers to the time or duration.
As you can see, a bigger current will increase the result, because it is multiplied by time.
Click to expand...
Click to collapse
You are oversimplifying. What you are saying is true only for an ideal ohmic load resistor, and your phone is anything but that. In our case the current I depends on time (among other things) and is a non-trivial function.
So actually the energy used/transmitted is
W = ∫ V x I(t) dt
assuming that V is constant. I depends on the charging circuit, the charger, and how full the battery is at the moment. If you hit the current limit of the charger, then even V is not constant anymore, but an (unknown) function of I.
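As a purely illustrative sketch of that integral, here is a trapezoidal-rule approximation over an invented current profile (a real profile depends on the charging circuit and the state of charge):
```python
# Approximating W = integral(V * I(t) dt) with the trapezoidal rule.
# The current profile below is invented purely for illustration.
V = 5.0                              # adapter voltage, assumed constant [V]
t = [0, 1800, 3600, 5400, 7200]      # sample times [s]
i = [0.70, 0.70, 0.55, 0.30, 0.05]   # sampled charge current [A]

energy_j = sum((i[k] + i[k + 1]) / 2 * V * (t[k + 1] - t[k])
               for k in range(len(t) - 1))
print(f"~{energy_j / 3600:.2f} Wh delivered")   # 1 Wh = 3600 J
```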
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts.
Click to expand...
Click to collapse
This is likely true, assuming that the charging circuit in the phone limits the current to a level that is safe for the whole power train: not only the battery itself, but also the cable, the connector, the traces on the circuit board, the charging transistor, the inductor (assuming switched-mode charging, which I'm not sure about) and so on.
And it will charge faster.
Click to expand...
Click to collapse
That depends on the charging circuit and the battery protection circuit.
But remember one thing: don't charge too long, because more time means more energy, which your device can't stand. For example, don't charge overnight.
Click to expand...
Click to collapse
You are still stuck in the age of constant-current, dumb lead-acid battery chargers. Nowadays we have "intelligent" chargers that monitor the state of the battery to ensure it doesn't go up in smoke from overcharging or too-high charge rates.
Any sane design is current-limited by the on-board charging circuit. I don't see why Samsung would have built anything else.

Related

High powered USB car charger?

Hi all,
Can anyone recommend a USB car charger with a high current (mA) output? It has to be one with a detachable USB lead.
The one I have currently takes forever just to charge the device by a 1% increment. It also doesn't seem to provide enough power when, for example, I have sat-nav/GPS running (the battery still drains).
Thanks.
Don't even think...
Don't even think about it... I got a charger that does 2 amps instead of 1 amp, and guess what, my battery blew up!
So what's the optimal/maximum amp rating I can use?
The one I have, I would say, is pretty much useless when running battery-hungry applications/services.
I just tried to check my existing charger but there is no rating on it.
Would I be right in saying the following:
A charger with a 1000 mAh rating would charge my battery by 1000 mA in an hour?
I believe official HTC chargers have a rating of 1000 mAh too, right? Mine may well be 500, I would guess.
How quickly do other people's car chargers charge their Diamonds?
sh500 said:
So what's the optimal/maximum amp rating I can use?
The one I have, I would say, is pretty much useless when running battery-hungry applications/services.
I just tried to check my existing charger but there is no rating on it.
Would I be right in saying the following:
A charger with a 1000 mAh rating would charge my battery by 1000 mA in an hour?
I believe official HTC chargers have a rating of 1000 mAh too, right? Mine may well be 500, I would guess.
How quickly do other people's car chargers charge their Diamonds?
Click to expand...
Click to collapse
A charger's specification wouldn't indicate an mAh (milliamp-hour) rating; it indicates the maximum current the charger can supply while maintaining its operating voltage (for USB that's 5 volts).
In answer to your question: yes, your charger needs to supply more current when you have your Diamond operating and charging at the same time. Not all chargers are made equal; some max out at 500 mA, so your Diamond won't charge at all if it's on. As far as I know, most car chargers are rated to supply up to 2 A (2000 mA).
Another thing: your Diamond uses its own charging circuitry to recharge and maintain its battery. Just because a charging adapter is rated at 1000 mA, I doubt it would actually recharge your battery from 0% to 100% in an hour (it just doesn't work that way, and if it did, your battery could blow up).
As for my own Diamond, it seems to take around 3-4 hours to get from 0% to full when it is off, using my stock 950 mAh battery.
Doing a little math here: 950 mAh / 4 hours = ~250 mA average,
therefore in order to recharge your battery the charging adapter needs to supply at least 250 mA.
But if your Diamond is ON and you want to recharge, then your charging adapter needs to supply 250 mA AND an additional amount of current to keep the Diamond itself running.
If you're still following what I'm saying here, you may conclude that you just have a DUD charger and should buy another one.
As for the other guy who said a 2 A charger blew his battery up, I'm a bit skeptical. I think the charging circuit in the Diamond is more likely to fry before the battery blows up (and if a lithium battery really blew up, it would have taken out his entire Diamond).
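A small sketch of that estimate (all numbers illustrative: the ~150 mA "phone on" draw is a guess, and losses and the end-of-charge taper are ignored):
```python
# Rough charge-time estimate in the spirit of the math above.
def charge_hours(capacity_mah, supply_ma, phone_load_ma=0):
    surplus_ma = supply_ma - phone_load_ma   # current left over for the battery
    if surplus_ma <= 0:
        return float("inf")                  # phone uses it all; battery never fills
    return capacity_mah / surplus_ma

print(f"phone off:        {charge_hours(950, 250):.1f} h")
print(f"phone on:         {charge_hours(950, 250, 150):.1f} h")
print(f"stronger charger: {charge_hours(950, 500, 150):.1f} h")
```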
Yep, that all makes sense.
By chance, my battery (1800 mAh) totally died last night. I put it on the car charger and after almost exactly an hour's worth of charging the battery indicated 1% (!). Mind you, TomTom was running for about 30 minutes of that.
OK, time to buy a new higher-rated charger I think. Any recommendations for one with a USB port on it?
Thanks.
I've been looking for one like that on eBay as well but I can't seem to find one. Having a detachable USB cord would be nice, but now that I think about it, maybe I'm better off finding one with a non-detachable cable in case I don't have a USB cable around.
Yeah, I wouldn't normally mind one with an attached cord, but the setup in my car is such that I already have a USB cable semi-hard-wired from a 12 V supply and hidden so it pops out near my car holder.
bingo
http://cgi.ebay.ca/USB-Cable-Car-Ch...|66:2|65:12|39:1|240:1318|301:0|293:12|294:50
Check out Avantek. This charger works so much faster than any other charger I have. My Note goes from zero to hero in no time flat.

.7 Amp vs 1.0 Amp Charger

So the One X comes with a 1.0 A charger. That charger would not fit (too long) in the spot where I have been using my Samsung charger.
I compared the two and saw the only difference is 0.7 A vs. 1.0 A, and looked up whether it was OK to use. Some even reported longer battery life with the slower charger. Makes some sense: it just lasts longer.
Maybe I am crazy, but it seems like I get better battery life from the 0.7 A charger.
I have gone back and forth a few times... but of course not enough days to really tell.
Thoughts?
Definitely overthinking it. It will just charge slower.
eallan said:
Definitely overthinking it. It will just charge slower.
Click to expand...
Click to collapse
Yep, what he said.
To think of it another way, you can charge your phone via a USB connection to your computer, but it's much slower. This is because a standard USB port only provides 0.5 A.
However, going with an adapter that's HIGHER than 1.0 A could cause damage.
In the long run, slower charges will likely make your battery last more cycles. But on a per-cycle basis it should not give better battery life.
If you run your battery down while using the phone, the smaller charger (and definitely PC USB, since it maxes out at 500 mA) may not be able to charge at all. In normal cases it is fine.
Higher than 1 A won't damage anything. 0.7 A might not actually charge the battery to 100%, depending on whether the phone treats it as USB (500 mA charging mode) or as an AC charger (in which case it won't charge properly).
Sent from my Desire HD using XDA
c5satellite2 said:
Higher than 1 A won't damage anything. 0.7 A might not actually charge the battery to 100%, depending on whether the phone treats it as USB (500 mA charging mode) or as an AC charger (in which case it won't charge properly).
Sent from my Desire HD using XDA
Click to expand...
Click to collapse
This. The voltage is what really matters. The phone won't draw more than it needs to charge: if it draws 1 A while charging and you put it on a 2 A charger, it will still pull 1 A. But if the voltages don't match up and there isn't a protection circuit for that type of problem, you could fry the electronics with the higher current that results.
100% on the voltage! That is the important one. A 0.7 A charger probably won't be recognized as a real AC charger, resulting in the phone treating it as USB 500 mA charging mode, which is the same as a PC: slow. If the device does think the 0.7 A charger is AC charging mode, the battery probably will never charge to 100%. In fact, if you were at 100% and the phone was in AC charging mode with a 0.7 A charger, it would drain your battery! With more than 1 A, the phone will only draw what it is capable of. Some aftermarket chargers will still charge in USB 500 mA mode because the phone won't recognize them as 1 A AC chargers. I had an old HTC charger once that did funny things to my Inspire; it would open navigation every time it was plugged in (thought it was a car dock, I guess). Use real OEM chargers if possible: they will charge the fastest, and you won't have issues. The real HTC 1 A chargers are much faster than the aftermarket chargers I have tried. I have a good Kensington 1 A car and wall charger, and the HTC blows them away as far as speed!
Sent from my Desire HD using XDA
c5satellite2 said:
100% on the voltage! That is the important one. A 0.7 A charger probably won't be recognized as a real AC charger, resulting in the phone treating it as USB 500 mA charging mode, which is the same as a PC: slow. If the device does think the 0.7 A charger is AC charging mode, the battery probably will never charge to 100%.
Click to expand...
Click to collapse
Actually, AC charging mode vs. PC USB mode is not determined by amperage or voltage (they all use 5 V). It is determined by how the data pins are terminated. Most non-Apple chargers terminate the data pins the same way, so they will be recognized as AC chargers. Apple chargers are the ones that may not be recognized by the phone as AC chargers, because Apple terminates the data pins differently from everyone else. Regardless of the charging mode, they can all charge to 100% if they can initiate the charge.
Li-ion battery charging circuitry is much more complicated than a normal AA charger. It is regulated internally so that slight variations of the input voltage won't affect the charging. It has to control the charge process precisely so that it can terminate it at exactly the right moment (overcharging can result in an explosion).
Whether the input supply is rated 0.7 A or 1 A has no bearing on the level to which the battery is filled. Even the input voltage has no bearing! The only impact on the system is the rate at which you can charge.
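A toy sketch of that detection logic (deliberately simplified; real detection follows USB Battery Charging 1.2 plus vendor-specific schemes such as Apple's resistor dividers):
```python
# The phone classifies the port by how the D+/D- data lines are terminated,
# not by the adapter's current rating. Thresholds and cases are simplified.
def classify_port(dplus_shorted_to_dminus: bool, enumerated_by_host: bool) -> str:
    if dplus_shorted_to_dminus:
        return "dedicated charger (AC mode) - full charge current allowed"
    if enumerated_by_host:
        return "PC USB port - phone limits itself to ~500 mA"
    return "unrecognized - many phones fall back to the slow ~500 mA mode"

print(classify_port(dplus_shorted_to_dminus=True, enumerated_by_host=False))
print(classify_port(dplus_shorted_to_dminus=False, enumerated_by_host=True))
```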
Not true. Some chargers make the device think it is a real AC charger but do not provide enough current to charge to 100%. And yes, it has to do with the pinout. Some AC chargers aren't recognized properly, which results in USB charge mode even if they have more than 500 mA available. And even if your PC provides more than the USB standard of 500 mA, the phone will only draw 500 mA and charge slowly.
Don't mess with the voltage; it WILL damage your device! The previous post should be ignored, and deleted.
Sent from my Desire HD using XDA
I use my TouchPad charger. It's 5.3 V instead of 5 V and it definitely charges my phone faster than the HTC stock charger. No effect on battery life though.
Sent from my HTC One X using XDA
5.3 V or 5 V is OK because there is a little leeway, and I'm sure the 5.3 isn't exactly 5.3 anyway. Even if it were, the extra 0.3 V is probably within range and is not blowing things up, just stressing the components a little more, making some heat, and wearing them out slightly quicker.
Try a 12 V charger and see what happens. LOL.
Seriously, the 5.3 V charger might actually be closer to 5 V than a charger labeled 5 V, and therefore could result in better charging. Remember, the mains input voltage varies as well. Your electricity could drop under 100 V or go as high as 120 V; our electric grid is not very consistent. Supply is constantly being adjusted to meet demand, resulting in widely varying voltages. I have seen it dip into the low 90 V range on hot summer afternoons, and it is rarely in the 115-120 V range where it should be. A 5.0 V charger would give the best performance if it were actually putting out a TRUE 5 V. A high-quality, properly rated charger with consistent 115 V input power is ideal; in reality it doesn't exist.
5.3 V is close enough, and might actually be better. One way to tell: put a voltmeter on it!
If you have access to a voltmeter/multimeter, could you post your AC voltage at the outlet and the DC voltage coming out of the 5.3 V charger? Could be interesting. Also, how many amps is the charger rated for? I might have to get one.
Sent from my Desire HD using XDA
Don't have one, but this is the link to it:
http://www.amazon.com/gp/aw/d/B0055QYJJM
I'll provide the exact specs when I get home.
Sent from my HTC One X using XDA
Can you clarify what you mean by this?
c5satellite2 said:
Not true. Some chargers make the device think it is a real AC charger but do not provide enough current to charge to 100%.
Sent from my Desire HD using XDA
Click to expand...
Click to collapse
Either an input supply provides power or it doesn't. The only reason charging would "stop" is if the charger in the phone runs out of headroom, and I have yet to see this with any AC/DC or USB supply.
The whole issue of whether or not the phone identifies the power supply is an entirely separate discussion. But once it does identify the supply and begin charging, it will do so until completion.
c5satellite2 said:
A 5.0 V charger would give the best performance if it were actually putting out a TRUE 5 V.
Sent from my Desire HD using XDA
Click to expand...
Click to collapse
This is not true. Take two AC/DC adapters...
A) 5.3 V @ 0.8 A = 4.24 W
B) 5.0 V @ 0.8 A = 4.00 W
The HTC One X has an internal switching charger with dynamic input power limiting. So it will actually be able to draw MORE current from Adapter A than Adapter B. Also, because the charger in the One X is a switching charger there will be negligible extra heat generated and no excessive wear and tear.
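A sketch of what that power difference could mean at the battery side, assuming a ~90% conversion efficiency (an assumed figure, not a measurement):
```python
# With a switching charger the input *power* is what matters; it gets converted
# down to the ~3.8-4.2 V battery. Efficiency here is assumed, not measured.
def battery_side_ma(adapter_v, adapter_a, batt_v=3.8, efficiency=0.90):
    return adapter_v * adapter_a * efficiency / batt_v * 1000

print(f"A) 5.3 V @ 0.8 A -> ~{battery_side_ma(5.3, 0.8):.0f} mA into the battery")
print(f"B) 5.0 V @ 0.8 A -> ~{battery_side_ma(5.0, 0.8):.0f} mA into the battery")
```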
How did you know?
racerex said:
Can you clarify what you mean by this?
Either an input supply provides power or it doesn't. The only reason charging would "stop" is if the charger in the phone runs out of headroom, and I have yet to see this with any AC/DC or USB supply.
The whole issue of whether or not the phone identifies the power supply is an entirely separate discussion. But once it does identify the supply and begin charging, it will do so until completion.
This is not true. Take two AC/DC adapters...
A) 5.3 V @ 0.8 A = 4.24 W
B) 5.0 V @ 0.8 A = 4.00 W
The HTC One X has an internal switching charger with dynamic input power limiting. So it will actually be able to draw MORE current from Adapter A than Adapter B. Also, because the charger in the One X is a switching charger there will be negligible extra heat generated and no excessive wear and tear.
Click to expand...
Click to collapse
HTC has input power limiting? An internal switching charger? Where did you get this information?
The plugpack will be voltage-regulated (to protect against overvoltage with an under-designed load), not amperage-regulated.
The battery charges via a voltage float: the higher the voltage, the quicker the charge. But the voltage is regulated anyway, so there is no fast and slow charge; it is either charging or it's not. In mobile phones it is the charge circuit, not the plugpack, that steps the voltage up and down to give fast/slow charging.
1.0 A versus 0.7 A just means the rated output is lower. It does not change the load. All that happens is that the 0.7 A unit may be running over its rated output, which means running hotter and potentially undervoltage.
Running a 0.7 A supply is not good if the battery charge circuit is designed to draw 1.0 A; your plugpack just becomes a fire risk.
You should always match the designed specifications, e.g. 12 V 1.0 A.
The device expects a 12 V (or close) input and should be able to draw 1.0 A without issue.
If you over- or under-volt the charge circuit it could blow up; if you overdraw the supply it might melt.
Is it OK to charge a device that came with a 0.7 A charger using a 1.0 A charger?
omer101 said:
Is it OK to charge a device that came with a 0.7 A charger using a 1.0 A charger?
Click to expand...
Click to collapse
Did you not read the thread?
Sent from my Evita

[Q] Fast charge battery

I usually let it charge overnight. However, I found today that it charged very fast: less than 2 hours from 30% to 100%. I used a Galaxy Note 2 USB charger. Do you have the same charging experience, or does my battery have problems?
Yes, same here.
When your battery was at 30%, it means the charger had to put back about 1610 mAh (30% of 2300 mAh is 690 mAh). The Galaxy Note 2 charger has an output rating of 2000 mA, so you can imagine it won't take very long.
[update] Hmm, I misread Note as Tab. I have a Tab 2 with a 2 A charger. Not sure what the Note 2 charger can output, but I'm guessing it will be above average.
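A quick sanity check of that arithmetic (best case only: it ignores conversion losses, the taper near 100%, and the phone's own input-current limit, which may be lower than the adapter's rating):
```python
capacity_mah = 2300
start_pct = 30
charger_ma = 2000                                 # Galaxy Note 2 adapter rating

to_restore_mah = capacity_mah * (100 - start_pct) / 100
print(f"{to_restore_mah:.0f} mAh to put back")            # ~1610 mAh
print(f"best case ~{to_restore_mah / charger_ma:.1f} h")  # ~0.8 h at a full 2 A
```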
Petrovski80 said:
When your battery was at 30%, it means the charger had to put back about 1610 mAh (30% of 2300 mAh is 690 mAh). The Galaxy Note 2 charger has an output rating of 2000 mA, so you can imagine it won't take very long.
[update] Hmm, I misread Note as Tab. I have a Tab 2 with a 2 A charger. Not sure what the Note 2 charger can output, but I'm guessing it will be above average.
Click to expand...
Click to collapse
Thanks. I will check the output of the Note 2 USB charger and do the math.
Stock Nexus 5 charger also charges it from 0 to 100% in less than 2 hours.
Don't forget that the device's kernel determines how many mA are drawn from a charger, not the charger's rating.
-----------------------
Sent via tapatalk.
I do NOT reply to support queries over PM. Please keep support queries to the Q&A section, so that others may benefit
Not all milliamps are the same
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. It does not automatically follow that a 2000 mAh battery will take 2 hours to charge from a 1000 mA charger, or that the charge current will be 1000 mA. The charge current can easily, and safely, be higher than the mA rating of the charger. Or lower.
The N5 battery is rated at 3.8 V 2300 mAh (typical) and, crucially, 8.74 watt-hours. A 5 V 1000 mA charger can supply a maximum of 5 watts (5 volts x 1 amp). Voltage converters within the N5 change this 5 watts of power from 5 V to 3.8 V to suit the battery, and this could be at about 1250 mA (assuming a not-unreasonable 95% conversion efficiency).
The battery voltage varies with the state of charge, reaching about 4.2 V when fully charged. Even then, the charge current could be as high as 1130 mA without drawing more than 1000 mA from the 5 V charger.
An earlier poster pointed out that charging is under the control of the CPU (I suspect it's actually a dedicated charging circuit, but that's irrelevant) and it is very likely that a) the charging current varies significantly during the charging cycle and b) it is unlikely that the charging circuit demands precisely the maximum that the charger can supply. But it is quite likely that the actual current being put into the battery is numerically higher than that being drawn from the source. It's the power in watts that counts, not the number of milliamps.
Batteries are not perfect, meaning you don't get out all you put in. If the battery were completely flat you would have to put in more than 8.74 Wh to bring it up to full charge (although a totally flat Li-ion battery is dead beyond redemption; the battery life shown on the screen is the usable life, not the ultimate power capacity).
Sometimes the charger rating, battery capacity and charge time seem to line up, but that's more due to a happy accident than anything else. A 40,000mA charger won't juice your phone from flat in four minutes!
Batteries, and charging, are complex...
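The same numbers as a small sketch (the 95% conversion efficiency is the assumption used above, not a measured value):
```python
def battery_current_ma(adapter_w, batt_v, efficiency=0.95):
    # Power in from the adapter, converted by the phone to the battery voltage.
    return adapter_w * efficiency / batt_v * 1000

adapter_w = 5.0 * 1.0                                    # 5 V x 1 A charger = 5 W
print(f"~{battery_current_ma(adapter_w, 3.8):.0f} mA into a 3.8 V battery")      # ~1250 mA
print(f"~{battery_current_ma(adapter_w, 4.2):.0f} mA near full charge at 4.2 V") # ~1131 mA
```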
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. ...
Click to expand...
Click to collapse
This. Well said.
Your suspicions are correct; it does have a dedicated charging circuit. This chip is responsible for charging. Input current appears to be capped at 1200 mA. I measured with my DMM last night and never saw the phone draw more than 960 mA when charging with the screen off. It stayed like that until the battery was around 95% charged, then gradually tapered off from there as the battery reached 100%.
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. ...
Click to expand...
Click to collapse
Thanks a lot. It does look complicated. As long as the fast charging is normal, I won't worry too much.
Can anyone recommend an app that shows real-time current draw? It would also be cool if the app showed how much power the phone is using in real time.
Sent from my Nexus 5 using Tapatalk
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. ...
Click to expand...
Click to collapse
True. I never said there was a fixed relationship though; they do have a loose relationship. Charging with a 500 mA charger will take longer than charging with a 2000 mA one, since just about every modern phone accepts a charging current higher than 500 mA.
Another aspect not addressed in my reply is that the charge process isn't linear. But without going into too much electronics, I just wanted to explain to the OP that he shouldn't worry if he notices differences in charging times when using chargers with different amperage outputs.
Today's batteries are much improved
wolfca said:
Thanks a lot. It does look complicated. As long as the fast charging is normal, I won't worry too much.
Click to expand...
Click to collapse
That's the ticket. When used with the correct charger, a modern phone battery takes a couple of hours to charge fully, a bit longer with a lower-rated charger. Or you can top up a bit if you have a few minutes spare. It's much better than the early mobiles with Ni-Cd batteries that took overnight to charge. And required weightlifting training before you could even pick them up!

Faster charging?

I recently bought the P900 (Wi-Fi version).
A full charge takes around 5 hours, which in practice translates to 4 hours (I never get to 0%, and charging from 90-95% onward is slowed down by the device anyway).
Is there any way to speed up the charging?
For example, buying a 5.3 V 3 A charger. Would the OEM cable be able to carry the additional current?
Could the device even take advantage of a 3 A charger?
If so, can you recommend one?
It's important to me because I always use 100% brightness.
No. In the past, mobile devices (mostly phones) shipped with cheap 500 mA chargers, and bumping up to a higher-amperage charger would have an effect on charge time. Those days are gone: charger efficiency and production cost have led to the included chargers being optimized for charging time. The charging circuitry in the device is going to take what it's rated to take and no more, so once a charger is plugged in that's rated at what the device is designed to take, there's little else that can be done to speed up charging.
Bottom line: the charger that came with the tablet, if it's the official one (i.e. you bought it new, not used with someone including the wrong one), is optimized to charge the tablet at the fastest rate. Based on the numbers you noted, your charge times are not excessive; the tablet is designed to take around 2 A and it won't take 3 A even if the charger is rated for it.
If you want faster charging you need to sell your tablet and get a Snapdragon variant instead (LTE tablets from various carriers), or start practicing better battery management to reduce how depleted your tablet gets. For me that means not running at the highest brightness unless I really need it and topping off the battery whenever I can. When I get really low and have a reasonably long period in which I can charge, I'll sometimes shut the tablet down completely rather than put it to sleep, so that charging happens with near-zero load on the battery.
Oh, bummer.
Well, I guess I'll have to learn to live with that.
Thanks for your reply.
I'm planning on buying a 2-port charger so I won't have to carry so much stuff with me.
How much slower will the device charge with a 5.0 V charger?
Should I look for a 2-port 5.3 V charger? Will a normal device have any trouble with that?
It's not the voltage, it's the amps. If you want to charge two devices simultaneously as quickly as possible, the power supply needs to be rated to output the wattage necessary to provide the current the devices will draw at their maximum charge rate.
My recommendation is to find something capable of over 20 watts (two ports x 2 A x 5 V = 20 watts). I'd buy this one for future Qualcomm Quick Charge use:
https://www.anker.com/products/A2031111
Sent from my SM-P900 using Tapatalk
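As a rough sketch of that budget (the 2 A per-port draw is assumed from the discussion above; the 10% headroom is just a rule of thumb, not a requirement):
```python
# Power budget for a multi-port charger.
def required_watts(ports, amps_per_port=2.0, volts=5.0, headroom=1.10):
    return ports * amps_per_port * volts * headroom

print(f"2-port charger: look for roughly {required_watts(2):.0f} W or more")
```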
My question was how much slower the Note Pro will charge with a 5.0 V 2 A charger as opposed to the OEM one, which is 5.3 V 2 A,
and whether there is any problem using a 5.3 V charger with a normal smartphone.
Charging the Note Pro is more important to me than my other devices.
Yonany said:
My question was how much slower the Note Pro will charge with a 5.0 V 2 A charger as opposed to the OEM one, which is 5.3 V 2 A,
and whether there is any problem using a 5.3 V charger with a normal smartphone.
Charging the Note Pro is more important to me than my other devices.
Click to expand...
Click to collapse
Yes, but you also noted that you want to buy a 2-port version, and I'm saying that the voltage is only part of the equation, unless you are already aware that you need one that can supply 2 A on both ports simultaneously (you didn't specify). I honestly never measured the difference between the two; I don't worry about 5 V vs. 5.3 V since the charging voltage of lithium-ion cells is under 5 V anyway. AFAIK the current is more critical. Maybe someone more knowledgeable in electrical engineering can chime in, since I'm unsure how the charging circuit within the device steps the charger voltage down to the battery. All I know is that if you top off regularly or charge overnight, there's no night-and-day difference between the stock 5.3 V charger and a 5 V one, as long as the aftermarket one is rated at 2 A or more.
Sent from my SM-P900 using Tapatalk

How does charging a phone battery work?

1. For example, if a phone comes with a charger rated at 5 V and 0.7 A, what dictates the current the phone draws when it's plugged in to charge? Is it the resistance of the phone?
2. If I = V/R, do phones typically present little resistance so that the current is the maximum the charger can provide? I.e., in the above example, if the phone were off, would it constantly be drawing 0.7 A? And if the charger were swapped for one rated at 5 V and 2 A, would the phone draw more than 0.7 A? Could it reach 2 A?
3. A bit of a side question, but when the phone is done charging, how does it stop drawing current? Again, if I = V/R, does the phone have to alter the amount of resistance it presents? How does it do that?
I'm only looking for fairly simple answers to be honest as this is just a general query and not something I need to go in depth with.
Thanks.
bubu1 said:
I'm only looking for fairly simple answers to be honest as this is just a general query and not something I need to go in depth with.
Thanks.
Click to expand...
Click to collapse
Of course, I'm not privy to the secrets of the companies who build phones, but as an electronics engineer I can answer general questions.
First, you have to know that the charger itself has no idea of the size of the battery it has to charge. It is just a regulated voltage source with a maximum-current limit. The charger regulates the voltage at 5 V, and when you ask for more amps than it is designed to deliver, it will reduce the voltage in order to limit the current to that maximum value.
It is the phone itself that regulates the way it charges its battery. It follows complex curves, depending on how it is programmed, fast charge, etc. Of course, if the source limits the current, it will do its best with that maximum current. For the phone, a 5 V source with unlimited current is ideal. Fast charging consists of sending the maximum current the battery can handle at the beginning of the charge (when the battery is empty), then progressively limiting this current as the battery approaches the end of the charge. This is for lithium-ion batteries. Old nickel-cadmium batteries, for example, used the "delta peak" method: the battery was fed a constant current (1 A for a 1 Ah battery) and the voltage was monitored. The voltage increased with the charge, then after a maximum began to decrease when the charge was at about 90%. At that point the current was reduced to 1/20 so as never to overcharge the battery.
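A deliberately crude simulation of that constant-current-then-taper behaviour (every parameter is invented for illustration; a real charger regulates on the measured cell voltage, not on a fixed percentage threshold):
```python
CAPACITY_MAH = 1500
CC_CURRENT_MA = 1000      # constant-current phase
CUTOFF_MA = 50            # charging stops once the taper current falls this low

charge_mah, current_ma, minutes = 0.0, float(CC_CURRENT_MA), 0
while current_ma > CUTOFF_MA:
    charge_mah += current_ma / 60.0         # charge added in one simulated minute
    minutes += 1
    if charge_mah > 0.8 * CAPACITY_MAH:     # crude stand-in for reaching ~4.2 V
        current_ma *= 0.95                  # exponential-looking current taper
print(f"'full' after ~{minutes / 60:.1f} h, ~{charge_mah:.0f} mAh delivered")
```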
