Is it dangerous to charge a Nexus S with a Nexus One charger?
The Nexus S charger is rated at just 700 mA, but the Nexus One charger is rated at 1000 mA.
So it can charge the Nexus S faster, but can this harm the Nexus S?!
Thanks
It's likely not a problem, since charging is controlled by the phone and the charger only supplies the voltage/current. Just make sure that the voltage is the same.
Yes, the voltage is the same, 5 volts,
but the current is different. You mean that only the voltage is important?
Also, I had an HTC Nexus One car charger (because I have an N1 too); its voltage is again 5 volts, but the current rating is 2000 mA!!
After I saw that rating I bought a Samsung car charger; it's exactly 5 volts and 700 mA.
I'm not sure about the Nexus S. Can it handle 2000 mA?
If it can, I think it would charge the Nexus S 2 or 3 times faster!
Your charger is a voltage source. That means it will try to keep the voltage at the specified 5 V; how much current comes out depends on the resistance of the circuit.
The 700 mA is the maximum current you can take out of it before the voltage begins to drop.
Quite a few people use stronger chargers, and as far as the battery is concerned there should not be any problem up to at least 1500 mA (which would be a 1C charging rate for this battery).
Considering the charging characteristics of Li-ion batteries I don't know if it will reach full charge any sooner, but some people have reported faster charging.
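To put rough numbers on this, here is a minimal sketch assuming a 1500 mAh pack (implied by the 1C = 1500 mA remark above); the charger ratings are the ones mentioned in this thread:

```python
# Rough sketch: compare charger current ratings against the 1C rate of the pack.
# Assumes a 1500 mAh battery as implied above; charger ratings are from this thread.

BATTERY_CAPACITY_MAH = 1500           # Nexus S pack, approximate
ONE_C_RATE_MA = BATTERY_CAPACITY_MAH  # 1C in mA equals the capacity in mAh

chargers_ma = {"Nexus S (stock)": 700, "Nexus One": 1000, "N1 car charger": 2000}

for name, rating in chargers_ma.items():
    note = "at or below 1C" if rating <= ONE_C_RATE_MA else "above 1C (the phone should limit it anyway)"
    print(f"{name}: rated {rating} mA -> {rating / ONE_C_RATE_MA:.2f}C, {note}")
```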
cgi said:
Your charger is a voltage source. That means it will try to keep the voltage at the specified 5 V; how much current comes out depends on the resistance of the circuit.
The 700 mA is the maximum current you can take out of it before the voltage begins to drop.
Thanks! I didn't know about this.
Shouldn't be a problem unless the voltages are different. I'm charging mine with a Nokia 5800 cable connected to a USB port on the keyboard. Slow as heck, but it charges.
The formula is:
W = V x I x T
W refers to the energy delivered or used.
V refers to the voltage, which should be constant at 5 V DC (for AC mains it would be 110 V or 220 V).
I refers to the current, the flow in and out.
T refers to the time, or duration.
As you can see, a bigger current increases the result because it is multiplied by time.
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts. And it will charge faster.
But remember one thing: don't charge for too long, because more time means more energy than your device can stand. For example, don't charge overnight.
CMIIW
rejanmanis said:
The formula is:
W = V x I x T
As you can see, a bigger current increases the result because it is multiplied by time.
You are oversimplifying. What you are saying is true only for an ideal ohmic load resistor, and your phone is anything but that. In our case the current I depends on time (among other things) and is a non-trivial function of it.
So the energy actually used/transmitted is
W = integral( V * I(t) dt )
assuming that V is constant. I depends on the charging circuit, the charger, and how full the battery is at the moment. If you hit the current limit of the charger, then even V is no longer constant but an (unknown) function of I.
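As a minimal numerical sketch of that integral: the current profile below is invented purely for illustration, it is not a real phone's charging curve.

```python
# Minimal sketch: energy delivered is the integral of V * I(t) over time.
# The current profile below is invented for illustration only; the real I(t)
# is set by the phone's charging circuit.

V = 5.0        # charger output voltage, assumed constant [V]
dt = 60.0      # time step [s]
hours = 2.5
steps = int(hours * 3600 / dt)

def current_a(t_s):
    """Toy profile: roughly constant current, tapering off near the end."""
    taper_start = 1.5 * 3600
    if t_s < taper_start:
        return 0.7  # 700 mA plateau
    return max(0.05, 0.7 * (1 - (t_s - taper_start) / 3600))

energy_j = sum(V * current_a(i * dt) * dt for i in range(steps))
print(f"Energy delivered: {energy_j / 3600:.2f} Wh")  # joules -> watt-hours
```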
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts.
This is likely true, assuming that the charging circuit in the phone limits the current to a level that is safe for the whole power train. That means not only the battery itself, but also the cable, the connector, the traces on the circuit board, the charging transistor, the inductor (assuming switched-mode charging; I'm not sure whether it is), and so on.
And it will charge faster.
That depends on the charging circuit and the battery protection circuit.
But remember one thing: don't charge for too long, because more time means more energy than your device can stand. For example, don't charge overnight.
You are still stuck in the age of constant-current, dumb lead-acid battery chargers. Nowadays we have "intelligent" chargers that monitor the state of the battery to ensure it doesn't go up in smoke from overcharging or too-high charge rates.
Any sane design is current-limited by the on-board charging circuit. I don't see why Samsung would have built anything else.
I bought a portable charger/power bank: 6000 mAh, 5.3 V, 1000 mA,
but I just found out that my charger (EP800) is rated 5 V, 850 mA.
I read somewhere that using a higher-amperage charger should be fine, since my phone will still only draw as much current as it is designed for, but that using a higher voltage could possibly fry my battery.
My portable charger is only 0.3 V higher, though. Is that OK? How bad can it get?
Do batteries have any tolerance for a slightly higher voltage?
I usually let it charge overnight. However, today I found that it charged very fast: less than 2 hours from 30% to 100%. I used a Galaxy Note 2 USB charger. Have you had the same charging experience, or does my battery have a problem?
Yes, same here.
When your battery was at 30%, it means the charger had to put back about 1610 mAh (30% of 2300 mAh is 690 mAh). The Galaxy Note 2 charger has an output rating of 2000 mA, so you can imagine it won't take very long.
[update] Hm, I misread Note for Tab. I have a Tab 2 with a 2 A charger. I'm not sure what the Note 2 charger can output, but I'm guessing it's above average.
Petrovski80 said:
When your battery was at 30%, it means the charger had to put back about 1610 mAh (30% of 2300 mAh is 690 mAh).
Thanks. I will check the output of the Note 2 USB charger and do the math.
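For what it's worth, a rough version of that math might look like this; the 2300 mAh / 3.8 V battery figures come from this thread, while the 2 A charger rating and 90% efficiency are assumptions:

```python
# Back-of-the-envelope charge-time estimate for going from 30% to 100%.
# The 2300 mAh / 3.8 V battery figures are from this thread; the 2.0 A charger
# rating and 90% overall efficiency are assumptions. Real charging takes longer
# because the current tapers near full charge and the phone caps its own draw.

capacity_mah = 2300
battery_v = 3.8
start_pct, end_pct = 30, 100

energy_needed_wh = capacity_mah / 1000 * battery_v * (end_pct - start_pct) / 100
charger_power_w = 5.0 * 2.0   # 5 V at an assumed 2 A
efficiency = 0.90             # assumed overall efficiency

hours = energy_needed_wh / (charger_power_w * efficiency)
print(f"Energy to add: {energy_needed_wh:.2f} Wh, rough lower bound: {hours:.1f} h")
```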
Stock Nexus 5 charger also charges it from 0 to 100% in less than 2 hours.
Don't forget that the device's kernel determines how much current is drawn from a charger, not the charger's mA rating.
Not all milliamps are the same
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. It does not automatically follow that a 2000 mAh battery will take 2 hours to charge from a 1000 mA charger, or that the charge current will be 1000 mA. The charge current can easily, and safely, be higher than the mA rating of the charger. Or lower.
The N5 battery is rated at 3.8 V, 2300 mAh (typical) and, crucially, 8.74 watt-hours. A 5 V 1000 mA charger can supply a maximum of 5 watts (5 volts x 1 amp). Voltage converters within the N5 change this 5 watts of power from 5 V to 3.8 V to suit the battery, which could mean about 1250 mA at the battery (assuming a not-unreasonable 95% conversion efficiency).
The battery voltage varies with the state of charge, reaching about 4.2 V when fully charged. Even then, the charge current could be as high as 1130 mA without drawing more than 1000 mA from the 5 V charger.
An earlier poster pointed out that charging is under the control of the CPU (I suspect it is actually a dedicated charging circuit, but that's irrelevant here), and it is very likely that a) the charging current varies significantly during the charging cycle, and b) the charging circuit does not demand precisely the maximum that the charger can supply. But it is quite likely that the actual current going into the battery is numerically higher than the current being drawn from the source. It's the power in watts that counts, not the number of milliamps.
Batteries are not perfect either, meaning you don't get out all you put in. If the battery were completely flat you would have to put in more than 8.74 Wh to bring it up to full charge (although a totally flat Li-ion battery is dead beyond redemption; the battery level shown on the screen is the usable capacity, not the ultimate capacity).
Sometimes the charger rating, battery capacity and charge time seem to line up, but that's more a happy accident than anything else. A 40,000 mA charger won't juice your phone from flat in four minutes!
Batteries, and charging, are complex...
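A quick sketch that reproduces the arithmetic in the post above; all of the figures are the ones quoted there (5 V / 1 A charger, 95% converter efficiency, 3.8 V nominal and 4.2 V full battery voltage):

```python
# Reproduces the arithmetic from the post above: a 5 V / 1 A charger feeding a
# 3.8 V (nominal) / 4.2 V (full) battery through a ~95% efficient converter.

charger_w = 5.0 * 1.0   # 5 V x 1 A = 5 W maximum from the charger
efficiency = 0.95       # converter efficiency assumed in the post

power_at_battery_w = charger_w * efficiency
for label, v_batt in (("nominal 3.8 V", 3.8), ("fully charged 4.2 V", 4.2)):
    print(f"At {label}: up to {power_at_battery_w / v_batt * 1000:.0f} mA into the battery")
```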
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't.
This. Well said.
Your suspicions are correct: it does have a dedicated charging circuit. This chip is responsible for charging. Input current appears to be capped at 1200 mA. I measured with my DMM last night and never saw the phone draw more than 960 mA while charging with the screen off. It stayed like that until the battery was around 95% charged, then gradually tapered off from there as the battery reached 100%.
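A toy constant-current/taper model of that behaviour: the 960 mA plateau and the roughly 95% taper point are from the measurement above, but the 30% starting level and the taper shape are just illustrative assumptions.

```python
# Toy constant-current / taper model of the observed charging behaviour.
# The 960 mA plateau and the ~95% taper point are from the measurement above;
# the 30% start level and the simple linear taper are assumptions.

capacity_mah = 2300
level_mah = 0.30 * capacity_mah   # assume we start at 30%
t_min = 0

while level_mah < capacity_mah * 0.999 and t_min < 6 * 60:
    soc = level_mah / capacity_mah
    if soc < 0.95:
        current_ma = 960                               # constant-current phase
    else:
        current_ma = max(50, 960 * (1 - soc) / 0.05)   # taper toward full
    level_mah += current_ma / 60                       # one minute of charging
    t_min += 1

print(f"Reached ~100% after about {t_min} minutes (toy model)")
```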
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't.
Thanks a lot. It does look complicated, but as long as the fast charging is normal I won't worry too much.
Can anyone recommend an app that shows real time current draw? It would also be cool if the app showed how much power the phone is using in real time.
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't.
True, although I never said there was a fixed relationship. They do have a loose relationship: charging with a 500 mA charger will take longer than charging with a 2000 mA one, since just about every modern phone accepts a charging current higher than 500 mA.
Another aspect my reply didn't address is that the charging process isn't linear. But without going into too much electronics, I just wanted to explain to the OP that he shouldn't worry if he notices different charging times when using chargers with different amperage ratings.
Today's batteries are much improved
wolfca said:
Thanks a lot. It does look complicated, but as long as the fast charging is normal I won't worry too much.
That's the ticket. When used with the correct charger, a modern phone battery takes a couple of hours to charge fully, a bit longer with a lower-rated charger. Or you can top it up a bit if you have a few minutes to spare. It's much better than the early mobiles with Ni-Cd batteries that took overnight to charge. And required weightlifting training before you could even pick them up!
Next week I'm getting a new Note 4, and a ZeroLemon 10,000 mAh battery!
I need to deep-cycle the battery 6-8 times to get the phone to display the current battery % correctly. The manufacturer, ZeroLemon, says to turn off fast charge and charge for 12 hours each time.
That being said, with fast charge off, can I use a 5 V 3.5 A charger I've seen on Amazon and possibly charge the phone in less than the 12 hours it normally takes with a 2.1 A charger?
Has anyone tried this, even with the stock battery? Does the battery actually charge faster due to the increased amps, or would it be a waste, still charging at the slower 2.1 A?
I never deep-cycled the battery, and the longest discharge I got was 1 week with 16 hours of screen-on time... you just need to make sure you use an updated kernel with the ZeroLemon fix.
The amperage rating is the maximum the charger can put out. The Note will draw the same current from either charger because they are both 5 V.
ackliph said:
The amperage rating is the maximum the charger can put out. The Note will draw the same current from either charger because they are both 5 V.
But will the phone charge faster on 3.5 A at 5 V vs. the 1.5 A or 2.1 A the stock charger puts out? I am getting the ZeroLemon and need to deep-cycle the battery a few times, and I'm trying to get it fully charged in less than the standard 12 hours. I was hoping a charger with more amps would cut down on the 12 hours lol
http://www.amazon.com/gp/product/B00WN86VYQ/ref=ox_sc_act_title_8?ie=UTF8&psc=1&smid=A3RPN0HBLXDN8Z
I just used the stock fast charger on my ZeroLemon with fast charge on, and I never let it go the full 12 hours. I've had insane battery life with it, so I don't think it matters.
drtechnology said:
But will the phone charge faster on 3.5 A at 5 V vs. the 1.5 A or 2.1 A the stock charger puts out?
Your Note 4 will draw a maximum of 1.9 A from a 5 V standard charger and a maximum of 1.66 A from a 9 V quick charger. If you multiply these values, you get your "charging speed" in watts: the maximum is about 9.5 W on a normal charger and about 15 W on a quick charger.
Using a higher-rated normal charger (e.g. 5 V / 3 A) will NOT INCREASE charging speed. The Note 4 will never draw more than 1.9 A at 5 V.
Using a lower-rated normal charger (e.g. 5 V / 1 A) WILL DECREASE the charging speed. The Note 4 will notice that it cannot get 1.9 A from the charger and drop the current. Bad and/or long cables can also hurt charging speed.
Also noteworthy: quick charging only works while the screen is off. As soon as you turn the screen on, the charging speed on a quick charger drops from 15 W to an extremely slow 5 W. The only fix for this horrible Samsung joke is a custom ROM like CyanogenMod.
You can charge your ZeroLemon battery nicely with the original quick charger that came with the phone. It will do 15 W, which is as fast as you can possibly charge.
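To put rough numbers on this: watts = volts x amps for the limits quoted above. The 3.85 V nominal cell voltage and 90% efficiency used for the time estimate are assumptions, not measured values.

```python
# Charging "speed" in watts for the Note 4 limits quoted above, plus a very
# rough full-charge estimate for a 10,000 mAh ZeroLemon pack. The 3.85 V
# nominal cell voltage and 90% efficiency are assumptions.

modes_w = {
    "standard charger (5 V x 1.9 A)": 5.0 * 1.9,
    "quick charger (9 V x 1.66 A)": 9.0 * 1.66,
    "quick charger with screen on": 5.0,   # the ~5 W case mentioned above
}

pack_wh = 10.0 * 3.85   # 10 Ah ZeroLemon at an assumed 3.85 V nominal
efficiency = 0.90       # assumed overall conversion/charging efficiency

for name, watts in modes_w.items():
    hours = pack_wh / (watts * efficiency)
    print(f"{name}: {watts:.1f} W -> roughly {hours:.1f} h for a full charge")
```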
joeuser said:
Your Note 4 will draw a maximum of 1.9 A from a 5 V standard charger and a maximum of 1.66 A from a 9 V quick charger.
Wow. Good info, thanks. Now I just need a hack to draw more amps lol. Hmmmm lolol
Hi, I've just purchased a Xiaomi Mi 5 and I have a question about charging the phone. It supports Qualcomm Quick Charge 3.0, however the charger that was provided only supports QC 2.0.
I noticed that whenever I plug the phone into a charger that supports Qualcomm Quick Charge, the phone gets hot. I decided to do a little test: I used a USB Charger Doctor that reads the voltage and current delivered to the phone, and the Ampere app to monitor battery temperature. I charged for about 36 minutes with an Aukey charger. It has 3 ports, only one of which supports QC 2.0; the other two supply up to 5 V at 2.4 A.
Here are the results:
9 V, 1.35 A: battery charged from 8% to 56%. Battery temperature reached 43.2 C. Juice added: 48%.
5 V, 1.71 A: battery charged from 7% to 43%. Battery temperature reached just about 35 C. Juice added: 36%.
It's clear that Qualcomm Quick Charge is faster than the standard port, but it results in higher temperatures, and I know that lithium batteries hate high temperatures. So which is better for battery life: higher voltage with less current, or lower voltage with more current? Thanks in advance.
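To put numbers on that comparison, here is a small sketch using only the figures from the test above: input power, percent added over the 36-minute window, and percent added per watt-hour of input.

```python
# Compares the two 36-minute test runs reported above using only the figures
# given there: input power, percent added, and percent added per watt-hour.

tests = {
    "QC 2.0 port (9 V, 1.35 A)": {"v": 9.0, "a": 1.35, "pct_added": 48, "temp_c": 43.2},
    "standard port (5 V, 1.71 A)": {"v": 5.0, "a": 1.71, "pct_added": 36, "temp_c": 35.0},
}
minutes = 36

for name, t in tests.items():
    power_w = t["v"] * t["a"]
    energy_in_wh = power_w * minutes / 60
    print(f"{name}: {power_w:.2f} W in, {t['pct_added']}% added "
          f"({t['pct_added'] / energy_in_wh:.1f} %/Wh), peak {t['temp_c']} C")
```

On these readings the cooler 5 V port actually added slightly more charge per watt-hour of input, though phone-reported percentages and temperatures are rough, so treat that as indicative only.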