1. For example, if a phone comes with a charger rated at 5V and 0.7A, what dictates the current the phone draws when it's plugged in to charge? Is it the resistance of the phone?
2. If I = V/R, do phones typically present so little resistance that the current is the maximum the charger can provide? i.e. in the above example, if the phone was off, would it constantly be drawing 0.7A? And if the charger was swapped for one rated at 5V and 2A, would the phone draw more than 0.7A? Could it reach 2A?
3. Bit of a side question, but when the phone is done charging, how does it stop drawing current? Again, if I = V/R, does the phone have to alter the amount of resistance it presents? How does it do that?
I'm only looking for fairly simple answers to be honest as this is just a general query and not something I need to go in depth with.
Thanks.
bubu1 said:
I'm only looking for fairly simple answers to be honest as this is just a general query and not something I need to go in depth with.
Thanks.
Of course, I'm not privy to the secrets of the companies that build phones, but as an electronics engineer I can answer general questions.
First, you have to know that the charger itself has no idea of the size of the battery it has to charge. It is just a regulated voltage source with a limit on the maximum current. The charger holds the voltage at 5V, and when you ask for more amps than it is designed to deliver, it will reduce the voltage in order to keep the current at that maximum value.
It is the phone itself which regulates the way it charges its battery. It follows complex curves, depending on how it is programmed, fast charge, etc. Of course, if the source limits the current, it will do its best with that maximum current. For the phone, a 5V source with unlimited current is ideal. Fast charging consists of sending the maximum current the battery can tolerate at the beginning of the charge (when it is empty), then progressively limiting this current as the battery approaches the end of the charge. This is for lithium-ion batteries. Old nickel-cadmium batteries, for example, used the "delta peak" method: the battery was fed a constant current (1A for a 1Ah battery) and the voltage was monitored. This voltage increased during the charge, reached a maximum, and then began to decrease when the charge was at about 90%. At that point, the current was reduced to 1/20 so as never to overcharge the battery.
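For anyone who wants a picture of those "complex curves": below is a minimal, illustrative sketch of the usual constant-current / constant-voltage (CC/CV) scheme for Li-ion cells. The cell model, the 2300 mAh capacity, the 1 A limit and every other number here are assumptions for illustration only, not any particular phone's real charging firmware.

Code:
# Toy CC/CV Li-ion charge profile (illustrative model, not real phone firmware).
# Assumed numbers: 2300 mAh cell, 4.2 V termination, 1.0 A maximum charge current,
# cell modelled as an open-circuit voltage rising linearly with state of charge
# plus a small internal resistance.

CAPACITY_AH = 2.3
I_MAX = 1.0        # A, limit imposed by the phone's charging circuit (assumed)
V_TERM = 4.2       # V, constant-voltage phase target
R_INT = 0.1        # ohm, assumed internal resistance
I_CUTOFF = 0.05    # A, charge considered complete below this current

def ocv(soc):
    """Very rough open-circuit voltage vs. state of charge (0..1)."""
    return 3.5 + 0.7 * soc   # ~3.5 V empty, ~4.2 V full

soc = 0.1          # start at 10 %
t = 0.0
dt = 60.0          # one-minute simulation steps
while True:
    # CC phase: push I_MAX unless that would exceed the termination voltage,
    # in which case the current tapers off (CV phase).
    i = min(I_MAX, (V_TERM - ocv(soc)) / R_INT)
    if i < I_CUTOFF:
        break
    soc = min(1.0, soc + i * dt / 3600.0 / CAPACITY_AH)
    t += dt

print(f"charged to {soc * 100:.0f}% in {t / 3600:.1f} h (toy model)")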
Related
Hi.
I recently bought a new fat (extended) battery for my Desire. It promises to deliver 2600 mAh, but I have my doubts...
First of all, only the voltage is printed on the battery itself.
Second, I experience worse uptime than on my original 1400 mAh battery.
Third, I guess I am just sceptical by nature.
So my question: Is there an app out there that will display max capacity?
Or do I really have to measure it with a multimeter?
I wouldn't know how to measure it other than just using it and gauging the results, but I'm very sceptical about the efficacy of any after-market battery tbh.
You can't measure the capacity directly, but you can estimate it.
Run the battery down until the phone won't turn on.
Plug in the power adaptor and start timing.
Don't turn on the phone.
Note how long it took to finish charging.
Convert that number into hours, so 1 hour 20 minutes would be 1.33, then divide it by 1.15 to factor in energy loss through the charger. We'll call this number Fred. No reason, Fred's just an easy name to remember.
Look on your charger for the output current, something like 500mA, 700mA or 1000mA (1A).
Multiply the output current by Fred.
So, say it takes 1.5 hours to charge, the Fred number is 1.3.
If your output current is 700mA, then the capacity is about 913mAh.
Repeat the process a few times and take the average capacity.
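For convenience, the same estimate as a small Python sketch. The method and the 1.15 loss factor come straight from the post above; the function name is made up for illustration, and the result is only a ballpark figure, not a measurement.

Code:
# Rough battery-capacity estimate from charge time, per the method above.

def estimate_capacity_mah(charge_hours, charger_output_ma, loss_factor=1.15):
    """Capacity ~ charger current x (charge time corrected for charger losses)."""
    effective_hours = charge_hours / loss_factor   # the "Fred" number
    return charger_output_ma * effective_hours

# Example from the post: 1.5 h on a 700 mA charger -> roughly 913 mAh.
print(round(estimate_capacity_mah(1.5, 700)))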
Is it dangerous to charge a Nexus S with a Nexus One charger?
The Nexus S charger is rated at just 700 mA, but the Nexus One charger is rated at 1000 mA.
So it can charge the Nexus S faster, but can this be harmful for the Nexus S?!
Thanks
It likely is no problem since the charging is controlled by the phone and the charger only supplies the voltage/current. Just make sure that the voltage is the same.
Yes, the voltage is the same, 5 volts.
But the current is different. You mean that only the voltage is important?
Also, I had an HTC Nexus One car charger (because I have an N1 too); its voltage is again 5 volts, but the current is 2000 mA!!
After I saw that current I bought a Samsung car charger; it's exactly 5 volts and 700 mA.
I'm not sure about the Nexus S, can it handle 2000 mA?
If it is possible, I think it will charge the Nexus S 2 or 3 times faster!
Your charger is a voltage source. That means that it will try to keep the voltage at the specified 5V. How much current comes out depends on the resistance of the circuit.
The 700mA is the maximum current you can take out of it before the voltage begins to drop.
There are quite a few people who use stronger chargers, and as far as the battery is concerned there should not be any problems up to at least 1500mA (which would be a 1C charging rate for this battery).
Considering the charging characteristics of Li-ion batteries I don't know if it will reach full charge any sooner, but some people have reported faster charging.
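A minimal sketch of that behaviour, assuming a purely resistive load (which a real phone is not): below the current limit the source holds 5 V and Ohm's law sets the current; past the limit the current is capped and the voltage sags. The resistor values are arbitrary examples.

Code:
# Toy model of a current-limited 5 V source feeding a simple resistive load.

V_SET = 5.0
I_LIMIT = 0.7   # A, e.g. a 700 mA charger

def source_output(load_ohms):
    i = V_SET / load_ohms
    if i <= I_LIMIT:
        return V_SET, i                      # normal regulation at 5 V
    return I_LIMIT * load_ohms, I_LIMIT      # current limit: voltage drops

for r in (20.0, 10.0, 5.0):
    v, i = source_output(r)
    print(f"load {r:>4} ohm -> {v:.2f} V, {i * 1000:.0f} mA")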
cgi said:
Your charger is a voltage source. That means that it will try to keep the voltage at the specified 5V. [...]
Thanks! I didn't know about this.
Shouldn't be a problem unless the voltages are different; I'm charging mine with a Nokia 5800 cable connected to a USB port on the keyboard. Slow as heck, but it charges.
The formula is:
W = V x I x T
W refers to the energy created or used.
V refers to voltage, which should be constant at 5 volts DC (for AC it would be 110 V or 220 V).
I refers to the current flowing in and out.
T refers to time or duration.
As you can see, a bigger current will increase the result, because it is multiplied by time.
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts. And it will charge faster.
But remember one thing: don't charge too long, because more time will create more energy than your device can stand. For example, don't charge overnight.
CMIIW
Sent from my Nexus S I9020T
rejanmanis said:
The formula is:
W = V x I x T
W refers to the energy created or used.
V refers to voltage, which should be constant at 5 volts DC (for AC it would be 110 V or 220 V).
I refers to the current flowing in and out.
T refers to time or duration.
As you can see, a bigger current will increase the result, because it is multiplied by time.
You are oversimplifying. What you are saying is true only for an ideal ohmic load resistor. Your phone is anything but that. In our case the current I depends on the time (among other things) and is a non-trivial function.
So actually the energy used/transmitted is
W = ∫ V · I(t) dt
assuming that V is constant. I depends on the charging circuit, the charger, and how full the battery is at the moment. If you hit the current limit of the charger, then even V is no longer constant, but an (unknown) function of I.
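A numerical version of that integral, with a completely made-up current profile (a constant-current phase followed by an exponential taper), just to show what evaluating W = ∫ V · I(t) dt looks like in practice; the 5 V, 1 A and taper constants are arbitrary illustration values.

Code:
# Numerical sketch of W = integral of V * I(t) dt with an assumed current profile.
import math

V = 5.0        # charger voltage, assumed constant
dt = 60.0      # seconds per integration step

def charge_current(t):
    """Assumed profile: 1 A for the first hour, then an exponential taper."""
    return 1.0 if t < 3600 else math.exp(-(t - 3600) / 1800)

energy_j = 0.0
t = 0.0
while t < 3 * 3600:            # integrate over three hours
    energy_j += V * charge_current(t) * dt
    t += dt

print(f"energy delivered ~ {energy_j / 3600:.2f} Wh")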
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts.
This is likely true, assuming that the charging circuit in the phone limits the current to a level that is safe for the whole power path. That means not only the battery itself, but also the cable, the connector, the traces on the circuit board, the charging transistor, the inductor (assuming switched-mode charging; I'm not sure whether it is), and so on.
And it will charge faster.
That depends on the charging circuit and the battery protection circuit.
But remember one thing: don't charge too long, because more time will create more energy than your device can stand. For example, don't charge overnight.
You are still stuck in the age of constant-current, dumb lead-acid battery chargers. Nowadays we have "intelligent" chargers that monitor the state of the battery to ensure that it doesn't go up in smoke from overcharging or too-high charge rates.
Any sane design is current limited by the on-board charging circuit. I don't see why Samsung should have built anything else.
I usually let it charge overnight. However, I found today that it charged very fast: less than 2 hours from 30% to 100%. I used a Galaxy Note 2 USB charger. Do you have the same charging experience, or does my battery have problems?
Yes, same here.
When your battery was at 30%, it means the charger had to put back about 1610mAh (30% of 2300mAh is 690mAh, so 1610mAh was missing). The Galaxy Note 2 charger has an output of 2000mA. So you can imagine it won't take very long.
[update] Hm, I misread Note for Tab. I have a Tab 2 with a 2A charger. Not sure what the Note 2 charger can output, but I'm guessing it will be above average.
Petrovski80 said:
When your battery was at 30%, it means the charger had to put back about 1610mAh. [...]
Thanks. I will check the output of the Note 2 USB charger and do the math.
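For reference, a back-of-the-envelope version of "the math", ignoring conversion losses and the slow top-off phase near 100% (so real charge times will be somewhat longer than this estimate).

Code:
# Rough charge-time estimate from the missing charge and the charger rating.

def rough_charge_hours(capacity_mah, start_percent, charger_ma):
    missing_mah = capacity_mah * (100 - start_percent) / 100.0
    return missing_mah / charger_ma

# Example from the thread: 2300 mAh battery at 30%, 2000 mA charger -> ~0.8 h.
print(f"{rough_charge_hours(2300, 30, 2000):.1f} h")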
Stock Nexus 5 charger also charges it from 0 to 100% in less than 2 hours.
Don't forget that the device's kernel determines how many mA are drawn from a charger, not how many mA the charger is rated for.
-----------------------
Sent via tapatalk.
I do NOT reply to support queries over PM. Please keep support queries to the Q&A section, so that others may benefit
Not all milliamps are the same
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. It does not automatically follow that a 2000mAh battery will take 2 hours to charge from a 1000mA charger, or that the charge current will be 1000mA. Charge current can easily - and safely - be higher than the mA rating of the charger. Or lower.
The N5 battery is rated at 3.8V 2300mAh (typical) and, crucially, 8.74 watt hours. A 5V 1000mA charger can supply a maximum of 5 watts (5 volts x 1 amp). Voltage converters within the N5 change this 5 watts of power from 5V to 3.8V to suit the battery - and this could be at about 1250mA (assuming a not-unreasonable 95% conversion efficiency).
The battery voltage varies with the state of charge, reaching about 4.2V when fully charged. Even then, the charge current could be as high as 1130mA without drawing more than 1000mA from the 5V charger.
An earlier poster pointed out that charging is under control of the CPU (I suspect instead a dedicated charging circuit but that's irrelevant) and it is very likely that a) the charging current varies significantly during the charging cycle and b) it is unlikely that the charging circuit demands precisely the maximum that the charger can supply. But it is quite likely that the actual current being put into the battery is numerically higher than that being drawn from the source. It's the power in watts that counts, not the number of milliamps.
Batteries are not perfect, meaning you don't get out all you put in. If the battery was completely flat you would have to put in more than 8.74Wh to bring it up to full charge (although a totally flat Li-ion battery is dead beyond redemption; the battery level shown on the screen is the usable capacity, not the ultimate capacity).
Sometimes the charger rating, battery capacity and charge time seem to line up, but that's more due to a happy accident than anything else. A 40,000mA charger won't juice your phone from flat in four minutes!
Batteries, and charging, are complex...
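The 1250 mA and 1130 mA figures above follow from a one-line power calculation; here it is as a small sketch, with the 95% conversion efficiency as the assumption the post itself states.

Code:
# Battery-side current is set by power (watts) and converter efficiency,
# not by the charger's mA rating.

def battery_side_current_ma(charger_v, charger_ma, battery_v, efficiency=0.95):
    charger_watts = charger_v * charger_ma / 1000.0
    return charger_watts * efficiency / battery_v * 1000.0

print(round(battery_side_current_ma(5.0, 1000, 3.8)))   # ~1250 mA at 3.8 V
print(round(battery_side_current_ma(5.0, 1000, 4.2)))   # ~1130 mA near full charge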
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. [...]
This. Well said.
Your suspicions are correct, it does have a dedicated charging circuit. This chip is responsible for charging. Input current appears to be capped at 1200mA. Measured with my DMM last night and never saw the phone draw more than 960mA when charging with the screen off. It stayed like that until the battery was around 95% charged, then gradually tapered off from there as the battery reached 100%.
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. [...]
Thanks a lot. It did look complicated. As long as the fast charging is normal, I don't worry too much.
Can anyone recommend an app that shows real time current draw? It would also be cool if the app showed how much power the phone is using in real time.
Sent from my Nexus 5 using Tapatalk
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. [...]
True. I never said there was a fixed relationship, though. They do have a loose relationship: charging with a 500mA charger will take longer than charging with a 2000mA one, since just about every modern phone accepts a charging current higher than 500mA.
Another aspect not addressed in my reply is that the charging process isn't linear. But without going into too much electronics, I just wanted to explain to the OP that he shouldn't have to worry if he notices differences in charging times when using chargers of different amperage output.
Today's batteries are much improved
wolfca said:
Thanks a lot. It did look complicated. As long as the fast charging is normal, I don't worry too much.
That's the ticket. When used with the correct charger, a modern phone battery takes a couple of hours to charge fully, a bit longer with a lower-rated charger. Or you can top up a bit if you have a few minutes spare. It's much better than the early mobiles with Ni-Cd batteries that took overnight to charge. And required weightlifting training before you could even pick them up!
So I'm looking to purchase a slew of new chargers for myself and my wife for home, the office, the car, etc... She has a Galaxy S4 and an iPad 3; I have a Galaxy S3 and a Galaxy Note 10.1 2014. I'm at least somewhat aware of the charger requirements and plan to supply at least 2 amps to each device in general.
My real confusion is over the USB cable length. It seems most stock cables are roughly 3ft long, and from what I can tell, going with a longer cable increases resistance, causes a voltage drop, and lowers the current/charge rate. Of course a thicker (lower-AWG) cable will help decrease this resistance, but I'm already planning to go with 24AWG cables.
So my question is, how long of a USB cable can I use before it has a noticeable effect on my charge rate? If anything over 3ft significantly slows it down, then I'll probably just use AC extension cables instead of longer USB cables when necessary. What cable length vs charge rate do you find acceptable?
Thanks in advance!
I have a 10 foot cord that I got from mobstub and I noticed a significant increase in the time it takes to charge. I'm not sure how much it's degraded, but if there's an app or something that can show me, I'll tell you the stats.
Sent from my SPH-L720 using xda premium
low voltage / low current
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
Good luck & have fun !!
Mark J Culross
KD5RXT
mjculross said:
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
KD5RXT
If your cables are thin, don't even expect to deliver 2000mA through them. The thickness is the key here.
bmather9 said:
So I'm looking to purchase a slew of new chargers for myself and my wife for home, the office, the car, etc... [...]
At the voltages, gauges, and distances we are talking about here, you are way overthinking this. It will certainly be fine at least up to the USB data transmission cable limit of 5 meters.
http://forum.xda-developers.com/showthread.php?t=2545497
According to the results from this other thread, the wire length really does seem to make a drastic difference.
I've heard quite a few people say that the voltage drop will be minimal, but I've also heard of people being unable to charge their iPad 3 with a 10 ft cable.
I'd really like to use longer cables since they are generally more convenient for using the phone while plugged in, but the charge rate is also very important. So I'd like to get a feel for what I'd be sacrificing by using longer cables.
I'm certainly overthinking this; I'm an engineer...that's what I do
Regardless, I'm planning to purchase quite a few cables and figured I should do so with some intelligence.
So with 2 votes so far for "1.5 ft or less" does that mean that people are really using even shorter cables to get better charge rates?
wire size vs voltage drop
OK, in place of my "gut feel" in my earlier post, here's just one website that allows you to enter the parameters of your cable (size in AWG gauge, voltage, length & current) & then calculates the theoretical voltage drop through your cable: http://www.powerstream.com/Wire_Size.htm (scroll down to the bottom to find the calculator). For example, according to their calculations, a three foot cable using 24 gauge wires carrying 2A current would impart a little over a 0.3VDC drop. If your charger is supplying 5VDC at the source, then only 4.7VDC would make it to your phone for use in charging at the full 2A rate. Contrast this with a ten foot cable of the same size under the same conditions, which suffers a little more than a 1VDC drop, resulting in only 4VDC being available at your phone at the full 2A rate for charging.
However, I would not expect the phone to continue to try to draw 2A of current under these conditions (particularly the 10 foot cable), else charging may not take place at all if/when the voltage is too low. Instead, I would expect that the charging circuit on the phone would diminish its current draw (to something LESS than 2A) in an attempt to keep the voltage closer to the desired 5VDC (or whatever the spec'd minimum is to charge the specific battery, assuming that the charger itself is putting out a nearly constant amount of power, somewhere near to its rated number of watts).
It's very likely because of this reduction in current that your overall charging rate is reduced (or, to put it another way, your overall charging time is increased) on longer or thinner cables, etc.
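For anyone who wants to reproduce those numbers without the website, here is a rough sketch of the same arithmetic. The resistance-per-foot figures are standard approximate values for copper wire, and the path length is doubled because the current flows out and back.

Code:
# Voltage drop across a USB cable for a given wire gauge, length and current.

OHMS_PER_FT = {24: 0.0257, 22: 0.0161, 20: 0.0101}   # approx. copper resistance

def cable_voltage_drop(length_ft, current_a, awg=24):
    round_trip_ohms = OHMS_PER_FT[awg] * length_ft * 2   # out and back
    return current_a * round_trip_ohms

for ft in (3, 6, 10):
    drop = cable_voltage_drop(ft, 2.0)
    print(f"{ft:>2} ft, 24 AWG, 2 A: drop ~ {drop:.2f} V -> ~ {5.0 - drop:.2f} V at the phone")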
YMMV . . .
Good luck & have fun !!
Mark J Culross
KD5RXT
mjculross said:
OK, in place of my "gut feel" in my earlier post, here's just one website that allows you to enter the parameters of your cable (size in AWG gauge, voltage, length & current) & then calculates the theoretical voltage drop through your cable. [...]
Thanks for that explanation. It seems that even a 6 ft USB cable will significantly slow charging, and a 10 ft one even more so, to the point that it may not even charge sometimes. So it's looking like 3ft USB cables with AC extensions where necessary is the way to go. Maybe I'll try some 1.5 ft ones as well, but I'm not sure how practical they will be for using the devices while plugged in, even with the AC extension.
If anyone has another opinion please voice it.
First of all: every night when I go to bed, I like to plug my phone in and charge it while I sleep.
6-7 hrs or so.
Is there a way to stop this fast charge feature, or turn it off? It's a cool addition; however, I feel I'm doing more harm leaving it plugged in all night??
Could I just use my old blackberry charger block instead?
markdexter said:
First of all: every night when I go to bed, I like to plug my phone in and charge it while I sleep.
6-7 hrs or so.
Is there a way to stop this fast charge feature, or turn it off? It's a cool addition; however, I feel I'm doing more harm leaving it plugged in all night??
Could I just use my old blackberry charger block instead?
Technically, yes, you can use any charger you'd like. You don't have to stick with the fast charger. I will, though, highlight that the fast charger is optimized for the S6 battery and the battery is optimized for it, so there is no harm in keeping it plugged in.
You can't damage the battery if you leave it on all night. All phones have special circuitry to stop charging once the battery is full.
Fast charge works by increasing the voltage, not the current. This is a much safer approach for increasing the power transfer from charger to phone, as it is primarily the amperage that increases thermal output, etc.
Although the phone will get warm initially while charging, all phones will. But when the battery reaches full capacity, the battery circuit actually says "okie dokie. I've got what I need now. Let's just trickle charge to keep me full till my boss is ready" and hardly any power will flow through, and the temperature will drop.
Actually makes me wonder about setting up a temp/time monitor while charging to see exactly what happens and when now :3
But as others have said, both charger and battery are optimised for it, and it is plenty safe enough. It's what I do!
There is also nothing stopping you from using any other (safe and preferably branded) 5V 1A charger. It will just charge slower, much like what you are already used to: 3-4 hours instead of 1-1.5ish hours.
solitarymonkey said:
Fast charge works by increasing the voltage, not the current.
Forgive me, I dropped my electrical engineering major, but when we're dealing with direct current, doesn't increasing the voltage by definition increase the current if the resistance doesn't change?
I used the Ampere app to compare the regular charger, from which the phone pulled around half an amp, with the fast charger, from which the phone pulled a full amp.
Sallyty said:
Battery life depends on the number of repeated charge and discharge, so should avoid charging the battery is more than power, it will shorten the battery life.
I'm not really sure what you mean by "so should avoid charging the battery is more than power", but battery lifespan in lithium batteries is decreased by FULL discharges and recharges. The best possible routine for making lithium batteries last is to charge early and often. And as genetichazzard pointed out, there is circuitry included that stops the charging (or trickles it) once it reaches full charge.
"Rapid charging", in general, will cost you battery life, but that is usually in reference to 4A-6A rapid charging, whereas this new Samsung charger still does not exceed 2A. I trust their battery engineers. They've done one of two things: they have either engineered the batteries and chargers to last in their first sealed-body phone, or they are trying to screw us by making a battery/charging system that will force us to pay for a costly battery replacement. They won't stay in business much longer if they go the second route.
flu13 said:
Forgive me, I dropped my electrical engineering major, but when we're dealing with direct current, doesn't increasing the voltage by definition increase the current if the resistance doesn't change?
I used the Ampere app to compare the regular charger, from which the phone pulled around a half of an amp, with the fast charger, from which the phone pulled a full amp.
I have no idea of the complexities of the technology, or how the phone itself deals with the current from the charger. But I looked at the fast charger that came with my S6 last night, and it is rated like this:
9V 1.67A
5V 2A
The 9V output provides 15W of power, whereas the 5V output provides 10W of power.
And after a little bit of reading (I can't cite my source now as I forgot the website), it is the current that generates heat in components such as wires.
So by upping the voltage, the charger is able to transfer more energy to the phone more safely than if manufacturers continued to just increase the current.
There will be a smart switching method of some sort within the charger to go from the 5V circuit to the 9V circuit, with a slightly higher resistance to drop the current.
And before I ramble on without making much sense, that is what I have learnt
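A tiny worked example of that point: for roughly the same delivered power, the 9V mode pushes less current, and resistive heating in the cable scales with the square of the current (P_loss = I^2 x R). The 0.3 ohm cable resistance is just an assumed example value.

Code:
# Compare cable heating for the two rated outputs of the fast charger.

R_CABLE = 0.3   # ohm, assumed round-trip cable + connector resistance

for volts, amps in [(5.0, 2.0), (9.0, 1.67)]:
    power = volts * amps
    loss = amps ** 2 * R_CABLE
    print(f"{volts} V x {amps} A = {power:.1f} W, cable heating ~ {loss:.2f} W")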
---------- Post added at 02:05 PM ---------- Previous post was at 01:51 PM ----------
Sallyty said:
I think you are right, maybe phones have special circuitry to stop charging once the battery is full.
You are right. Chargers and rechargeable batteries as a whole have been getting "more intelligent" over recent years.
No idea exactly how they do it, but I know that a lot of batteries have chips in them that monitor things such as charge capacity and "health". So I am assuming that they switch to a higher-resistance circuit when the battery is full, so that only a very tiny current can flow, keeping the battery full without killing it.
solitarymonkey said:
No idea exactly how they do it, but I know that a lot of batteries have chips in them that monitor things such as charge capacity and "health". So I am assuming that they switch to a higher-resistance circuit when the battery is full, so that only a very tiny current can flow, keeping the battery full without killing it.
Almost always, the circuit is built into the charging device, not the battery. In the case of phone batteries, the phone is the charging device.
Link to more than any non battery engineer needs to know about lithium-ion batteries and charging.
DevonSloan said:
Almost always, the circuit is built into the charging device, not the battery. In the case of phone batteries, the phone is the charging device.
Link to more than any non battery engineer needs to know about lithium-ion batteries and charging.
Thanks for the link/info. A while after I said all that, I started thinking that it couldn't be right.
The phone does the regulation, but I'm pretty sure they (the batteries) do have an integrated chip for health stuff.
Cheers again for the correction!