[Q] USB Cable Length and Charge Rate? - General Questions and Answers

So I'm looking to purchase a slew of new chargers for myself and my wife for home, the office, the car, etc. She has a Galaxy S4 and an iPad 3; I have a Galaxy S3 and a Galaxy Note 10.1 (2014). I'm at least somewhat aware of the charger requirements and plan to supply at least 2 amps to each device in general.
My real confusion is about USB cable length. It seems most stock cables are roughly 3 ft long, and from what I can tell, going with a longer cable increases resistance, causes a voltage drop, and lowers the current/charge rate. Of course a thicker (lower-gauge) cable will help decrease this resistance, but I'm already planning to go with 24AWG cables.
So my question is, how long of a USB cable can I use before it has a noticeable effect on my charge rate? If anything over 3 ft significantly slows it down, then I'll probably just use AC extension cables instead of longer USB cables when necessary. What cable length vs. charge rate do you find acceptable?
Thanks in advance!

I have a 10-foot cord that I got from mobstub and I noticed a significant increase in the time it takes to charge. I'm not sure how degraded it is, but if there's an app or something that can show me, I'll post the stats.
Sent from my SPH-L720 using xda premium

low voltage / low current
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
Good luck & have fun !!
Mark J Culross
KD5RXT

mjculross said:
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
KD5RXT
If your cables are thin, don't even expect to deliver 2000mA over them. Conductor thickness is the key here.

bmather9 said:
So my question is, how long of a USB cable can I use before it has a noticeable effect on my charge rate? ...
At the voltages, gauges, and distances we're talking about here, you are way overthinking this. It will certainly be fine at least up to the 5-meter limit that USB imposes for data transmission.

http://forum.xda-developers.com/showthread.php?t=2545497
According to the results from this other thread, the wire length really does seem to make a drastic difference.
I've heard quite a few people say that the voltage drop will be minimal, but I've also heard of people being unable to charge their iPad 3 with a 10 ft cable.
I'd really like to use longer cables, since they make it more convenient to use the phone while it's plugged in, but the charge rate is also very important. So I'd like to get a feel for what I'd be sacrificing by using longer cables.
I'm certainly overthinking this; I'm an engineer...that's what I do
Regardless, I'm planning to purchase quite a few cables and figured I should do so with some intelligence.

So with 2 votes so far for "1.5 ft or less" does that mean that people are really using even shorter cables to get better charge rates?

wire size vs voltage drop
OK, in place of my "gut feel" in my earlier post, here's just one website that allows you to enter the parameters of your cable (size in AWG gauge, voltage, length & current) and then calculates the theoretical voltage drop through your cable: http://www.powerstream.com/Wire_Size.htm (scroll down to the bottom to find the calculator). For example, according to their calculations, a three-foot cable using 24-gauge wires carrying 2A of current would impart a little over a 0.3VDC drop. If your charger is supplying 5VDC at the source, then only 4.7VDC would make it to your phone for charging at the full 2A rate. Contrast this with a ten-foot cable of the same size under the same conditions, which suffers a little more than a 1VDC drop, leaving only 4VDC available at your phone at the full 2A rate.
However, I would not expect the phone to continue to try to draw 2A of current under these conditions (particularly the 10 foot cable), else charging may not take place at all if/when the voltage is too low. Instead, I would expect that the charging circuit on the phone would diminish its current draw (to something LESS than 2A) in an attempt to keep the voltage closer to the desired 5VDC (or whatever the spec'd minimum is to charge the specific battery, assuming that the charger itself is putting out a nearly constant amount of power, somewhere near to its rated number of watts).
It's very likely because of this reduction in current that your overall charging rate is reduced (or to put it another way, your overall charging time is increased) on lesser size cables, etc.
YMMV . . .
Good luck & have fun !!
Mark J Culross
KD5RXT
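The drop numbers above can be reproduced with a short calculation. This is a sketch, not measured data: the 25.67 Ω/1000 ft figure is the standard resistance of AWG24 copper, the 4.4 V floor is the low end of the USB2 VBUS threshold quoted later in the thread, and real cables add connector and contact resistance on top.

```python
# Sketch of the voltage-drop math the PowerStream calculator performs.
# Assumes ideal AWG24 copper and ignores connector/contact resistance.
AWG24_OHMS_PER_1000FT = 25.67  # one conductor, copper, ~20 C
USB_MIN_V = 4.4                # low end of the USB2 VBUS threshold

def cable_resistance(length_ft, ohms_per_1000ft=AWG24_OHMS_PER_1000FT):
    # Current flows out on VBUS and returns on ground, so the resistive
    # path is twice the cable length.
    return 2 * length_ft / 1000 * ohms_per_1000ft

def volts_at_phone(length_ft, current_a, source_v=5.0):
    return source_v - current_a * cable_resistance(length_ft)

def max_current(length_ft, source_v=5.0, min_v=USB_MIN_V):
    # Largest draw that still keeps the phone-end voltage above min_v,
    # modeling the phone backing off its current as described above.
    return (source_v - min_v) / cable_resistance(length_ft)

for ft in (3, 6, 10):
    print(f"{ft:2d} ft: {volts_at_phone(ft, 2.0):.2f} V at 2A, "
          f"max ~{max_current(ft):.2f} A before dropping below {USB_MIN_V} V")
```

Note that at 6 ft the phone-end voltage at 2 A is already close to the 4.4 V floor, and at 10 ft the sustainable current falls to roughly 1.2 A, which lines up with the reports in this thread of 10 ft cables charging slowly or not at all.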

mjculross said:
OK, in place of my "gut feel" in my earlier post, here's just one website that calculates the theoretical voltage drop through your cable ... YMMV . . .
Thanks for that explanation. It seems that even a 6 ft USB cable will significantly slow charging, and a 10 ft one even more so, to the point that it may not charge at all sometimes. So it's looking like 3 ft USB cables with AC extensions where necessary is the way to go. Maybe I'll try some 1.5 ft ones as well, but I'm not sure how practical they'll be for using the devices while plugged in, even with the AC extension.
If anyone has another opinion please voice it.

Related

Use iPhone charger for Nexus One?

Hi guys,
since the charger of the iPhone has a USB connector, my question is whether I can use that charger for the Nexus One too?
Since it actually supplies the same power level (both can also draw power from a PC USB port), does that mean the charger delivers current within the right specifications for the N1?
thanx in advance
exelero
How are you going to plug the iPhone 30 pin in to a micro USB? Maybe I'm not understanding you correctly.
I mean plugging the microUSB-to-USB cable of the Nexus One into the USB socket of the iPhone charger.
moto1907 said:
How are you going to plug the iPhone 30 pin in to a micro USB? Maybe I'm not understanding you correctly.
I think (s)he means the mains adapter part with the USB socket in it - in which case yes, it will work; you just need the right cable, which came with your N1.
Looking forward to when all manufacturers standardise on micro USB. For a while I thought that mini USB was micro, so I was surprised to see that the N1 was different, until I found out....
ok, now i see. Not enough coffee yet, lol
scote said:
in which case yes it will work
yeah, from the connection side, I know it will work. My question was more about whether the iPhone charger delivers the appropriate current (voltage/amperage) for the N1 - let's say, that it won't burn the N1 due to too high a voltage, or mess up the battery due to an incorrect voltage input.
I was just having a conversation on chargers with a friend of mine the other day and googled around on the subject on amperage/current/Ma of chargers..
I stumbled upon these posts (either here on XDA, Androidforums or other forums):
1. "The current rating on a voltage source is the maximum amount that the power source can deliver without exceeding its safety rating.
What this means is that if you are using some device that has a power supply with a current rating of 500mA, then it's best not to use a different power supply (at the same voltage rating) with a lower max current rating, i.e., anything < 500mA. Now of course you might be able to get away with it, but if it burns down your house then it's your fault.
A device will only pull the amount of current that it uses (assuming it is a voltage-controlled device), and this is true regardless of the current rating (hence the safety issues I discussed above). If a device says (sometimes they don't) it uses 500mA, then it uses 500mA. Maybe it doesn't use 500mA all the time, but the engineers have put that rating there for a reason. Using any power supply with the right voltage and a current rating of anything more than what the device uses is OK, because the device will only pull the current it uses.
Now, about the voltage rating: the voltage rating does not have to be exact, and different devices can tolerate different voltage ratings. The problem usually is one of current. By increasing the voltage, say, you increase the current the device uses, and then you have changed the parameters that the device was created with."
2. "And as far plugging your phone into a charger that outputs well over 850mA, don't worry about that either. Unlike voltage, the more amperage the merrier because the device will only take what it needs of the available resources."
3. "Moral of the story. Match the Voltage (5.1Volts) Meet or Exceed the 850mA rating. (which is .850 Amps) and you'll be fine."
4. "Amps are not pushed but drawn. Amps is the max the charger can provide before it gets overloaded and the voltage sags. You could use a 5-volt 10000MegaAmp charger and the device would only draw the amps it was made to draw; all the rest of the amps would stay at your electricity company. Ohm's law states Amps = Volts / Resistance."
5. "Amps are not pushed but drawn. Ohm's law states Amps = Volts / Resistance.
In other English:
P = VI, where
P = Power of device (watts) and is fixed
V = Voltage used by device (volts) and is fixed
I = Current (amps) and is decided by P/V (a fixed ratio)
So the device cannot draw more current than the fixed ratio. It may draw less current if the charger cannot supply the highest amount, but then as in one of the above posts, it simply takes longer to recharge.
With these devices, milliWatts/milliAmps are the scale; 5V is generally the fixed potential difference.
Used in a vehicle, the device is generally both drawing and expending energy (ie. charging and running say, GPS) simultaneously. This in/out situation when prolonged is the cause of the observed overheating with the original X1 battery."
Bottom line... Make sure the voltage is 5V; the charger's amperage rating doesn't really matter as long as it meets or exceeds what the device draws.
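The point these quotes keep repeating - amps are drawn, not pushed - boils down to a one-liner. A toy sketch, with illustrative numbers rather than anything from a datasheet:

```python
def charge_current(device_draw_a, charger_rating_a):
    # The charger's rating is a ceiling, not a push: the actual current is
    # whatever the device requests, capped by what the supply can deliver.
    return min(device_draw_a, charger_rating_a)

# A device that draws 850 mA (N1-style):
print(charge_current(0.85, 1.0))  # 1A wall charger: the full 0.85 A
print(charge_current(0.85, 0.5))  # 500 mA PC port: capped at 0.5 A (slow)
```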
thanx for your feedback.
Both chargers have the same output (5V, 1A), so I am already successfully using it.
blackberry storm also has micro-usb
hello two chargers and an extra usb-cable
To answer the OP's question...
yes, it works fine. I have one in my car and charge my iPod and Nexus at the same time without issue. The advanced Zune car adapter also works (in case one of the 5 people who own a Zune is reading this).
I have been using my old iPhone white plug -> USB adapter for almost a month now and I haven't had any issues.
I tried using the iPhone charger but it charges slow...it does not quick charge. Am I doing something wrong?
uhohhotdog said:
I tried using the iPhone charger but it charges slow...it does not quick charge. Am I doing something wrong?
No. If I'm up to date, and I think I am, you'll only get the 'fast charge' if you use the charger that came with your phone. The USB-to-micro-USB chargers all charge slow.
Can I use an X10 mini charger (5V DC, 850mA output) on the Nexus One? From the things I've read, I understand that I can't, but my English is not very good. Thanks.
if you're talking about using the iPhone USB cable on your N1, HAHAAHHAAHAHAHAH
if you mean using the USB-to-AC adaptor, then yeah, you can use pretty much any USB cable to charge anything..

Charger Compatibility - 550mA vs. 700mA

I have older chargers that output 5.0V / 550mA and have noticed that the Captivate's charger outputs 5.0V / 700mA.
1. Can I safely use the older, 550mA chargers with the Captivate and what will the effect be?
2. Can I safely use the Captivate's 700mA charger with the older phones and what will the effect be?
Thanks.
1. Yes, you can use it. But it will charge slower.
2. Yes, you can use it. I believe that just because the charger output is higher amperage doesn't mean that it will affect the phone adversely. Think of it this way: A lamp is plugged into the wall outlet at your house. That outlet is rated at 120v 15A. The bulb isn't using all 15 amps, so no problem. But if you were to turn the voltage up or down, the lamp will get brighter or dimmer respectively.
The Captivate can take up to a 1A (1000 mA) charger.
Truceda said:
1. Yes, you can use it. But it will charge slower. ... The Captivate can take up to a 1A (1000 mA) charger.
Another question. You sound smart on this, so how about using the Nexus One car charger on this? It fits perfectly and I don't see why not, but I'm still leery... what do you think?
As long as it puts out 5V (which is the USB standard), you are fine. The amperage rating is only relevant if the device REQUIRES it. For instance, if the device draws 1A and your charger can only handle 550mA, your device will charge very slowly.
On the other hand if your charger can handle 1.2A and your device only draws 700mA, then your charger will only output 700mA.
The important thing is the voltage, it needs to be 5v +/- 3% ...
I actually use a generic car charger I bought at walmart with 2 USB ports on it, and it works well for every USB powered device I own ... ZUNE, iPOD, phones etc.
OK. Thanks for the reply. I had read that a phone requiring 700mA that uses a 550mA charger could damage the charger and possibly the phone. That's what made me wonder. And that's what led to the question.
Let's make this a bit more interesting. There's a local, highly-reputable cell phone repair store that has stopped selling car chargers because, they say, the rapid charge is not good for the phone's battery. Their recommendation is to use an inverter (no, they don't sell them) so that you can then plug a standard wall charger into it or a USB cable if the inverter is so equipped. The AC current that results from utilizing the inverter is more consistent than the current flowing from a car charger. So...I purchased an inverter for less than $20 and use it to charge the Captivate in my car.
OK ... not sure we need to get this far down in the weeds on this but here goes ....
The USB2 standard for power distribution is 5v and the thresholds are 4.4-5.25V.
Power is supplied in units of 5v power ... 1 unit is 5v at 100mA, no device can draw more than 5 units from any one port. If you have ever seen a portable hard drive with 2 USB connectors it is because it requires more than 500mA to operate and by using 2 ports the device can draw up to 1A. For dedicated chargers the 4.4-5.25v still applies but shorting the D+/- and disabling the data connection allows the device to detect that it is connected to a dedicated charging port and draw a maximum of 1.8A.
In keeping with the above guidelines, when connected to your computer the Captivate can draw no more than 1 unit of power, which is 100mA @ 5V; when connected to a dedicated charger, the phone can draw up to 1.8A @ 5V and stay within the standard. (Yes, it caps itself at 1A, I know.)
OK ... the next bit is going to be hard to digest because there are plenty of examples to the contrary ... there is a standard for mobile USB chargers, and it requires wiring them as dedicated charging ports. What this means to us is that, in theory anyway, a mobile USB charger should allow a device to draw up to 1.8A from it (highly unlikely ... but that's the standard as written).
Here is the problem: if the device is plugged into a dedicated charging port and tries to draw its maximum rated current, that amount of current may not always be available, or it may fluctuate. This fluctuation is what causes problems. Have you ever turned your car stereo up real loud and seen your headlights dim in beat with the music? Same thing - the power system is being drawn down. There are a couple of ways to stabilize your power system: install a large capacitor (mine is 2 Farad) to provide "conditioning", or go the transformer route. A transformer provides conditioning, but only on its own outputs, while a large cap will condition the entire power system if installed correctly.
So yes, using a quick charger on your phone can cause issues if your car has a ****ty power system or a large stereo system which is not set up properly (again, ****ty power system). Make sure your charging device is within the standard, and you should be fine whether it is USB via a cigarette lighter port or a 110V transformer.
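The per-port limits described above can be summarized in a few lines. A sketch - the helper name is mine, and the values are the ones quoted in this post (USB2 100 mA units plus the 1.8 A dedicated-charging-port ceiling):

```python
USB2_UNIT_MA = 100        # one "unit": 5 V at 100 mA
MAX_UNITS_PER_PORT = 5    # no device may draw more than 5 units from a port

def port_limit_ma(port_type):
    # Hypothetical helper mapping port type to the max current (mA) a
    # device may draw under the limits described above.
    limits = {
        "computer": MAX_UNITS_PER_PORT * USB2_UNIT_MA,  # 500 mA
        "dedicated_charger": 1800,  # D+/D- shorted, data disabled
    }
    return limits[port_type]

print(port_limit_ma("computer"), port_limit_ma("dedicated_charger"))
```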
I appreciate your detailed, helpful reply.
Other than the important 5v parameter, what I've taken away from your information is that a car charger can be used in a vehicle with a power supply that is known to be stable, and that either a whole-car conditioning system or an inverter should be used on one with a, shall we say, "less than stable" power supply (PG version ).
Jack45 said:
I appreciate your detailed, helpful reply. ...
Pretty much ...
There's more than one way to skin a cat, and that Transformer will cost you a whole lot less than a 2 Farad capacitor. I hate cats, but they serve a purpose.
@Battlehymn - one more question for you
I have converted from iPhone to the captivate (no haters please) - finally found a physical form factor with specs i like and the captivate rocks.
Anyway, I have some extra external batteries I used with my iPhone that I want to use with my Captivate. I just bought a female iPod connector and I am planning to connect it to a micro USB connector - the pinout is straightforward, but here is my question:
Should I connect the D+/- (short them together)? That is my plan. My batteries are 1900 or 1000 mAh - I assume that even if the phone tries to draw 1.8A, the batteries have a circuit to limit how fast they discharge.

Question about chargers

I have an HTC Explorer, which came with a USB cable and an adapter to allow me to charge via mains as well. (Something like this: http://i.expansys.com/i/g/g232723.jpg - that's not the exact same one, but you get the idea). This charger says on it that it has an output of 5V and 1A. I also have a Blackberry Playbook, which came with a charger with a micro USB connector. However, this charger says on it that it has an output of 5V and 2A.
I have tested the HTC charger with my playbook, and it worked, but the question I want to ask is this: would it be dangerous to try the playbook charger on my HTC phone? (2A output from Playbook charger versus the presumably expected 1A input on the phone) I would like to be able to only carry one charger when travelling and the playbook charger has interchangeable adapters for international plugs, so it would be better to take, but obviously I don't want to overload the phone and have it burst into flames or whatever.
Any advice on the matter would be great, thanks.
It's not recommended by the manufacturer. It can void your warranty, though I don't know how they'd ever find out you were doing so. There can also be issues when using the cable to transfer data and such.
I did watch a video about that Blackberry charger.
Supposedly because of that 2A output, it charges your phone twice as fast.
Sent from my Ainol Novo7 Elf using xda premium
I don't think there would be considerable damages but..I'd just use them both, even if it's less comfortable
I looked around and I think it should be safe. From what I've read online, the phone will only take a certain amount of current, regardless of the current being made available by the charger. Because the voltage is the same (5V), and the resistance of the phone circuitry is constant, by Ohm's law, I = V/R, so I will always be the same as long as V is the same. Presumably it'll drop with a lower input current, but the max I will always be the same, and that'll be limited to a safe level.
It might take a slightly higher current (say if the max the phone can take is 1.2A or something, the 1A charger can only give 1A but the 2A would give the full 1.2A), but no higher than the circuits in the phone will allow. After reading this, I realised that it made a lot of sense, and I think it's right. I also read that if it does charge the phone faster (which it will if the phone is taking a higher current e.g. 1.2A) it'll reduce the number of charging cycles that you get out of the battery. But the consensus seems to be that doing it every now and again when travelling etc. should be fine.
Thanks for all your responses.
EDIT: I think at worst, I might damage the battery, and they're not overly expensive to replace I don't think.
I agree - the rating on the charger is its MAX output; it won't push that much current to your device. If you have an extra USB cable (that you don't mind cutting apart) and a multimeter, you can check how much current your phone is pulling from the BB PlayBook charger: connect your multimeter in line with the red wire in the USB cable (just connect all the others straight through).
Just make sure your multimeter is rated for at least 2A.
Hope this helps.
Devices with lithium batteries usually have a charger circuit that limit the peak current that is sent to the battery. They also utilize temperature compensation so if you are charging the battery too fast, it starts limiting the charge current.
Where you might have a problem is when it's plugged in and you are using it, especially if the battery is low. You get high charge currents, combined with the operating current.

Understanding AMPs and Charge Times

So, after doing a smidgen of research I found out that charge time relates directly to the amp output of the wall adapter. Now, on to my question...
I bought a 10-foot charger cable from a local 7-11, knowing it was probably not of any quality but thinking it would at least charge my phone. It didn't - my S4 would actually lose charge while connected. Now that I have a small understanding of the amp thing, I can see how the standard 2.0 amp Samsung charger would have trouble producing enough power to travel 10 feet AND THEN charge my S4.
My question is, if I were to get a higher-rated adapter that's been tested at, say, 4-5 amp output, would that produce enough power to travel 10 feet AND THEN charge my S4? And what variables would be involved? With it being a cheaper USB cord, could the extra power burn the cord out? Are USB cords rated for the power they can safely carry? Does the Galaxy S4 have any safety precautions in the event that TOO much power comes through?
A charger with more than a 2.1A rating will not do anything for you. Or I should say, the Galaxy will not accept more than a 2.1A rating. Whether it helps "push" the current through the cable I dunno.
But for reference, any high quality cable longer than 6' is going to suffer from a lower charging rate.
Also, there are two types of cables. You have "charging cable" and "data cable". The data cable is not going to give you high charging rates. It's about 1/4 the charging speed actually.
Go to amazon.com and look up red label charging cables for some good product.
Ok... physics won't change just because you don't understand physics.
The longer the cable, the bigger the voltage drop, and you'll see a lower current flow/charging rate. 3-5 feet is optimum... ever notice that's all OEMs give you?
The 7-11 cord is probably garbage.
Get a few 3 ft cords from Monoprice.
Also, the phone DRAWS amps; the charger does not PUSH amps. You can hook up a 5-amp charger, but if the phone was designed to only draw 1.8A or 2.0A, that is all it will draw - you can't shove more current into it.
If you want a good second charger that does higher rates, Anker makes a good 40W charger that will work to charge iPads, S4s, iPhones and so on - it'll provide enough power for all those devices.

Can someone explain how USB phone charging works? Volts,Watts, AMPS, etc?

I need a lesson on how USB charging works and what Watts, Volts, and Amps are and how they are different.
Also, does the USB cable play a role in the charging process? As in, are all USB cables capable of transferring the same amount of power?
I have a LG G3 and want to buy a portable charger, but want to make sure I will buy one that is fast enough and will be supported.
Any help is appreciated.
Volts, amps, watts? Check out the Wiki, they're doing great.
Of course, the cable plays a role in the process, e-very-thing plays a role!
The original USB specification was not developed with high power delivery in mind, hence it all started with a measly 5V - something which can pose a real problem when delivering high currents because, as voltage drop is proportional to conductor resistance, nasty cheapass cables can cripple current flow badly (with so few volts to begin with, we can't afford to drop many).
Wanna see an example of a “that sh1te rocks” cable? Check this out: No-frills USB to MicroUSB heavy-duty Cat-5e 5ft/1.5m cable. (No one trashes Jersey City! Uuuh, but me. :laugh:)
Brief summary: for a cable, gauge/cross-section is your ally, length your enemy. Aiming to keep the same voltage drop between the supply (charger) and the load (device), the wire cross-section will have to be proportional to its length (equal conductor resistance).
Tips: a reduction of 3 points on the AWG scale is about equivalent to doubling the cross-section; -6 points means around double the diameter. For example, AWG22 wire has about 4 times the cross-section of AWG28 (4 times the current delivery, length being equal, of course). And that (AWG22) is about the minimum gauge required to guarantee 2.1A of current flow through 2 meters of cable while still having some chance of meeting the USB 2.0 specification (±5% in voltage).
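Those gauge rules of thumb follow from the standard AWG formula: the diameter doubles every -6 AWG steps, so the cross-section doubles every -3. A quick sanity check, assuming plain solid copper wire:

```python
import math

def awg_diameter_mm(n):
    # Standard AWG definition: AWG36 is 0.127 mm, and the diameter grows
    # by a factor of 92**(1/39) per step toward lower gauge numbers.
    return 0.127 * 92 ** ((36 - n) / 39)

def awg_area_mm2(n):
    return math.pi / 4 * awg_diameter_mm(n) ** 2

# -3 AWG ~ double the cross-section; AWG22 vs AWG28 ~ 4x:
print(f"area(19)/area(22) = {awg_area_mm2(19) / awg_area_mm2(22):.2f}")
print(f"area(22)/area(28) = {awg_area_mm2(22) / awg_area_mm2(28):.2f}")
```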
Regarding the PSU (yes, Power Supply Unit): buy a decent brand-name one. These things charge nothing; they're just power supplies (I've only seen one “charger” with active load-voltage compensation - an LG unit which I converted to microUSB and have lying around somewhere).
The power management circuitry of your device is the master that owns your battery, not the c̶h̶a̶r̶g̶e̶r̶ supply. The battery receives the remaining energy available from that circuitry (up to the maximum charging current it is programmed to deliver) after feeding the guts of your device (and minus inefficiencies). That's the reason your phone barely charges if you're USB-connected (500mA) while operating it: only around 2.5W of power is being delivered (5V × 500mA) and, after subtracting SoC + screen consumption plus power-conversion inefficiencies, you may still be lucky if your battery “gets some”.
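That power-budget argument can be put in numbers. A toy sketch - the 85% conversion efficiency and the 2 W screen-on draw are illustrative assumptions, not measurements:

```python
def battery_power_w(supply_v, supply_a, device_draw_w, efficiency=0.85):
    # Power left over for the battery after conversion losses and the
    # device's own consumption. Clamped at zero: with nothing left over,
    # the battery simply discharges more slowly rather than charging.
    available = supply_v * supply_a * efficiency
    return max(0.0, available - device_draw_w)

print(battery_power_w(5.0, 0.5, 2.0))  # 500 mA USB port, screen on
print(battery_power_w(5.0, 2.0, 2.0))  # 2 A wall supply, screen on
```

With a 500 mA port the battery gets almost nothing while the screen is on, whereas a 2 A supply leaves several watts for charging - the "gets some" situation described above.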
Charger reviews: http://lygte-info.dk/
Don't know about your device but, if it supports Qualcomm QC protocols you may as well benefit from a QC2.0 compliant charger.
Hope this helps. :silly:
Cheers
