Can someone explain how USB phone charging works? Volts, watts, amps, etc.?

I need a lesson on how USB charging works and what Watts, Volts, and Amps are and how they are different.
Also, does the USB cable play a role in the charging process? As in, are all USB cables capable of transferring the same amount of power?
I have an LG G3 and want to buy a portable charger, but I want to make sure I buy one that is fast enough and will be supported.
Any help is appreciated.

Volts, amps, watts? Check out the Wiki; it covers those basics well.
Of course, the cable plays a role in the process, e-very-thing plays a role!
The original USB specification was not developed with high power delivery in mind, hence it all started with a measly 5V. That can be a real problem when delivering high currents: voltage drop is proportional to conductor resistance, so nasty cheap cables can cripple current flow badly (and with only 5V to begin with, we can't afford to drop much of it).
Wanna see an example of a “that sh1te rocks” cable? Check this out: No frills USB to MicroUSB Heavy-duty Cat-5e 5 ft/1.5 m cable. (No one trashes Jersey City! Uuuh, but me. :laugh:)
Brief summary: for a cable, gauge (cross-section) is your ally, length your enemy. To keep the same voltage drop between the supply (charger) and the load (device), the wire cross-section has to grow in proportion to its length (equal conductor resistance).
Tips: a reduction of 3 points on the AWG scale is roughly equivalent to doubling the cross-section; −6 points means roughly double the diameter. For example, AWG22 wire has about 4 times the cross-section of AWG28 (4 times the current delivery, length being equal, of course). And AWG22 is about the minimum gauge required to push 2.1A through 2 meters of cable while still having some chance of meeting the USB 2.0 specification (±5% in voltage).
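If you want to put numbers on that, here's a rough sketch in Python (assuming plain copper at room temperature and the standard AWG diameter formula; real cables add connector and contact resistance on top):

```python
import math

RHO_CU = 1.72e-8  # resistivity of copper at room temperature, ohm*m

def awg_area_m2(awg: int) -> float:
    """Cross-sectional area from the standard AWG diameter formula."""
    d_m = 0.000127 * 92 ** ((36 - awg) / 39)
    return math.pi * (d_m / 2) ** 2

def drop_volts(awg: int, length_m: float, amps: float) -> float:
    """IR drop over the cable; x2 because current goes out on VBUS and back on GND."""
    resistance = RHO_CU * (2 * length_m) / awg_area_m2(awg)
    return amps * resistance

for awg in (28, 24, 22, 20):
    dv = drop_volts(awg, 2.0, 2.1)
    print(f"AWG{awg}, 2 m @ 2.1 A: {dv:.2f} V drop ({dv / 5 * 100:.0f}% of 5 V)")
```

AWG28 loses well over a volt under those conditions, which is why thin charging cables feel so slow.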
Regarding the PSU (yes, Power Supply Unit): buy a decent brand-name one. These things charge nothing, they're just power supplies (I've only seen one “charger” with active load-voltage compensation: an LG unit which I converted to microUSB and have lying around somewhere).
The power-management circuitry of your device, not the c̶h̶a̶r̶g̶e̶r̶ supply, is the master that owns your battery. The battery receives whatever energy is left over from that circuitry (up to the maximum charging current it is programmed to deliver) after feeding the guts of your device, and minus inefficiencies. That's why your phone barely charges while you're operating it on a USB connection (500mA): only around 2.5W of power is being delivered (5V × 500mA), and after subtracting SoC + screen consumption plus power-conversion inefficiencies, you may still be lucky if your battery “gets some”.
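As a back-of-the-envelope sketch of that budget (the 85% conversion efficiency and the 2 W SoC + screen figure below are assumptions for illustration, not measurements):

```python
SUPPLY_V, SUPPLY_A = 5.0, 0.5   # standard USB 2.0 data port
EFFICIENCY = 0.85               # assumed charging-circuit efficiency
DEVICE_DRAW_W = 2.0             # assumed SoC + screen consumption

input_w = SUPPLY_V * SUPPLY_A                      # 2.5 W at the connector
battery_w = input_w * EFFICIENCY - DEVICE_DRAW_W   # what's left for the cell
print(f"Left for the battery: {battery_w:.2f} W")  # ~0.12 W: barely charging
```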
Charger reviews: http://lygte-info.dk/
Don't know about your device, but if it supports Qualcomm QC protocols you may well benefit from a QC 2.0-compliant charger.
Hope this helps. :silly:
Cheers

Related

Use iPhone charger for Nexus One?

Hi guys,
since the charger of the iPhone has a USB connector, my question is whether I can use that charger for the Nexus One too.
Since it actually supplies the same power level (both can also draw power from a PC USB port instead), does this mean that the charger delivers current within the right specifications for the N1?
thanx in advance
exelero
How are you going to plug the iPhone 30 pin in to a micro USB? Maybe I'm not understanding you correctly.
I mean to plug the microUSB-to-USB cable of the Nexus One into the USB socket of the iPhone charger.
moto1907 said:
How are you going to plug the iPhone 30 pin in to a micro USB? Maybe I'm not understanding you correctly.
Click to expand...
Click to collapse
I think (s)he means the mains adapter part with the USB socket in it - in which case yes, it will work; you just need the right cable, which came with your N1.
Looking forward to all manufacturers standardising on micro USB. For a while I thought that mini USB was micro, so I was surprised to see that the N1 was different; then I found out....
OK, now I see. Not enough coffee yet, lol.
scote said:
in which case yes it will work
Click to expand...
Click to collapse
Yeah, I know it will work from the connection side. My question was more about whether the iPhone charger delivers the appropriate voltage/current for the N1 - that, let's say, it won't burn the N1 with too high a voltage, or mess up the battery with an incorrect voltage input.
I was just having a conversation about chargers with a friend of mine the other day and googled around on the amperage/current/mA of chargers...
I stumbled upon these posts (either here on XDA, Androidforums or other forums):
1. "The current rating on a voltage source is the maximum amount that the power
source can deliver without exceeding its saftey rating.
What this means is that if you are using some device that has a power supply
with a current rating of 500mA then its best not to use a different power
supply(at the same votlage rating) with a lower max current rating. i.e.,
anything < 500mA. Now ofcourse you might be able to get away with it but if
it burns down your house then its your fault.
A device will only pull the amount of current that it uses(assuming it is a
voltage controlled device) and this is true regardless of the current
rating(hence the saftey issues I discussed above). If a device
says(sometimes they don't) it uses 500mA then it uses 500mA. Maybe it
doesn't use 500mA all the time but the engineers have put that rating there
for a reason. Using any power supply with the right voltage and a current
rating of anything more than what the device uses is ok because the device
will only pull the current it uses.
Now, about the voltage rating: The voltage rating does not have to be exact
and different devices can tolerate different voltage ratings. The problem
usually is one of current. By increasing the voltage, say, you increase the
current the device uses and then you have changed the parameters that the
device was created with."
2. "And as far plugging your phone into a charger that outputs well over 850mA, don't worry about that either. Unlike voltage, the more amperage the merrier because the device will only take what it needs of the available resources."
3. "Moral of the story. Match the Voltage (5.1Volts) Meet or Exceed the 850mA rating. (which is .850 Amps) and you'll be fine."
4. "amps are not pushed but drawn
amps is the max the charger can provide
before it get pressured and lover the volts
you could use a 5volt 10000MegaAmp charger
and the device would only draw the amps the device
was made to draw all the rest of the amps would stay
at your electricity company
ohms law state Amps == volts / residence"
5. "amps are not pushed but drawn
ohms law state Amps == volts / residence
In other English:
P = VI, where
P = Power of device (watts) and is fixed
V = Voltage used by device (volts) and is fixed
I = Current (amps) and is decided by P/V (a fixed ratio)
So the device cannot draw more current than the fixed ratio. It may draw less current if the charger cannot supply the highest amount, but then as in one of the above posts, it simply takes longer to recharge.
With these devices, milliWatts/miliAmps are the scale, 5V is generally the fixed Potential Difference.
Used in a vehicle, the device is generally both drawing and expending energy (ie. charging and running say, GPS) simultaneously. This in/out situation when prolonged is the cause of the observed overheating with the original X1 battery."
Click to expand...
Click to collapse
Bottom line... make sure the voltage is 5V; the amperage rating only sets a ceiling, so as long as it meets what the phone wants, more doesn't hurt.
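A one-liner version of the rule those quotes keep repeating (numbers purely illustrative):

```python
def charge_current_ma(device_wants_ma: int, charger_rated_ma: int) -> int:
    """Devices draw, chargers don't push: the draw is capped by both sides."""
    return min(device_wants_ma, charger_rated_ma)

print(charge_current_ma(850, 500))    # 500 -> undersized supply, slower charge
print(charge_current_ma(850, 2000))   # 850 -> extra headroom is harmless
```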
thanx for your feedback.
Both chargers have the same output (5V, 1A) - so I am already using it successfully.
BlackBerry Storm also has micro-USB.
Hello - two chargers and an extra USB cable.
To answer the OP's question...
Yes, it works fine. I have one in my car and charge my iPod and Nexus at the same time without issue. The Advanced Zune car adapter also works (in case one of the 5 people who own a Zune is reading this).
I have been using my old iPhone white-plug USB adapter for almost a month now and I haven't had any issues.
I tried using the iPhone charger but it charges slow...it does not quick charge. Am I doing something wrong?
uhohhotdog said:
I tried using the iPhone charger but it charges slow...it does not quick charge. Am I doing something wrong?
Click to expand...
Click to collapse
No. If I'm up to date, and I think I am, you'll only get the 'fast charge' if you use the charger that came with your phone. The USB-to-micro-USB chargers all charge slowly.
Can I use an X10 mini charger (5V DC, 850mA output) on the Nexus One? From the things I've read I understand that I can't, but my English is not very good. Thanks.
If you're talking about using the iPhone USB cable on your N1: HAHAHAHA.
If you mean using the USB-to-AC adaptor, then yeah, you can use pretty much any USB cable to charge anything.

Charger Compatibility - 550mA vs. 700mA

I have older chargers that output 5.0V / 550mA and have noticed that the Captivate's charger outputs 5.0V / 700mA.
1. Can I safely use the older, 550mA chargers with the Captivate and what will the effect be?
2. Can I safely use the Captivate's 700mA charger with the older phones and what will the effect be?
Thanks.
1. Yes, you can use it. But it will charge slower.
2. Yes, you can use it. I believe that just because the charger output is higher amperage doesn't mean that it will affect the phone adversely. Think of it this way: A lamp is plugged into the wall outlet at your house. That outlet is rated at 120v 15A. The bulb isn't using all 15 amps, so no problem. But if you were to turn the voltage up or down, the lamp will get brighter or dimmer respectively.
The Captivate can take up to a 1A (1000 mA) charger.
Truceda said:
1. Yes, you can use it. But it will charge slower.
2. Yes, you can use it. I believe that just because the charger output is higher amperage doesn't mean that it will affect the phone adversely. Think of it this way: A lamp is plugged into the wall outlet at your house. That outlet is rated at 120v 15A. The bulb isn't using all 15 amps, so no problem. But if you were to turn the voltage up or down, the lamp will get brighter or dimmer respectively.
The Captivate can take up to a 1A (1000 mA) charger.
Click to expand...
Click to collapse
Another question, since you sound smart on this: how about using the Nexus One car charger on this? It fits perfectly and I don't see why not, but I'm still leery... what do you think?
As long as it puts out 5V (which is the USB standard), you are fine. The amperage is only relevant if the device REQUIRES it. For instance, if the device draws 1A and your charger can only handle 550mA, your device will charge very slowly.
On the other hand, if your charger can handle 1.2A and your device only draws 700mA, then your charger will only output 700mA.
The important thing is the voltage; it needs to be 5V +/- 5% ...
I actually use a generic car charger I bought at Walmart with 2 USB ports on it, and it works well for every USB-powered device I own ... Zune, iPod, phones, etc.
OK. Thanks for the reply. I had read that a phone requiring 700mA that uses a 550mA charger could damage the charger and possibly the phone. That's what made me wonder. And that's what led to the question.
Let's make this a bit more interesting. There's a local, highly-reputable cell phone repair store that has stopped selling car chargers because, they say, the rapid charge is not good for the phone's battery. Their recommendation is to use an inverter (no, they don't sell them) so that you can then plug a standard wall charger into it or a USB cable if the inverter is so equipped. The AC current that results from utilizing the inverter is more consistent than the current flowing from a car charger. So...I purchased an inverter for less than $20 and use it to charge the Captivate in my car.
OK ... not sure we need to get this far down in the weeds on this but here goes ....
The USB 2.0 standard for power distribution is 5V, and the thresholds are 4.4-5.25V.
Power is supplied in units of 5V power ... 1 unit is 5V at 100mA, and no device can draw more than 5 units from any one port. If you have ever seen a portable hard drive with 2 USB connectors, it is because it requires more than 500mA to operate, and by using 2 ports the device can draw up to 1A. For dedicated chargers the 4.4-5.25V still applies, but shorting D+/D- and disabling the data connection allows the device to detect that it is connected to a dedicated charging port and draw a maximum of 1.8A.
In keeping with the above guidelines, when connected to your computer the Captivate can draw no more than 5 units of power, which is 500mA @ 5V; when connected to a dedicated charger the phone can draw 1.8A @ 5V and stay within the standard. (Yes, it caps itself at 1A, I know.)
OK ... the next bit is going to be hard to digest because there are plenty of examples to the contrary ... there is a standard for mobile USB chargers, and it requires wiring them as dedicated charging ports. What this means to us is that, in theory anyway, a mobile USB charger should allow a device to draw up to 1.8A from it (highly unlikely ... but that's the standard as written).
Here is the problem: if the device is plugged into a dedicated charging port and tries to draw its maximum rated current, that amount of current may not always be available, or it may fluctuate. This fluctuation is what causes problems. Have you ever turned your car stereo up real loud and seen your headlights dim in time with the music? Same thing: the power system is being drawn down. There are a couple of ways to stabilize your power system ... install a large capacitor (mine is 2 farads) to provide "conditioning", or go the transformer route. A transformer provides conditioning, but only on its own outputs, while a large cap will condition the entire power system if installed correctly.
So yes, using a quick charger on your phone can cause issues if your car has a ****ty power system or a large stereo system which is not set up properly (again, ****ty power system). Make sure your charging device is within the standard, and you should be fine whether it is USB via a cigarette-lighter port or a 110V transformer.
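A crude model of that "headlights dim with the bass" effect, treating the car's electrical system as a supply with internal resistance (both numbers below are made up for illustration):

```python
V_NOMINAL = 12.6   # healthy car electrical system, volts
R_SOURCE = 0.05    # assumed effective source resistance, ohms

for load_amps in (5, 20, 60):   # phone charger alone vs. stereo hitting hard
    v_bus = V_NOMINAL - load_amps * R_SOURCE
    print(f"{load_amps:>3} A total draw -> bus sags to {v_bus:.2f} V")
```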
I appreciate your detailed, helpful reply.
Other than the important 5V parameter, what I've taken away from your information is that a car charger can be used in a vehicle with a power supply that is known to be stable, and that either a whole-car conditioning system or an inverter should be used in one with a, shall we say, "less than stable" power supply (PG version).
Jack45 said:
I appreciate your detailed, helpful reply.
Other than the important 5V parameter, what I've taken away from your information is that a car charger can be used in a vehicle with a power supply that is known to be stable, and that either a whole-car conditioning system or an inverter should be used in one with a, shall we say, "less than stable" power supply (PG version).
Click to expand...
Click to collapse
Pretty much ...
There's more than one way to skin a cat, and that transformer will cost you a whole lot less than a 2-farad capacitor. I hate cats, but they serve a purpose.
@Battlehymn - one more question for you
I have converted from iPhone to the Captivate (no haters please) - I finally found a physical form factor with specs I like, and the Captivate rocks.
Anyway, I have some extra external batteries I used with my iPhone that I want to use with my Captivate. I just bought a female iPod connector and I am planning to connect it to a micro USB connector - the pinout is straightforward, but here is my question:
Should I connect the D+/D- (short them together)? That is my plan. My batteries are 1900 or 1000 mAh - I assume that even if the phone tries to draw 1.8A, the batteries have a circuit that limits how fast they discharge.

[Resolved] Can I charge headphones with a higher-mA charger without issues?

I've spent a lot of time researching, asking Sony support, googling... but I don't have an answer yet...
I have got (everything is 5.0V DC):
Xperia Tablet Z (it comes with a 1500mA charger)
MDR-1RBT BT headphones (it comes with no charger; charges only via a USB PC port - 500mA?)
MW600 BT headset (it comes with a 350mA charger)
Can I charge all of them with the 1500mA charger without ANY issues?
Please help me, I haven't been able to get an answer for a long time...
Please help.
BUMP
It'll eventually damage it
Sent from my Karbonn A15 using XDA Premium 4 mobile app
Will damage the headphones
Sent from my GT-I9100G using xda app-developers app
arvin07143 said:
Will damage the headphones
Sent from my GT-I9100G using xda app-developers app
Click to expand...
Click to collapse
So I can charge it only via USB to a PC?
A Sony support rep said I can charge it with an adaptor with 1500mA specs...
I don't understand it at all...
BUMP!
This one is not quite so straightforward without knowing about the internal circuitry of those devices you have in regards to how they charge.
Now, in general, it is safe to say that you must always match the voltage and polarity of the device you are charging; often you can use a charger or power supply with a higher current rating.
Why?
A device typically only draws as many amps as it needs in order to function.
Take your TV, for example: it probably only needs 2-3 amps to run, yet it is plugged straight into the mains, which can supply many hundreds of amps - and your TV doesn't "fry", it simply draws what it needs.
When it comes to charging, it's a bit more complicated and depends on the device (does it have internal charging circuitry, or is the item you plug into the wall an actual charger and not simply an AC/DC converter?).
Most small electronics which contain a battery and are simply supplied with an AC/DC converter (say, input 120V 1A, output 5V 100mA) have their own current-regulating charging circuitry inside. This means that the thing you plug into the wall isn't actually a charger; it's an AC/DC converter.
In that case the converter can have a higher rated amp output than your device states it needs, because your device will only draw what it wants.
If what you are plugging into the wall is an actual charger, which has the job of regulating the current flow to the device being charged, then you will want to get one as close to the original's spec as possible so as not to damage your device.
All the devices you list seem to charge via 5V, most likely a USB type interface.
This means that they are not supplied with actual chargers, simply power supplies; the charging and current-limiting mechanism is inside each device.
In theory, therefore, you should be able to 'charge' all your devices via the 1500mA AC/DC converter; the products you have will only draw what they need in order to charge, and the 1500mA will not be pushed or forced upon each device.
Also, think of this: you have been smart and come to ask for advice, but how many people simply use the USB 'charger' of their husband/wife/sister/brother to charge their phone when they can't find their own, and those devices aren't damaged?
The only thing is, if the current rating is too low then it will either not charge or take much longer.
I hope that helps.
zasy99 said:
This one is not quite so straightforward without knowing about the internal circuitry of those devices you have in regards to how they charge. [...] In theory, therefore, you should be able to 'charge' all your devices via the 1500mA AC/DC converter; the products you have will only draw what they need in order to charge, and the 1500mA will not be pushed or forced upon each device.
Click to expand...
Click to collapse
I know that Xperia tablets and phones won't take more mA than they need, but does the MDR-1RBT do the same thing? I did a lot of research but I still can't find a 100% answer. I don't want to try unless I'm 100% sure. (They're expensive headphones.)
EDIT:
Some Sony support reps said: you can charge it only via USB cable to a personal computer.
Other Sony support reps said: you can charge it via USB cable to a PC or an adaptor (and linked me to a USB AC adaptor with THE SAME specs as the XTZ charger - 1500mA).
Is that adaptor the same as the XTZ one?
I found this on OFFICIAL Sony eSupport
http://docs.esupport.sony.com/portable/MDR1RBT_guide/en/contents/01/03/01/01.html?search=charg
OFFICIAL TIPS:
Tips
If the micro-USB cable is connected to a computer while the headset is turned on, the headset will be turned off automatically.
To charge the headset from an AC outlet, use USB Charging AC Power Adaptor* (sold separately). For details, refer to the manuals supplied with the USB Charging AC Power Adaptor.
The headset cannot be turned on while charging the battery.
* Refer to the Reference Guide on the recommended adapter.
Click to expand...
Click to collapse
BUMP
zasy99's answer is spot on. Once the voltage matches your device, you're OK. With most phone chargers you can pull out the USB charging cable; you could use that, or buy another USB charger rated 1500mA to 2000mA.
Sent from my HTC Desire S using xda premium
E7ite: You should be fine; the fact that it charges via USB means that it must have its own internal charging circuitry.
I bet your Xperia tablet also charges via a USB type cable right, on a USB port?
All my USB 'chargers' have different mA ratings, yet they charge my products just fine as long as the mA rating is at LEAST what the device needs, not less - and hey, if it's a little less it also works, it just takes longer.
Look at it this way: the USB specs are always changing, as is their current-handling capability. Now we are at USB 3.1; just recently we were at USB 3.0, which bumped the maximum device current up to 900mA during data transmission, or 1.5A for dedicated charging.
For USB 1.x - 2.0 it was 100mA before negotiation and up to 500mA thereafter, also with a dedicated charging capacity of 1.5A if the D- & D+ pins are shorted.
I bet Sony doesn't tell you that you can only charge via a USB 1.x - 2.0 port, do they? Nope. That means if your computer has a USB 3 port, it has the potential to deliver more. Remember, that's the key: it CAN deliver more if needed, but it won't - it will only give the device what it draws.
The cable they supply with your headphones is most likely not a data cable but a so-called charging cable, and is therefore rated up to 1.5A or 1500mA - but your headphones will only draw what they need to charge.
Everyone uses the term charger, which is so wrong. When you put your AA or AAA batteries into the little black box and plug it into a wall, that is a charger; when you connect your car battery via the +/- terminals and clamps to the charger, that is also a real charger. They are directly responsible for charging the batteries.
All these other things we use to 'charge' our little electronic devices do nothing other than take the 120V AC and convert it into, for example, 5V DC. The current rating they indicate next to the 5V is simply the maximum current-handling capability, nothing more; it's not what is pushed to the device, it's the most the device can request. The actual charging goes on inside the device.
By the way, as the battery charges, the device reduces the current and sometimes also the voltage. Batteries typically charge fastest when they are low and can absorb more current; as they fill up and can absorb less, the charging circuit tapers off to a more or less trickle charge, such as 100mA or less, to top up but not overcharge the battery.
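A toy picture of that taper (a sketch only; real charge controllers are smarter, and the numbers here are invented):

```python
def charge_current_ma(state_of_charge: float) -> float:
    """Full current while the battery is low, tapering to a trickle near full."""
    CC_MA, TRICKLE_MA, CV_KNEE = 1000.0, 100.0, 0.8
    if state_of_charge < CV_KNEE:               # constant-current phase
        return CC_MA
    frac = (state_of_charge - CV_KNEE) / (1.0 - CV_KNEE)
    return CC_MA - frac * (CC_MA - TRICKLE_MA)  # taper in the CV phase

for soc in (0.2, 0.5, 0.85, 0.99):
    print(f"{soc:.0%} charged -> drawing {charge_current_ma(soc):.0f} mA")
```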
By the way, this is what I found on Amazon in about 2 minutes, a review from someone else who bought the headphones.
"For the price, sony should have included a wall charger. Instead they included a USB cable. The headphones will charge on any computer, but using a generic USB wall charger will show a blinking amber error light--but the headphones still charge."
Also, the link you posted to the Sony manual about charging gives absolutely no information at all.
You are overthinking it; just try it.
If you are really worried, why not get a regular cell-phone USB charger and use that - one that states something like 500mA on it and has a female USB socket. That way you can use your headphone USB cable and plug it into the 5V 500mA phone 'charger'.
One last thing: just don't use Apple 'chargers' to charge non-Apple products. Why, you ask? Because Apple does strange things with the pins on their 'chargers' in order to indicate the supply current, and this can sometimes lead to adverse effects for non-Apple devices.
Good luck.
zasy99 said:
E7ite: You should be fine; the fact that it charges via USB means that it must have its own internal charging circuitry. [...] Good luck.
Click to expand...
Click to collapse
Thank you for the reply.

[Q] PowerBank / External Battery Pack

Does anyone know if there is any way to use the Surface RT with an external battery pack?
I know there are Surface RT-compatible powerbanks on sale right now, but those are really expensive!
Afaik the Surface charges at 12V, whereas USB ports output 5V.
The only difference that I can tell is that the power output is different. This should just result in increased charging times, but should not have any safety issues at all.
If you aren't aware of what the difference between 12V and 5V means in electronics, you probably shouldn't be giving advice like "should not have any safety issues at all". I mean, you're probably right in this case (though only "probably"), but if you think that voltage = power output, it's only by sheer luck.
To answer your actual question: no, I don't know of one. However, I strongly advise against simply... experimenting with this sort of thing on your own. You could easily start an electrical fire, damage your tablet, or do one of many other unpleasant things.
The Surface uses a 7.4V battery pack. Attempting to connect a 5V supply without a step-up converter will not charge the Surface at all; instead it will likely damage your 5V supply as the Surface attempts to supply 7.4V back into it, which may damage the Surface too.
In electronics nothing is more important than the correct voltage. Take a Raspberry Pi; many people (including myself) use them for electronics projects. Both a friend and I added ultrasonic rangefinders to our Pis, except I noticed that the module in question uses 5V signalling and my friend did not. We both connected the +5V and ground lines to the corresponding points on our Pis, and we each connected one GPIO pin of the Pi to the trigger pin of the HC-SR04 (the module in question). The Pi uses 3.3V on its GPIO pins and the HC-SR04 is happy with a 2.7-5V trigger voltage, so that's fine. The echo pin, however, is +5V. I ran my echo pin through a voltage divider to give myself a 3.0V output instead, which is safe for the Pi (it is fine with 2.7-3.3V). My friend did not; he connected the echo pin directly to the Pi, tried to use the sensor for the first time and found he didn't get any results, and after a few tests lighting an LED he found he could no longer use that pin - he had damaged it.
I would be more concerned about the Surface damaging the supply than the other way around in this case.
However, you are correct in believing the device charges at 12V (2A or 4A depending on whether it's an RT or Pro charger). The Surface has the correct voltage regulation internally to charge the battery and supply 5V to USB. There are 4 wires within the charger. Red: +12V; black: ground; blue: charger detection; yellow: signal for the charger LED. I don't think it's known *exactly* how the yellow and blue wires function; the Surface does charge without them. Blue is rumored to be something to do with the RT charger identifying itself as a 2A device and telling the Pro not to draw more than 2A (it draws 4) and hence damage the charger, but exactly how it does that is unknown.
If the 12V supply can supply more than 4A, you're golden. A supply capable of higher currents than the device needs is actually perfectly safe: current is drawn by the device from the supply, the supply does not force current at the device. If a device requires 2A and the supply is capable of 10, the supply will still only give out 2, as that is all the device is drawing. A device drawing more than the supply can cope with, on the other hand, IS DANGEROUS.
Let's say you were to use a 5V-to-12V converter from a computer USB port to charge a Surface. A computer's USB port only supplies half an amp under normal conditions (and by shorting D+ and D- it can do 1.2). The Pro trying to draw 4A, combined with the inefficiency of 5V-12V step-up converters, will cause a current draw so large on the USB port that I would be surprised if damaging the USB port were the only thing to happen; you could easily fry the entire USB controller, and damaged USB controllers can then cause their own issues - with this amount of current, primarily the possibility of fire.
If you want to charge your Surface, find a voltage source *above* 12V (but not too far above) and use a voltage regulator to make a clean 12V. Then make sure both the regulator and battery can cope with the current. The regulator will determine how far above 12V you are safe to go: some might safely regulate 100V down to 12 (most likely a switch-mode regulator, which although highly efficient won't output a clean 12V - it will have a lot of voltage drops and spikes), whereas some might only do 20 to 12 (linear, much less efficient but a very clean 12V; stick a heatsink on it as they get warm). A car battery, by the way, is not 12V; it's 12.6V nominal, and as much as 14V fresh off the charger.
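To put a number on the step-up scenario above (the 85% converter efficiency is an assumption for illustration):

```python
OUT_V, OUT_A = 12.0, 2.0   # what the Surface RT charger supplies
IN_V = 5.0                 # USB port voltage
EFFICIENCY = 0.85          # assumed boost-converter efficiency

in_amps = (OUT_V * OUT_A) / (EFFICIENCY * IN_V)
print(f"Draw at the 5 V port: {in_amps:.1f} A")  # ~5.6 A vs. the ~0.5 A a port allows
```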
Yeah, I know the basics of electrical physics. Not a complete speculating noob.
P = IV
The power of a 5V, 2.1A supply is 10.5W.
The original power adapter supplies 12V at 2A = 24W.
My physics may be rusty, but I was under the impression that it would still charge at a lower power output, albeit half as fast. I've read that the RT, when using the 48W charger, still charges at the same rate as with the 24W one. This means that the current drawn by the Surface RT is capped at 2A. Hence, I believe there would not be any significant issues with drawing 2A from a charger that is designed to output 2A anyway, though at a lower voltage.
If I'm still wrong here, would love some clarification.
lambstone said:
Yeah, I know the basics of electrical physics. Not a complete speculating noob. [...] If I'm still wrong here, would love some clarification.
Click to expand...
Click to collapse
The wattage is not of concern; the voltage and current must be treated independently. Also, consider that you are trying to charge a 7.4V battery from a 5V supply: the current will not flow from the 5V to the 7.4V (assuming conventional current flow, not real electron flow); it will end up doing the opposite, 7.4V flowing towards the 5V.
So let's say you step up the 5V to 12V; there is inefficiency in doing that. To get 12V @ 2A you would not be drawing 2A at 5V, you would more likely be drawing 4 or 5A at 5V.
As 6677 says, the Voltage and current (and not the power) are what matters. Power (Wattage) is a derived value (and you correctly gave the derivation); it's handy for making comparisons when certain core assumptions are valid but meaningless on its own. In this case, the core assumptions are *not* valid: not all chargers are producing the same Voltage.
To take another example of where assuming all Watts are created equal would get you in trouble, consider an incandescent light bulb. Light bulbs are rated in Watts because incandescent bulbs operate by turning electrical power (Watts) into heat (also measurable in Watts) via resistive filaments that can only dissipate a limited amount of heat. Thus, a 60W bulb has a filament designed to dissipate 60W of heat.
However, since power is a derived value, let's look at what it actually means. In the USA (mains run at about 120V) we can use Ohm's Law: P=I*V and V=I*R, so I=V/R, so P=V*V/R=V^2/R, so R=V^2/P. Therefore, the resistance of a 60W bulb in the USA is (120^2)/60 = 2*120 = 240 Ohms. Cool.
Now, let's take that 60W bulb from the USA and connect it to European mains. It should still consume 60W of energy and produce the same amount of light, right? After all, it says "60W" right on the box! Hmm... but European mains run at 240V. The resistance doesn't change; it's a physical property of the filament. So, using P=V^2/R, we get P = 240*240/240 = 240W. That's four times as much as the bulb is meant to handle; it will flash very brightly (much brighter than normal) and burn out instantly! The bulb isn't defective, it just wasn't made to handle 240V. The "60W" on the box is based on an assumption of 120V; the bulb itself does nothing to ensure that it only consumes 60W.
While the basics of Ohm's Law will equip you to understand things like I describe above, it is *not* enough (by itself) to make assumptions about complex electronics. Even a 101-level electrical engineering course will make that abundantly clear; there are many more forces at play in the world of electronics than the familiar and relatively-easily-understood Voltage, current, resistance, and power.
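A two-line sanity check of that bulb arithmetic:

```python
P_RATED, V_US, V_EU = 60.0, 120.0, 240.0
R = V_US ** 2 / P_RATED   # 240-ohm filament, from R = V^2 / P
print(V_EU ** 2 / R)      # 240.0 W on European mains: four times the rating
```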
Thanks guys for clearing the air. Man, I thought it would have been possible to jerry-rig a Surface charging USB cable to be used with battery packs.
So from what I understand, to avoid unnecessary complications it would be best to find a way to supply 2A @ 12V. Obviously, finding a power supply that natively does so is most ideal. The alternative of stepping up the voltage from 5V to 12V would incur losses from conversion inefficiency, which basically rules it out.
12V Power Bank
I found a 12V 2.5A external battery pack, linked above. This seems like it would work, doesn't it? It has a 12V output with a max draw of 2.5A, and the Surface RT draws a maximum of 2A from the power source. I can simply splice the provided cables with the Surface charging cable, and it sounds to me like it'll work.
Since the Surface RT is reported to have a 31.5Wh battery, this 78Wh battery pack seems rather ideal.
If it is indeed 12V @ 2A or more, then yes, it should charge an RT.
Use a Power Gorilla. I use one for my MacBook Pro and other USB items.
Just saw a flyer for a Surface Pro/RT powerbank
Got an email with a flyer for a Surface + laptop powerbank. I haven't put in an order yet, but you can check it out at the qi-infinity website.
It seems they already include the Surface adapter, and it's on sale. I have a Lenovo X1 Carbon laptop and they have a proprietary connector for that one included as well, so I might go with them.

[Q] USB Cable Length and Charge Rate?

So I'm looking to purchase a slew of new chargers for myself and my wife - for home, the office, the car, etc... She has a Galaxy S4 and an iPad 3; I have a Galaxy S3 and a Galaxy Note 10.1 2014. I'm at least somewhat aware of the charger requirements and plan to supply at least 2 amps to each device in general.
My real confusion is about USB cable length. It seems most stock cables are roughly 3 ft long and, from what I can tell, going with a longer cable increases resistance, causes a voltage drop, and lowers the current/charge rate. Of course a thicker (lower-gauge) cable will help decrease this resistance, but I'm already planning to go with 24AWG cables.
So my question is: how long a USB cable can I use before it has a noticeable effect on my charge rate? If anything over 3 ft significantly slows it down, then I'll probably just use AC extension cables instead of longer USB cables when necessary. What cable length vs. charge rate do you find acceptable?
Thanks in advance!
I have a 10-foot cord that I got from mobstub and I noticed a significant increase in the time it takes to charge. I'm not sure how much it's degraded, but if there's an app or something that can show me that, I'll tell you the stats.
Sent from my SPH-L720 using xda premium
low voltage / low current
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
Good luck & have fun !!
Mark J Culross
KD5RXT
mjculross said:
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
KD5RXT
Click to expand...
Click to collapse
If your cables are thin, don't even expect to deliver 2000mA over them. The thickness is the key here.
bmather9 said:
So I'm looking to purchase a slew of new chargers for myself and my wife - for home, the office, the car, etc... [...] What cable length vs. charge rate do you find acceptable?
Click to expand...
Click to collapse
At the voltages, gauges, and distances we are talking about here, you are way overthinking this. It will certainly be fine at least up to the USB data-transmission cable limit of 5 meters.
http://forum.xda-developers.com/showthread.php?t=2545497
According to the results from that other thread, the wire length really seems to make a drastic difference.
I've heard quite a few people say that the voltage drop will be minimal, but I've also heard of people being unable to charge their iPad 3 with a 10 ft cable.
I'd really like to use longer cables, since they are generally more convenient for using the phone while plugged in, but the charge rate is also very important. So I'd like to get a feel for what I'd be sacrificing by using longer cables.
I'm certainly overthinking this; I'm an engineer... that's what I do.
Regardless, I'm planning to purchase quite a few cables and figured I should do so with some intelligence.
So with 2 votes so far for "1.5 ft or less", does that mean that people are really using even shorter cables to get better charge rates?
wire size vs voltage drop
OK, in place of my "gut feel" in my earlier post, here's just one website that allows you to enter the parameters of your cable (size in AWG gauge, voltage, length & current) and then calculates the theoretical voltage drop through your cable: http://www.powerstream.com/Wire_Size.htm (scroll down to the bottom to find the calculator). For example, according to their calculations, a three-foot cable using 24-gauge wires carrying 2A would impart a little over a 0.3VDC drop. If your charger is supplying 5VDC at the source, then only 4.7VDC would make it to your phone for charging at the full 2A rate. Contrast this with a ten-foot cable of the same size under the same conditions, which suffers a little more than a 1VDC drop, leaving only 4VDC available at your phone at the full 2A rate.
However, I would not expect the phone to continue to try to draw 2A of current under these conditions (particularly with the 10-foot cable), else charging may not take place at all if/when the voltage is too low. Instead, I would expect the charging circuit on the phone to diminish its current draw (to something LESS than 2A) in an attempt to keep the voltage closer to the desired 5VDC (or whatever the spec'd minimum is to charge the specific battery, assuming that the charger itself is putting out a nearly constant amount of power, somewhere near its rated number of watts).
It's very likely because of this reduction in current that your overall charging rate is reduced (or, to put it another way, your overall charging time is increased) on longer or thinner cables.
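Here's a quick script that reproduces those two figures (a sketch assuming plain copper and the standard AWG diameter formula, with the same caveats as any back-of-envelope estimate):

```python
import math

def drop_v(awg: int, length_ft: float, amps: float) -> float:
    d_m = 0.000127 * 92 ** ((36 - awg) / 39)    # AWG diameter formula
    area = math.pi * (d_m / 2) ** 2
    length_m = length_ft * 0.3048 * 2           # x2: out on VBUS, back on GND
    return amps * 1.72e-8 * length_m / area     # I * R, copper resistivity

for ft in (3, 10):
    print(f"24 AWG, {ft} ft @ 2 A: {drop_v(24, ft, 2.0):.2f} V drop")
# -> roughly 0.31 V and 1.02 V, in line with the calculator's figures
```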
YMMV . . .
Good luck & have fun !!
Mark J Culross
KD5RXT
mjculross said:
OK, in place of my "gut feel" in my earlier post, here's just one website that allows you to enter the parameters of your cable and then calculates the theoretical voltage drop through it. [...] It's very likely because of this reduction in current that your overall charging rate is reduced (or, to put it another way, your overall charging time is increased) on longer or thinner cables.
Click to expand...
Click to collapse
Thanks for that explanation. It seems that even a 6 ft USB cable will significantly slow charging, and a 10 ft one even more so, to the point that it may not charge at all sometimes. So it's looking like 3 ft USB cables with AC extensions where necessary is the way to go. Maybe I'll try some 1.5 ft ones as well, but I'm not sure how practical they will be for using the devices while plugged in, even with the AC extension.
If anyone has another opinion, please voice it.
