[Q] Charger Harmful? - HTC Sensation

I just got my hands on a Griffin AC Wall Charger (USB) for the iPad and iPhones/iPods. It outputs 2.1 amps. Would it be harmful to use the charging block on my HTC Sensation, since the charger that came with it is at 1 amp?

And the survey says....YES
Sent From My Pocket

Why do you say that? Why would it damage anything?

From what I've learned, more amps are okay, less is harmful.
If the charger can supply more current than needed, the device
will still only draw the 1A it's designed to take from the charger.
If the charger supplies less current than needed, it
constantly tries to deliver the 1A the device asks for, which it can't,
resulting in overheating and maybe destroying the charger
(and thus maybe damaging the device).
Are both 5V? I'd guess anything more than that could be harmful.
It's been a few days since I had this in school, so I can't guarantee correctness;
maybe some of the smarter guys/girls here could confirm or deny my statements just to be sure.

brndklng said:
From what I've learned, more amps are okay, less is harmful.
If the charger can supply more current than needed, the device
will still only draw the 1A it's designed to take from the charger.
If the charger supplies less current than needed, it
constantly tries to deliver the 1A the device asks for, which it can't,
resulting in overheating and maybe destroying the charger
(and thus maybe damaging the device).
Are both 5V? I'd guess anything more than that could be harmful.
It's been a few days since I had this in school, so I can't guarantee correctness;
maybe some of the smarter guys/girls here could confirm or deny my statements just to be sure.
You are correct.
The phone (which actually contains the charger) will pull as much current as it needs from the power supply (the thing we plug into the wall and incorrectly call a charger).
Part of the charging circuit should stop the phone from drawing more current than the PSU can supply, which would cause damage.
Voltage is the important one: it must be 5V DC.
Ideally you want a PSU that can provide more current than the device could possibly draw; that way it relieves the load on the battery, and the excess current can go towards charging the battery.

I figured it would draw too much and damage it. I had one of those iGo chargers rated at only 0.75 amps, and it made noise, so I stopped using it altogether. The chargers I'm getting are rated at 2.1 amps with a 5V DC output, so from what I read above it should be alright, since the phone limits its own draw to 1 amp.

The mA (or A) rating on the PSU is the maximum it can provide, not the amount it will actually pull.
The amount it pulls is down to the resistance of the device.
Think of your car's alternator, mine provides 95A.
My headlight bulbs are 55W, so at 14V they each draw only about 4A of that 95A.
Using the formula V=IR, rearranged to R=V/I we get 14/4=3.5 Ohms, so the resistance of each bulb is 3.5 Ohms.
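The headlight arithmetic above translates directly into a few lines of Python (same numbers the post quotes; they're illustrative values, not measurements):

```python
# Ohm's law check for the alternator/headlight example above.
voltage = 14.0      # V, typical car electrical system while charging
bulb_power = 55.0   # W, headlight bulb rating

current = bulb_power / voltage   # I = P / V
resistance = voltage / current   # R = V / I (Ohm's law, rearranged)

print(f"current    ~ {current:.2f} A")      # ~3.93 A, the "about 4 A" in the post
print(f"resistance ~ {resistance:.2f} ohm")  # ~3.56 ohms, close to the post's 3.5
```

Note the post rounds 3.93 A up to 4 A before computing 14/4 = 3.5 ohms; carrying full precision gives about 3.56 ohms instead.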

xaccers said:
The mA (or A) rating on the PSU is the maximum it can provide, not the amount it will actually pull.
The amount it pulls is down to the resistance of the device.
Think of your car's alternator, mine provides 95A.
My headlight bulbs are 55W, so at 14V they each draw only about 4A of that 95A.
Using the formula V=IR, rearranged to R=V/I we get 14/4=3.5 Ohms, so the resistance of each bulb is 3.5 Ohms.
+1
Good to see there are a few people here who know, and can apply, Ohm's Law.
Also, I have the Griffin charger that looks like a small cube. It works fine.
The noise you hear comes from the PSU circuit in the supply (SMPS = switch-mode power supply). It takes your house voltage/frequency and converts it, via high-frequency switching, into the low-voltage DC your phone's charging circuit uses. That's the noise, and it is normal, although bothersome!
Matt

mrg02d said:
+1
Good to see there are a few people here who know, and can apply, Ohm's Law.
Matt
I fear we may be a dying breed Matt

Related

Charger Compatibility - 550mA vs. 700mA

I have older chargers that output 5.0v / 550mA and have noticed that the Captivate's charger outputs 5.0v / 700mA.
1. Can I safely use the older, 550mA chargers with the Captivate and what will the effect be?
2. Can I safely use the Captivate's 700mA charger with the older phones and what will the effect be?
Thanks.
1. Yes, you can use it. But it will charge slower.
2. Yes, you can use it. I believe that just because the charger output is higher amperage doesn't mean that it will affect the phone adversely. Think of it this way: A lamp is plugged into the wall outlet at your house. That outlet is rated at 120v 15A. The bulb isn't using all 15 amps, so no problem. But if you were to turn the voltage up or down, the lamp will get brighter or dimmer respectively.
The Captivate can take up to a 1A (1000 mA) charger.
Truceda said:
1. Yes, you can use it. But it will charge slower.
2. Yes, you can use it. I believe that just because the charger output is higher amperage doesn't mean that it will affect the phone adversely. Think of it this way: A lamp is plugged into the wall outlet at your house. That outlet is rated at 120v 15A. The bulb isn't using all 15 amps, so no problem. But if you were to turn the voltage up or down, the lamp will get brighter or dimmer respectively.
The Captivate can take up to a 1A (1000 mA) charger.
Another question. You sound knowledgeable on this, so how about using the Nexus One car charger on this? It fits perfectly and I don't see why not, but I'm still leery... what do you think?
As long as it puts out 5v (which is the USB standard), you are fine. The amperage only matters if the device REQUIRES more than the charger can supply. For instance, if the device draws 1A and your charger can only handle 550mA, your device will charge very slowly.
On the other hand if your charger can handle 1.2A and your device only draws 700mA, then your charger will only output 700mA.
The important thing is the voltage, it needs to be 5v +/- 3% ...
I actually use a generic car charger I bought at walmart with 2 USB ports on it, and it works well for every USB powered device I own ... ZUNE, iPOD, phones etc.
OK. Thanks for the reply. I had read that a phone requiring 700mA that uses a 550mA charger could damage the charger and possibly the phone. That's what made me wonder. And that's what led to the question.
Let's make this a bit more interesting. There's a local, highly-reputable cell phone repair store that has stopped selling car chargers because, they say, the rapid charge is not good for the phone's battery. Their recommendation is to use an inverter (no, they don't sell them) so that you can then plug a standard wall charger into it or a USB cable if the inverter is so equipped. The AC current that results from utilizing the inverter is more consistent than the current flowing from a car charger. So...I purchased an inverter for less than $20 and use it to charge the Captivate in my car.
OK ... not sure we need to get this far down in the weeds on this but here goes ....
The USB2 standard for power distribution is 5v and the thresholds are 4.4-5.25V.
Power is supplied in units of 5v power ... 1 unit is 5v at 100mA, no device can draw more than 5 units from any one port. If you have ever seen a portable hard drive with 2 USB connectors it is because it requires more than 500mA to operate and by using 2 ports the device can draw up to 1A. For dedicated chargers the 4.4-5.25v still applies but shorting the D+/- and disabling the data connection allows the device to detect that it is connected to a dedicated charging port and draw a maximum of 1.8A.
In keeping with the above guidelines, when connected to your computer the Captivate can draw no more than 1 unit of power, which is 100mA@5V; when connected to a dedicated charger the phone can draw 1.8A@5V and stay within the standard. (Yes, it caps itself at 1A, I know.)
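The unit bookkeeping described above can be sketched in Python (constants taken from the post's reading of the USB2 spec; `port_budget_ma` is a made-up name, purely for illustration):

```python
# USB2 power budgeting as described above: a 5 V bus dealt out in 100 mA "units".
UNIT_MA = 100             # one unit = 100 mA at 5 V
MAX_UNITS_PER_PORT = 5    # no device may draw more than 5 units from one port
DEDICATED_MAX_MA = 1800   # dedicated charging port (D+/D- shorted) ceiling

def port_budget_ma(units_requested):
    """Current available when a device negotiates some number of units."""
    return min(units_requested, MAX_UNITS_PER_PORT) * UNIT_MA

print(port_budget_ma(1))   # 100 -> the single default unit
print(port_budget_ma(5))   # 500 -> full budget of one port
print(port_budget_ma(10))  # 500 -> still capped at 5 units per port
```

The dual-plug portable hard drive mentioned above is simply pooling two ports' budgets to get up to 1A.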
OK ... the next bit is going to be hard to digest because there are plenty of examples to the contrary ... there is a standard for mobile USB chargers, and it requires wiring them as dedicated charging ports. What this means to us is that, in theory anyway, a mobile USB charger should allow a device to draw up to 1.8A from it (highly unlikely ... but that's the standard as written).
Here is the problem: if the device is plugged into a dedicated charging port and tries to draw its maximum rated current, that amount of current may not always be available, or it may fluctuate. This fluctuation is what causes problems. Have you ever turned your car stereo up real loud and seen your headlights dim in time with the music? Same thing: the power system is being drawn down. There are a couple of ways to stabilize your power system ... install a large capacitor (mine is 2 Farad) to provide "conditioning", or go the transformer route. A transformer provides conditioning, but only on its own outputs, while a large cap will condition the entire power system if installed correctly.
So yes, using a quick charger on your phone can cause issues if your car has a ****ty power system or a large stereo system that is not set up properly (again, ****ty power system). Make sure your charging device is within the standard and you should be fine, whether it is USB via a cigarette lighter port or a 110V transformer.
I appreciate your detailed, helpful reply.
Other than the important 5v parameter, what I've taken away from your information is that a car charger can be used in a vehicle with a power supply that is known to be stable, and that either a whole-car conditioning system or an inverter should be used on one with a, shall we say, "less than stable" power supply (PG version ).
Jack45 said:
I appreciate your detailed, helpful reply.
Other than the important 5v parameter, what I've taken away from your information is that a car charger can be used in a vehicle with a power supply that is known to be stable, and that either a whole-car conditioning system or an inverter should be used on one with a, shall we say, "less than stable" power supply (PG version ).
Pretty much ...
There's more than one way to skin a cat, and that transformer will cost you a whole lot less than a 2 Farad capacitor. I hate cats, but they serve a purpose.
@Battlehymn - one more question for you
I have converted from iPhone to the Captivate (no haters please) - finally found a physical form factor with specs I like, and the Captivate rocks.
Anyway, I have some spare external batteries I used with my iPhone that I want to use with my Captivate. I just bought a female iPod connector and I am planning to connect it to a micro USB connector - the pinout is straightforward, but here is my question:
Should I connect the D+/- (short them together)? That is my plan. My batteries are 1900 or 1000 mAh - I assume that even if the phone tries to draw 1.8A, the batteries have a circuit that limits how fast they can discharge.

[Q] 1.2 amp charger work on gtablet

I have a 12 volt 1.2 amp charger that fits into the DC in Jack. Can I use this to charge the Gtablet in an emergency (if I lost or broke my original charger) without damaging my Gtab? Want some expert opinions before I make the attempt. Rather be safe than sorry. Thanks in advance.
I'm no expert, but a little math tells you that the charger you have will only output 14.4 watts, since VxA=W (12Vx1.2A=14.4W). Since the G-Tab is rated at 12V 2A, or 24W, you will be under-powering it. If I remember correctly, a 24V 1A supply would work just fine, as the output is still 24W. A 12V 3A charger would work just as well, as it is designed to feed up to 36W; in that case it would only feed the 24W that the G-Tab pulls.
In the end it would still power the device, but the battery charging would come to a crawl, as the available output is 60% of the rated charger's.
I'd be cautious though, as your 1.2A charger may get really hot trying to feed the G-Tab.
I'm sure that someone else can explain this more elegantly than I just did.
Given the lack of voltage it may not work at all. You need enough voltage to overcome any resistance in the circuit or the result is a lot of nothing, or just heat. Been awhile since electronics school though.
Sent from the awesome ZTab
Phantom_Midge said:
I'm no expert, but a little math tells you that the charger that you have will only output 14.4 watts VxA=W (12Vx1.2A=14.4W). Since the G-Tab is rated at 12V,2A or 24W you will be under powering it. If I remember correctly a 24V1A supply would work just fine as the output is still 24W. A 12V3A charger would work just as well, as it is designed to feed up to 36W. In this case it would only feed the 24W that the G-Tab pulls.
In the end it would still power the device, but the battery charging would come to a crawl, as the available output is 60% of the rated charger's.
I'd be cautious though, as your 1.2A charger may get really hot trying to feed the G-Tab.
I'm sure that someone else can explain this more elegantly than I just did.
I wouldn't be charging while using the Gtab. It would be powered off, so there would be no drain and it should only be charging. In my layman's thinking, it would be a safe charge, just at a slower rate than with a 2 amp charger. If it won't damage my Gtab, I'll try it with it powered off. Is this a safe go?
I have one from a tv. It does not work. It makes the charging indicator flicker...
Phantom_Midge said:
I'm no expert,
Given the nature of electricity, always a good idea to know what you're talking about before you recommend a course of action, since improper advice can result in death and destruction of equipment. A two-lecture treatment of electricity in physics class that basically boils down to V=IR and P=VI is not sufficient for giving advice about things like this.
Phantom_Midge said:
but a little math tells you that the charger that you have will only output 14.4 watts VxA=W (12Vx1.2A=14.4W). Since the G-Tab is rated at 12V,2A or 24W you will be under powering it. If I remember correctly a 24V1A supply would work just fine as the output is still 24W. A 12V3A charger would work just as well, as it is designed to feed up to 36W. In this case it would only feed the 24W that the G-Tab pulls.
In the end it would still power the device, but the battery charging would come to a crawl, as the available output is 60% of the rated charger's.
I'd be cautious though, as your 1.2A charger may get really hot trying to feed the G-Tab.
I'm sure that someone else can explain this more elegantly than I just did.
Please do not plug a 24V charger into a device that is supposed to take 12V. Bad things are likely to happen. At best, the GTab has internal protection circuitry to shut things off and prevent internal damage. If not, you're going to ruin the battery or cause things to blow up. It's like plugging your 120V hair dryer into a 240V outlet in Europe. Much smoke to follow.
Power (wattage) is not the relevant issue here. There are actually quite a few things that matter, but for the purposes of this discussion, voltage is the most important one. A 12V 1.2A charger *may* not be able to supply enough power to both run and charge the GTab, but it is unlikely to cause problems. I don't know the internals of the GTab specifically, but my guess is that it regulates the charging current, meaning you could also probably get away with a 12V 3A charger, though if the internal circuitry of the GTab is poorly designed, that might shorten your battery life.
As to your specific concern about heat on the 1.2A charger, heat production is a function both of the efficiency of the particular charger and the amount of power it produces. It's likely the 1.2A charger wouldn't get as hot as the 2A charger, given that its power output is 60% as much, unless it was significantly less efficient. Assuming, of course, that both were operating at maximum power.
ByByIpad said:
I have a 12 volt 1.2 amp charger that fits into the DC in Jack. Can I use this to charge the Gtablet in an emergency (if I lost or broke my original charger) without damaging my Gtab? Want some expert opinions before I make the attempt. Rather be safe than sorry. Thanks in advance.
Bottom Line, without all the Hype and Hyperbole....
Yeah, it'll work and destroy nothing, but will take longer to fully charge the battery.
Do not use the tab while it's connected.
ByByIpad said:
I have a 12 volt 1.2 amp charger that fits into the DC in Jack. Can I use this to charge the Gtablet in an emergency (if I lost or broke my original charger) without damaging my Gtab? Want some expert opinions before I make the attempt. Rather be safe than sorry. Thanks in advance.
I wouldn't use it. The GTab expects 2 amps to be available, and will not know to throttle the charge back to 1.2. You risk overheating or otherwise damaging the charger, which could then turn around and damage your GTab, depending on the charger's failure mode.
12v and at least 2 amp is what you need. More than 2amp is fine.
Jim
jmdearras said:
I wouldn't use it. The GTab expects 2 amps to be available, and will not know to throttle the charge back to 1.2. You risk overheating or otherwise damaging the charger, which could then turn around and damage your GTab, depending on the charger's failure mode.
12v and at least 2 amp is what you need. More than 2amp is fine.
Jim
Yes.
Lithium Polymer batteries require a CC/CV charging scheme. This means "Constant Current" followed by "Constant Voltage", one after the other. Without knowing how the charging circuit is set up, you COULD damage the battery or the charging circuit by not supplying the appropriate power to the circuit.
To summarize.
1.) NEVER use a higher voltage supply, unless you are 100% sure the internal power supplies can handle that voltage. 24VDC = most likely really bad
2.) NEVER use a power supply that is under the rated current of the device. For good quality external AC/DC bricks, it's usually not a concern. For cheaply made, haphazardly designed bricks (mostly from China), this can cause fire.
3.) You can ALWAYS use a brick that is rated for MORE current. That's available current, not "I'm going to shove this current down your throat". Typically, bricks rated for a higher current will supply the required current at a higher efficiency anyway, so provided you don't care about the size increase, it's better. They also tend to supply cleaner power (voltage, really), which is always a good thing.
That concludes today's electronics seminar. Class dismissed.
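Those three rules can be condensed into a small hedged sketch (the function name `psu_ok` and the encoding are mine, purely for illustration; it compares nameplate ratings and knows nothing about build quality):

```python
# The three rules above, encoded as a rating check (illustrative only).
def psu_ok(psu_volts, psu_amps, device_volts, device_amps):
    if psu_volts != device_volts:
        return False   # rule 1: never a different voltage
    if psu_amps < device_amps:
        return False   # rule 2: never under the device's rated current
    return True        # rule 3: extra current capacity is always fine

print(psu_ok(12, 1.2, 12, 2.0))   # False - the 1.2 A charger vs. the 2 A G-Tab
print(psu_ok(12, 3.0, 12, 2.0))   # True  - headroom is fine
print(psu_ok(24, 1.0, 12, 2.0))   # False - wrong voltage, whatever the watts
```

The third call is exactly the 24V 1A suggestion shot down earlier in the thread: equal wattage does not make a supply safe.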

Question about chargers

I have an HTC Explorer, which came with a USB cable and an adapter to allow me to charge via mains as well. (Something like this: http://i.expansys.com/i/g/g232723.jpg - that's not the exact same one, but you get the idea). This charger says on it that it has an output of 5V and 1A. I also have a Blackberry Playbook, which came with a charger with a micro USB connector. However, this charger says on it that it has an output of 5V and 2A.
I have tested the HTC charger with my playbook, and it worked, but the question I want to ask is this: would it be dangerous to try the playbook charger on my HTC phone? (2A output from Playbook charger versus the presumably expected 1A input on the phone) I would like to be able to only carry one charger when travelling and the playbook charger has interchangeable adapters for international plugs, so it would be better to take, but obviously I don't want to overload the phone and have it burst into flames or whatever.
Any advice on the matter would be great, thanks.
It's not recommended by the manufacturer. It can void your warranty, though I don't know how they'd ever find out you were doing so. There can also be issues when using the cable to transfer data and such.
I did watch a video about that Blackberry charger.
Supposedly because of that 2A output, it charges your phone twice as fast.
Sent from my Ainol Novo7 Elf using xda premium
I don't think there would be considerable damage, but... I'd just use them both, even if it's less convenient.
I looked around and I think it should be safe. From what I've read online, the phone will only draw a certain amount of current, regardless of the current being made available by the charger. Because the voltage is the same (5V), and the resistance of the phone circuitry is roughly constant, by Ohm's law I = V/R, so the current will always be the same as long as the voltage is. Presumably it'll drop with a weaker charger, but the maximum current is always the same, and the phone limits that to a safe level.
It might take a slightly higher current (say, if the max the phone can take is 1.2A or something, the 1A charger can only give 1A but the 2A one would give the full 1.2A), but no higher than the circuits in the phone will allow. After reading this, I realised that it made a lot of sense, and I think it's right. I also read that if it does charge the phone faster (which it will if the phone is drawing a higher current, e.g. 1.2A), it'll reduce the number of charge cycles you get out of the battery. But the consensus seems to be that doing it every now and again when travelling etc. should be fine.
Thanks for all your responses.
EDIT: I think at worst I might damage the battery, and batteries aren't overly expensive to replace, I don't think.
I agree; the rating on the charger is its MAX output, not how much current it will push to your device. If you have a spare USB cable (that you don't mind cutting apart) and a multimeter, you can check how much current your phone is pulling from the BB Playbook charger: connect your multimeter in-line with the red wire in the USB cable (and reconnect all the others).
Just make sure your multimeter is rated for at least 2A.
Hope this helps.
Devices with lithium batteries usually have a charger circuit that limits the peak current sent to the battery. They also use temperature compensation, so if you are charging the battery too fast, it starts limiting the charge current.
Where you might have a problem is when it's plugged in and you are using it, especially if the battery is low. You get high charge currents, combined with the operating current.

[Q] PowerBank /External Battery Pack

Does anyone know if there is any way to use the Surface RT with an external battery pack?
I know there are surface RT compatible powerbanks on sale right now but those are really expensive!
Afaik, the Surface charges at 12V, whereas USB ports output 5V.
The only difference that I can tell is that the power output is different. This should just result in increased charging times and should not cause any safety issues at all.
If you aren't aware what a difference between 12v and 5v means in electronics, you probably shouldn't be giving advice like "should not have any safety issues at all". I mean, you're probably right in this case (though only "probably") but if you think that voltage = power output, it's only by sheer luck.
To answer your actual question, I am not. However, I strongly advise against simply... experimenting with this sort of thing on your own. You could easily start an electrical fire, damage your tablet, or do one of many other unpleasant things.
The Surface uses a 7.4V battery pack. Attempting to charge from a 5v supply without a step-up converter will not charge the Surface at all; instead the Surface would push 7.4v back towards the supply, likely damaging the 5v supply and possibly the Surface too.
In electronics, nothing is more important than the correct voltage. Take a Raspberry Pi: many people (including myself) use them for electronics projects. A friend and I both added ultrasonic rangefinders to our Pis, except I noticed that the module in question uses 5v signalling and my friend did not. We both connected the +5v and ground lines to the corresponding points on our Pis, and each connected one GPIO pin of the Pi to the trigger pin of the HC-SR04 (the module in question). The Pi uses 3.3V on its GPIO pins, and the HC-SR04 is safe with a 2.7v-5v trigger voltage, so that's fine. The echo pin, however, outputs +5v. I ran my echo pin through a voltage divider to get a 3.0v output instead, which is safe for the Pi (it is fine with 2.7-3.3). My friend did not; he connected the echo pin directly to the Pi, tried to use the sensor for the first time, found he didn't get any results, did a few tests lighting an LED, and found he could no longer use that pin. He had damaged it.
I would be more concerned about the surface damaging the supply than the other way around in this case.
However, you are correct that the device charges at 12V (2A or 4A depending on whether it's an RT or Pro charger). The Surface has the correct voltage regulation internally to charge the battery and supply 5v to USB. There are 4 wires within the charger. Red: +12V, Black: ground, Blue: charger detection, Yellow: signal for the charger LED. I don't think it's known *exactly* how the yellow and blue wires function; the Surface does charge without them. Blue is rumoured to be how the RT charger identifies itself as a 2A device, telling the Pro not to draw more than 2A (it normally draws 4) and damage the charger, but exactly how it does that is unknown.
If the 12V supply can supply more than 4A, you're golden. A supply capable of higher currents than the device needs is actually perfectly safe: current is drawn by the device from the supply; the supply does not force the current into the device. If a device requires 2A and the supply is capable of 10, the supply will still only give out 2, as that is all the device is drawing. A device drawing more than the supply can cope with, on the other hand, IS DANGEROUS.
Let's say you were to use a 5v-to-12v converter from a computer USB port to charge a Surface. A computer's USB port only supplies half an amp under normal conditions (and by shorting D+ and D- it can do 1.2). The Pro trying to draw 4A, combined with the inefficiency of 5v-12v step-up converters, would cause a current draw so large on the USB port that I would be surprised if damaging the USB port were the only thing to happen. You could easily fry the entire USB controller, and a damaged USB controller can then cause its own issues; with this amount of current, primarily the possibility of fire.
If you want to charge your Surface, find a voltage source *above* 12V (but not too far above) and use a voltage regulator to make a clean 12V. Then make sure both the regulator and the battery can cope with the current. The regulator determines how far above 12V you can safely go: some might regulate 100V down to 12 (most likely a switch-mode regulator, which although highly efficient won't output a clean 12V; it will have a lot of voltage drops and spikes), whereas some might only do 20 down to 12 (linear: much less efficient but a very clean 12v; stick a heatsink on it, as they get warm). A car battery, by the way, is not 12V; it's 12.6 nominal, and fresh off the charger as much as 14.
Yeah, I know the basics of electrical physics. Not a complete speculating noob.
P=IV
Power of a 5V 2.1A supply is 10.5W.
The original power adapter supplies 12V at 2A = 24W.
My physics may be rusty, but I was under the impression that it would still charge at the lower power output, albeit half as fast. I've read that the RT, when using the 48W charger, still charges at the same rate as with the 24W one. This means the current drawn by the Surface RT is capped at 2A. Hence, I believe there wouldn't be any significant issues with drawing 2A from a charger that is designed to output 2A anyway, though at a lower voltage.
If I'm still wrong here, would love some clarification.
lambstone said:
Yeah, I know the basics of electrical physics. Not a complete speculating noob.
P=IV
Power of a 5V 2.1A is 10.5w
The original power adapter supplies 12V at 2A = 24w
My physics may be rusty, but I was under the impression that it would still charge at the lower power output, albeit half as fast. I've read that the RT, when using the 48W charger, still charges at the same rate as with the 24W one. This means the current drawn by the Surface RT is capped at 2A. Hence, I believe there wouldn't be any significant issues with drawing 2A from a charger that is designed to output 2A anyway, though at a lower voltage.
If I'm still wrong here, would love some clarification.
The wattage is not the concern; the voltage and current must be treated independently. Also, consider that you are trying to charge a 7.4v battery from a 5v supply: current will not flow from the 5v side to the 7.4v side (assume conventional current flow, not real electron flow); it will end up doing the opposite, with the 7.4v side pushing towards the 5v side.
So let's say you step up the 5v to 12v; there is inefficiency in doing that. To get 12V @ 2A you would not be drawing 2A at 5v; you would more likely be drawing 4 or 5A at 5v.
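That inefficiency argument is easy to put numbers on. A minimal sketch, assuming an 80% efficient boost converter (a ballpark guess, not a measured figure):

```python
# Input-side current needed to deliver 12 V @ 2 A through a boost converter.
def input_current_a(v_in, v_out, i_out, efficiency):
    p_out = v_out * i_out        # power delivered to the load (24 W here)
    p_in = p_out / efficiency    # converter losses inflate the input power
    return p_in / v_in           # I = P / V on the input side

i_in = input_current_a(v_in=5.0, v_out=12.0, i_out=2.0, efficiency=0.80)
print(f"{i_in:.1f} A")   # 6.0 A - far beyond any USB port's budget
```

Even at a generous efficiency, the 5 V side has to carry several times the output current, which is why the USB-port idea fails.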
As 6677 says, the Voltage and current (and not the power) are what matters. Power (Wattage) is a derived value (and you correctly gave the derivation); it's handy for making comparisons when certain core assumptions are valid but meaningless on its own. In this case, the core assumptions are *not* valid: not all chargers are producing the same Voltage.
To take another example of where assuming all Watts are created equal would get you in trouble, consider an incandescent light bulb. Light bulbs are rated in Watts because incandescent bulbs operate by turning electrical power (Watts) into heat (also measurable in Watts) via resistive filaments that can only dissipate a limited amount of heat. Thus, a 60W bulb has a filament designed to dissipate 60W of heat. However, since power is a derived value, let's look at what it actually means. In the USA (mains run at about 120V) we can use Ohm's Law: P=I*V and V=I*R, so I=V/R, and therefore P=V*V/R=V^2/R, giving R=V^2/P. The resistance of a 60W bulb in the USA is therefore (120^2)/60=240 Ohms. Cool. Now, let's take that 60W bulb from the USA and connect it to European mains. It should still consume 60W of energy and produce the same amount of light, right? After all, it says "60W" right on the box! Hmm... but European mains run at 240V. The resistance doesn't change; it's a physical property of the filament. So, using P=V^2/R, we get P=240*240/240=240W. That's four times as much as the bulb is meant to handle; it will flash very brightly (much brighter than normal) and burn out instantly! The bulb isn't defective; it just wasn't made to handle 240V. The "60W" on the box is based on an assumption of 120V; the bulb itself does nothing to ensure that it only consumes 60W.
While the basics of Ohm's Law will equip you to understand things like I describe above, it is *not* enough (by itself) to make assumptions about complex electronics. Even a 101-level electrical engineering course will make that abundantly clear; there are many more forces at play in the world of electronics than the familiar and relatively-easily-understood Voltage, current, resistance, and power.
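The bulb arithmetic above, run as code (same numbers as the post; this keeps the post's idealized model and ignores that a real filament's resistance rises with temperature):

```python
# 60 W bulb designed for 120 V mains, connected to 240 V.
rated_power = 60.0     # W, nameplate rating
design_volts = 120.0   # V, the mains voltage the rating assumes

resistance = design_volts ** 2 / rated_power   # R = V^2 / P
power_at_240 = 240.0 ** 2 / resistance         # P = V^2 / R at European mains

print(f"resistance    = {resistance:.0f} ohms")   # 240 ohms
print(f"power @ 240 V = {power_at_240:.0f} W")    # 240 W, 4x the rating
```

Doubling the voltage quadruples the dissipated power, since power scales with the square of voltage for a fixed resistance.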
Thanks guys for clearing the air. Man, I thought it would have been possible to jerry-rig a Surface charging USB cable for use with battery packs.
So, from what I understand, to avoid unnecessary complications it would be best to find a way to supply 2A @ 12V. Obviously, finding a power supply that natively does so is most ideal. There is the alternative of stepping up the voltage from 5V to 12V, but that will lead to losses from conversion inefficiency, which basically rules it out.
12V Power Bank
I found a 12V 2.5A external battery pack listed above. This seems to work doesn't it? It has a 12V output with a max draw of 2.5A. The Surface RT draws a maximum of 2A from your power source. I can simply splice the provided cables with the Surface Charging Cable and it sounds to me it'll work.
Since the Surface RT is reported to have a 31.5Wh battery, this pack's 78Wh capacity makes it rather ideal.
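As a rough sanity check on those capacities, here's a sketch of how many full charges the pack could deliver. The 85% conversion efficiency is an assumption for illustration, not a measured figure:

```python
# Assumed figures from the posts above: 78 Wh pack, 31.5 Wh Surface RT
# battery. The 85% charging efficiency is a guess, not a measurement.
PACK_WH, BATTERY_WH, EFFICIENCY = 78.0, 31.5, 0.85

full_charges = PACK_WH * EFFICIENCY / BATTERY_WH
print(round(full_charges, 1))  # ~2.1 full charges per pack
```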
If it is indeed 12V @ 2A or more, then yes, it should charge an RT.
Use a Power Gorilla. I use one for my MacBook Pro and other USB items.
Just saw a flyer for a Surface Pro/RT power bank
Got an email with a flyer for a Surface + laptop power bank. I haven't placed an order yet, but you can check it out at -> qi-infinity website
It seems they already include a Surface adapter, and it's on sale. I have a Lenovo X1 Carbon laptop, and they include a proprietary connector for that one as well. So I might go with them.

[Q] USB Cable Length and Charge Rate?

So I'm looking to purchase a slew of new chargers for myself and my wife for at home, the office, the car, etc... She has a Galaxy S4 and iPad 3; I have a Galaxy S3 and Galaxy Note 10.1 2014. I'm at least somewhat aware of the charger requirements and plan to supply at least 2 amps to each device in general.
My real confusion is over USB cable length. It seems most stock cables are roughly 3 ft long, and from what I can tell, going with a longer cable increases resistance, causes a voltage drop, and lowers the current/charge rate. Of course, a thicker (lower-gauge) cable will help decrease this resistance, but I'm already planning to go with 24AWG cables.
So my question is: how long of a USB cable can I use before it has a noticeable effect on my charge rate? If anything over 3 ft significantly slows it down, then I'll probably just use AC extension cords instead of longer USB cables when necessary. What cable length vs. charge rate do you find acceptable?
Thanks in advance!
I have a 10-foot cord that I got from mobstub, and I noticed a significant increase in the time it takes to charge. I'm not sure how degraded it is, but if there's an app or something that can show me, I'll tell you the stats.
Sent from my SPH-L720 using xda premium
low voltage / low current
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
Good luck & have fun !!
Mark J Culross
KD5RXT
mjculross said:
I don't have any empirical data to back it up, but at 5VDC & 2A or less (typically), I would not expect much of a drop in charge voltage/current due to length (10 ft or less). That's just a gut feel from building lots of cables & electronic assemblies for the fun of it !!
KD5RXT
If your cables are thin, don't expect them to deliver 2000mA at all. The thickness is the key here.
bmather9 said:
So I'm looking to purchase a slew of new chargers for myself and my wife for at home, office in the car, etc... She has a Galaxy S4 and Ipad3; I have Galaxy S3 and Galaxy Note 10.1 2014. I'm at least somewhat aware of the charger requirements and plan to supply at least 2 amps to each device in general.
My real confusion is over USB cable length. It seems most stock cables are roughly 3 ft long, and from what I can tell, going with a longer cable increases resistance, causes a voltage drop, and lowers the current/charge rate. Of course, a thicker (lower-gauge) cable will help decrease this resistance, but I'm already planning to go with 24AWG cables.
So my question is, how long of a USB cable can I use before it has a noticable effect on my charge rate? If anything over 3ft significantly slows it down, then I'll probably just use AC extension cables instead of longer USB cables when necessary. What cable length vs charge rate do you find acceptable?
Thanks in advance!
At the voltages, gauges, and distances we are talking about here, you are way overthinking this. It will certainly be fine at least up to the USB data-transmission cable limit of 5 meters.
http://forum.xda-developers.com/showthread.php?t=2545497
According to the results from that other thread, wire length really does seem to make a drastic difference.
I've heard quite a few people say that voltage drop will be minimal, but I've also heard of people being unable to charge their iPad 3 with a 10 ft cable.
I'd really like to use longer cables since they are generally more convenient for using the phone while plugged in, but the charge rate is also very important. So I'd like to get a feel for what I'd be sacrificing by using longer cables.
I'm certainly overthinking this; I'm an engineer...that's what I do
Regardless, I'm planning to purchase quite a few cables and figured I should do so with some intelligence.
So with 2 votes so far for "1.5 ft or less" does that mean that people are really using even shorter cables to get better charge rates?
wire size vs voltage drop
OK, in place of the "gut feel" from my earlier post, here's just one website that lets you enter the parameters of your cable (size in AWG, voltage, length & current) and then calculates the theoretical voltage drop through it: http://www.powerstream.com/Wire_Size.htm (scroll down to the bottom to find the calculator). For example, according to their calculations, a three-foot cable using 24-gauge wires carrying 2A would impart a little over a 0.3VDC drop. If your charger supplies 5VDC at the source, then only 4.7VDC would make it to your phone for charging at the full 2A rate. Contrast this with a ten-foot cable of the same size under the same conditions, which suffers a little more than a 1VDC drop, leaving only 4VDC available at your phone at the full 2A rate.
However, I would not expect the phone to continue to try to draw 2A of current under these conditions (particularly the 10 foot cable), else charging may not take place at all if/when the voltage is too low. Instead, I would expect that the charging circuit on the phone would diminish its current draw (to something LESS than 2A) in an attempt to keep the voltage closer to the desired 5VDC (or whatever the spec'd minimum is to charge the specific battery, assuming that the charger itself is putting out a nearly constant amount of power, somewhere near to its rated number of watts).
It's very likely because of this reduction in current that your overall charging rate is reduced (or to put it another way, your overall charging time is increased) on lesser size cables, etc.
YMMV . . .
Good luck & have fun !!
Mark J Culross
KD5RXT
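The estimate above can be sketched using the published resistance of 24 AWG copper (about 25.67 ohms per 1000 ft); the factor of two accounts for the round trip through the power and ground conductors. This is a rough model under those assumptions, not an exact figure for any particular cable:

```python
# Voltage drop across a 24 AWG USB cable, per the calculation above.
R_PER_FT_24AWG = 25.67 / 1000.0   # ohms per foot of 24 AWG copper

def drop_volts(cable_ft, amps):
    """Drop over both conductors (2x the cable length) at a given current."""
    return 2 * cable_ft * R_PER_FT_24AWG * amps

print(round(drop_volts(3, 2.0), 2))   # 0.31 V -> ~4.7 V left at the phone
print(round(drop_volts(10, 2.0), 2))  # 1.03 V -> ~4.0 V left at the phone
```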
mjculross said:
OK, in place of the "gut feel" from my earlier post, here's just one website that lets you enter the parameters of your cable (size in AWG, voltage, length & current) and then calculates the theoretical voltage drop through it (scroll down to the bottom of the page to find the calculator). For example, according to their calculations, a three-foot cable using 24-gauge wires carrying 2A would impart a little over a 0.3VDC drop. If your charger supplies 5VDC at the source, then only 4.7VDC would make it to your phone for charging at the full 2A rate. Contrast this with a ten-foot cable of the same size under the same conditions, which suffers a little more than a 1VDC drop, leaving only 4VDC available at your phone at the full 2A rate.
However, I would not expect the phone to continue to try to draw 2A of current under these conditions (particularly the 10 foot cable), else charging may not take place at all if/when the voltage is too low. Instead, I would expect that the charging circuit on the phone would diminish its current draw (to something LESS than 2A) in an attempt to keep the voltage closer to the desired 5VDC (or whatever the spec'd minimum is to charge the specific battery, assuming that the charger itself is putting out a nearly constant amount of power, somewhere near to its rated number of watts).
It's very likely because of this reduction in current that your overall charging rate is reduced (or to put it another way, your overall charging time is increased) on lesser size cables, etc.
YMMV . . .
Good luck & have fun !!
Mark J Culross
KD5RXT
Thanks for that explanation. It seems that even a 6 ft USB cable will significantly slow charging, and a 10 ft one even more so, to the point that it may not charge at all sometimes. So it's looking like 3 ft USB cables with AC extensions where necessary is the way to go. Maybe I'll try some 1.5 ft cables as well, but I'm not sure how practical they'll be for using the devices while plugged in, even with the AC extension.
If anyone has another opinion please voice it.
