Is it OK to use a Samsung Galaxy S4 wall charger with the included Moto G USB cable to charge the phone? Obviously, the G doesn't come with a wall adapter, so I was wondering if this will be OK - i.e. same voltage, etc. - and not fry the battery.
Thanks
terrapin69 said:
with the included Moto G USB cable to charge the phone? …
Click to expand...
Click to collapse
Once I read that most new batteries (from the last four years or so) can use chargers with higher amperage without being damaged, so I think you can. I think that's why they don't include a charger: most people already have one at home.
Fast charging can reduce long-term battery capacity
elestudiante said:
Once I read that most new batteries (from the last four years or so) can use chargers with higher amperage without being damaged, so I think you can. I think that's why they don't include a charger: most people already have one at home.
Click to expand...
Click to collapse
My response to the above answer reflects what I understand as a layperson who has recently researched which power adapters are OK to use with the Moto G. I'll be happy to stand corrected by anyone with contradictory authoritative information.
My understanding is based on the Battery University website entries on lithium ion batteries:
batteryuniversity.com/learn/article/charging_lithium_ion_batteries
batteryuniversity.com/learn/article/ultra_fast_chargers
Although the above quoted answer on using the Galaxy S4 charger with the Moto G is mainly correct (in that no direct damage to the phone circuitry or immediate damage to the battery will occur from using the higher-amperage charger), there will still be a long-term negative effect on battery capacity from using a higher-amperage charger.
I believe the S4 comes with a 2A charger. According to Motorola online support website
( motorola-global-portal.custhelp.com/app/answers/prod_answer_detail/a_id/97318/p/30,6720,9050/action/auth )
the Moto G will automatically restrict charging above 1500mA. So that would mean that a 2A charger would cause the Moto G to charge at the 1500mA rate. No damage would be done to the phone circuitry charging at the allowable rate of 1500mA, but the question remains whether there would be a long-term reduction of battery capacity from charging at 1500mA for a year or more. The official Motorola charger sold online is now 1200mA. So the comparison should be between charging at 1200mA versus 1500mA.
According to the Battery University website, the optimal range to charge lithium-ion batteries is between .5C and .7C. Lower charging rates result in less reduction of battery capacity over time. The C-rate unit measures charging and discharging rates relative to capacity: 1C equals the battery's rated capacity expressed as a current. So, for the Moto G, 1C is equal to 2070mA. Therefore, according to this recommendation, the optimal charging range for the Moto G would be between .5 x 2070 = 1035mA and .7 x 2070 = 1449mA. So, charging at the Moto G's maximum of 1500mA would be just barely outside the optimal range.
But extrapolation from Figure 1 in the Battery University 'Fast and Ultra-Fast Chargers' article indicates that there would be an additional 9% reduction of battery capacity by charging at a 300mA higher rate of 1500mA over the official charger rate of 1200mA. This additional reduction in capacity of 9% would be over 500 charging cycles, or about 1.5 years of average usage. The normal reduction in capacity just from aging over 500 cycles is already listed as 16%, so adding the 9% would bring it to a total of 25% loss of battery capacity after about 1.5 years.
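As a rough sketch, the C-rate arithmetic above can be written out in Python (the 2070mAh capacity and the .5C-.7C window come from the posts; the function name is just for illustration):

```python
def optimal_charge_range_ma(capacity_mah, low_c=0.5, high_c=0.7):
    """Battery University's suggested charge window, expressed in mA.
    1C is the battery's rated capacity drawn as a current."""
    return capacity_mah * low_c, capacity_mah * high_c

# Moto G: 2070mAh battery, so 1C = 2070mA
low, high = optimal_charge_range_ma(2070)
print(round(low), round(high))  # 1035 1449
```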
--------------------------------------------------
UPDATE and CORRECTION:
I have more authoritative information directly from Battery University that changes the conclusion I draw above based on my effort to extrapolate from the Battery University website article.
The extrapolation I did above was based on the additional loss of battery capacity cited when going from a 1C to 2C charging rate. But according to direct communication from Battery University, when charging at a rate below .7C there should be no measurable improvement to capacity by using slower charging rates. Charging above .7C would still be expected to add more stress to Lithium Ion Polymer batteries and likely add to long-term reduction of capacity.
So, what this means for the Moto G and Nexus 5 is that there should be no measurable difference between charging with 2A, 1.2A, 1A, or 850mA chargers as far as effect on long-term battery capacity goes. Both the Moto G and Nexus 5 are supposed to automatically restrict the charge rate at 1500mA even when using a faster charger, which is just at or below .7C for both phones. So, as long as the charger dependably keeps to 5V, a higher amperage 2A charger will be faster but pose no problem to long-term capacity.
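The corrected conclusion can be sanity-checked with the same arithmetic (capacities as quoted in the thread; note that the Moto G's 1500mA cap actually works out a hair above .7C, consistent with the earlier "just barely outside the optimal range" observation):

```python
def c_rate(charge_ma, capacity_mah):
    """Charge rate as a multiple of capacity (1C = the capacity, in mA)."""
    return charge_ma / capacity_mah

# Both phones cap charge current at 1500mA regardless of the wall adapter
print(round(c_rate(1500, 2070), 2))  # Moto G:  0.72 (a hair above .7C)
print(round(c_rate(1500, 2300), 2))  # Nexus 5: 0.65
```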
Related
Is it dangerous to charge a Nexus S with a Nexus One charger?
The Nexus S charger is just 700 mA but the Nexus One charger is 1000 mA,
so it can charge the Nexus S faster - but can this be harmful to the Nexus S?
Thanks
It likely is no problem since the charging is controlled by the phone and the charger only supplies the voltage/current. Just make sure that the voltage is the same.
Yes, the voltage is the same, 5 volts,
but the current is different. You mean that just the voltage is important?
Also, I had an HTC Nexus One car charger (because I have an N1 too); its voltage is again 5 volts, but the current is 2000 mA!
After I saw that current I bought a Samsung car charger; it's exactly 5 volts and 700 mA.
I'm not sure about the Nexus S - can it handle 2000 mA?
If it can, I think it would charge the Nexus S 2 or 3 times faster!
Your charger is a voltage source. That means that it will try to keep the voltage at the specified 5V. How much current comes out depends on the resistance of the circuit.
The 700mA is the maximum current you can take out of it before the voltage begins to drop.
Quite a few people use stronger chargers, and as far as the battery is concerned there should not be any problems up to at least 1500mA (which would be a 1C charging rate for this battery).
Considering the charging characteristics of LiIon batteries I don't know if it will reach full charge any sooner, but some people reported faster charging.
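A toy model of the "voltage source with a current limit" behavior described above might look like this (the droop slope is an invented illustrative figure; real chargers sag in their own ways):

```python
def charger_voltage(load_ma, rating_ma=700, nominal_v=5.0, droop_v_per_a=1.5):
    """Toy voltage-source model: the charger holds its nominal 5V as long as
    the load draws no more than the rated current; past that, the output sags.
    The droop slope (1.5 V per excess amp) is made up for illustration."""
    if load_ma <= rating_ma:
        return nominal_v
    excess_a = (load_ma - rating_ma) / 1000.0
    return max(nominal_v - droop_v_per_a * excess_a, 0.0)

print(charger_voltage(500))   # 5.0  - within the 700mA rating, voltage holds
print(charger_voltage(1200))  # 4.25 - overloaded, voltage begins to drop
```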
cgi said:
Your charger is a voltage source. That means that it will try to keep the voltage at the specified 5V. …
Click to expand...
Click to collapse
Thanks! I didn't know about this.
Shouldn't be a problem unless the voltages are different. I'm charging mine with a Nokia 5800 cable connected to a USB port on the keyboard. Slow as heck, but it charges.
The formula is:
W = V x I x T
W refers to the energy created or used.
V refers to voltage: should be constant at DC 5 volts (for AC, 110 V or 220 V).
I refers to current: the flow in and out.
T refers to time or duration.
As you can see, a bigger current increases the result because it is multiplied by time.
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts. And it will charge faster.
But remember one thing: don't charge too long, because more time means more energy than your device can stand. For example, don't charge overnight.
CMIIW
Sent from my Nexus S I9020T
rejanmanis said:
The formula is:
W = V x I x T
…
Click to expand...
Click to collapse
You are oversimplifying. What you are saying is true only for an ideal ohmic load resistor, and your phone is anything but that. In our case the current I depends on the time (among other things) and is a non-trivial function.
So the energy actually used/transmitted is
W = ∫ V · I(t) dt
assuming that V is constant. I depends on the charging circuit, the charger, and how full the battery is at the moment. If you hit the current limit of the charger, then even V is no longer constant, but an (unknown) function of I.
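The integral can be approximated numerically for a sampled current profile; this sketch uses an invented, roughly CC/CV-shaped profile just to show the bookkeeping:

```python
def energy_wh(volts, currents_ma, dt_s):
    """Approximate W = integral(V * I(t) dt) for a current profile sampled
    every dt_s seconds, assuming V is constant. Returns watt-hours."""
    joules = sum(volts * (i / 1000.0) * dt_s for i in currents_ma)
    return joules / 3600.0

# Invented profile: ~1A for an hour, then tapering off,
# sampled every 10 minutes (600 s)
profile = [1000, 1000, 1000, 1000, 1000, 1000, 800, 500, 300, 100]
print(round(energy_wh(5.0, profile, 600), 2))  # 6.42 Wh drawn from the charger
```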
So in my opinion, it will not damage your device as long as the voltage stays at 5 volts.
Click to expand...
Click to collapse
This is likely true, assuming, that the charging circuit in the phone limits the current to a level safe for the whole power train. That would not only be the battery itself, but also the cable, the connector, the traces on the circuit board, the charging transistor, the inductor (assuming a switched mode charging, not sure whether it is) and so on.
And it will charge faster.
Click to expand...
Click to collapse
That depends on the charging circuit and the battery protection circuit.
But remember one thing: don't charge too long, because more time means more energy than your device can stand. For example, don't charge overnight.
Click to expand...
Click to collapse
You are still stuck in the age of constant-current, dumb lead-acid battery chargers. Nowadays we have "intelligent" chargers that monitor the state of the battery to ensure it doesn't go up in smoke from overcharging or too-high charge rates.
Any sane design is current limited by the on-board charging circuit. I don't see why Samsung should have built anything else.
I usually let it charge overnight. However, I found today it charged very fast: less than 2 hours from 30% to 100%. I used a Galaxy Note 2 USB charger. Do you have the same charging experience, or does my battery have problems?
Yes, same here.
When your battery was at 30%, the charger had to supply about 1610mAh (the battery still held 690mAh, 30% of 2300mAh). The Galaxy Note 2 charger has an output of 2000mA. So you can imagine it won't take very long.
[update] Hm, I misread Note for Tab. I have a Tab 2 with a 2A charger. Not sure what the Note 2 charger can output, but I'm guessing it will be above average.
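The arithmetic above is simple enough to express directly (capacity figure as quoted in the thread; this is a lower bound, since charging losses mean somewhat more must actually be delivered):

```python
def charge_needed_mah(capacity_mah, percent_now):
    """Minimum charge (mAh) the charger must put back to reach 100%."""
    return capacity_mah * (100 - percent_now) / 100

# A 2300mAh battery at 30% still holds 690mAh, so 1610mAh remain to deliver
print(charge_needed_mah(2300, 30))  # 1610.0
```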
Petrovski80 said:
When your battery was at 30%, the charger had to supply about 1610mAh (the battery still held 690mAh, 30% of 2300mAh). The Galaxy Note 2 charger has an output of 2000mA. So you can imagine it won't take very long.
[update] Hm, I misread Note for Tab. I have a Tab 2 with a 2A charger. Not sure what the Note 2 charger can output, but I'm guessing it will be above average.
Click to expand...
Click to collapse
Thanks. I will check the output of note 2 usb charger and do the math.
Stock Nexus 5 charger also charges it from 0 to 100% in less than 2 hours.
Don't forget that the device's kernel determines how many mA are drawn from a charger, not how many mA the charger is rated for.
-----------------------
Sent via tapatalk.
I do NOT reply to support queries over PM. Please keep support queries to the Q&A section, so that others may benefit
Not all milliamps are the same
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. They don't. It does not automatically follow that a 2000mAh battery will take 2 hours to charge from a 1000mA charger, or that the charge current will be 1000mA. Charge current can easily - and safely - be higher than the mA rating of the charger. Or lower.
The N5 battery is rated at 3.8V 2300mAh (typical) and, crucially, 8.74 watt hours. A 5V 1000mA charger can supply a maximum of 5 watts (5 volts x 1 amp). Voltage converters within the N5 change this 5 watts of power from 5V to 3.8V to suit the battery - and this could be at about 1250mA (assuming a not-unreasonable 95% conversion efficiency).
The battery voltage varies with the state of charge, reaching about 4.2V when fully charged. Even then, the charge current could be as high as 1130mA without drawing more than 1000mA from the 5V charger.
An earlier poster pointed out that charging is under control of the CPU (I suspect instead a dedicated charging circuit but that's irrelevant) and it is very likely that a) the charging current varies significantly during the charging cycle and b) it is unlikely that the charging circuit demands precisely the maximum that the charger can supply. But it is quite likely that the actual current being put into the battery is numerically higher than that being drawn from the source. It's the power in watts that counts, not the number of milliamps.
Batteries are not perfect, meaning you don't get out all you put in. If the battery were completely flat you would have to put more than 8.74Wh in to bring it up to full charge (although a totally flat Li-ion battery is dead beyond redemption; the battery level shown on the screen is the usable range, not the ultimate power capacity).
Sometimes the charger rating, battery capacity and charge time seem to line up, but that's more due to a happy accident than anything else. A 40,000mA charger won't juice your phone from flat in four minutes!
Batteries, and charging, are complex...
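The watts-not-milliamps point can be checked with a few lines (figures from the post above; the 95% conversion efficiency is the post's assumption, not a measured value):

```python
def battery_side_ma(charger_v, charger_ma, battery_v, efficiency=0.95):
    """Current into the battery when the phone's converter trades voltage
    for current: power in watts is (nearly) conserved across the conversion."""
    watts_in = charger_v * (charger_ma / 1000.0)
    return watts_in * efficiency / battery_v * 1000.0

# 5V/1000mA charger into a 3.8V battery at 95% efficiency
print(round(battery_side_ma(5.0, 1000, 3.8)))  # 1250 mA
# Near end of charge, the battery sits around 4.2V
print(round(battery_side_ma(5.0, 1000, 4.2)))  # 1131 mA
```

So the battery-side current can indeed be numerically higher than the 1000mA drawn from the charger, exactly as the post argues.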
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. …
Click to expand...
Click to collapse
This. Well said.
Your suspicions are correct, it does have a dedicated charging circuit. This chip is responsible for charging. Input current appears to be capped at 1200mA. Measured with my DMM last night and never saw the phone draw more than 960mA when charging with the screen off. It stayed like that until the battery was around 95% charged, then gradually tapered off from there as the battery reached 100%.
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. …
Click to expand...
Click to collapse
Thanks a lot. It does look complicated. As long as the fast charging is normal, I won't worry too much.
Can anyone recommend an app that shows real time current draw? It would also be cool if the app showed how much power the phone is using in real time.
Sent from my Nexus 5 using Tapatalk
G1MFG said:
It seems to be a common misconception that the number of milliamp-hours of your battery and the milliamp rating of your charger have a fixed relationship. …
Click to expand...
Click to collapse
True, though I never said there was a fixed relationship; they do have a loose relationship. Charging with a 500mA charger will take longer than charging with a 2000mA one, since virtually every modern phone will draw more than 500mA when it can.
Another aspect my reply didn't address is that the charge process isn't linear. But without going into too much electronics, I just wanted to explain to the OP that he shouldn't worry if he notices different charging times when using chargers with different amperage output.
Today's batteries are much improved
wolfca said:
Thanks a lot. It did look complicated. As long as the fast charging is normal, I don't worry too much.
Click to expand...
Click to collapse
That's the ticket. When used with the correct charger, a modern phone battery takes a couple of hours to charge fully, a bit longer with a lower-rated charger. Or you can top up a bit if you have a few minutes spare. It's much better than the early mobiles with Ni-Cd batteries that took overnight to charge. And required weightlifting training before you could even pick them up!
Will this charger work for Moto G 1st gen? If this will work, this will be the fastest charger for Moto G.
http://www.nokia.com/global/products/accessory/ac-60/
That charger is a classic, one of the best you can buy.
But calling it the "fastest" is not very accurate, to say the least.
Can a higher mA rated charger cause battery damage in the long run?
On the Motorola site, they say the Moto G is compatible with chargers ranging between 0.5A and 1.5A, and that it will restrict current from higher-rated chargers.
Is there a chance that using higher-rated chargers, like an iPad's, can cause deterioration of battery life in long-term usage?
They do give significantly faster charging times, and I've not noticed any heating issues while charging.
√one said:
on the motorola site, they say the moto G is compatible with chargers ranging between 0.5A to 1.5A. and it will restrict current from higher rated chargers.
is there a chance that using higher rated chargers like those of an iPad can cause deterioration to battery life on long term usage?
They do give significantly faster charging times and i've not noticed any heating issues while charging.
Click to expand...
Click to collapse
The phone and battery circuits will throttle the current to 1.5A even if the charger can provide 2A or more current.
In other words, getting a charger between 1A and 1.5A is optimal, but using a higher-current one like 2A won't hinder your battery performance or life.
liveroy said:
The phone and battery circuits will throttle the current to 1.5A even if the charger can provide 2A or more current.
In other words, getting a charger between 1A-1.5A is optimal, but using higher current one like 2A won't hinder your battery performance/life.
Click to expand...
Click to collapse
OK, thanks.
I recently bought the P900 (WiFi version).
A full charge takes around 5 hours, which in practice translates to 4 hours (I never get to 0%, and charging from 90-95% onward is slowed down by the device anyway).
Is there any way to speed up the charging?
Like buying a 5.3V 3A charger - would the OEM cable be able to transfer the additional current?
Could the device even take advantage of a 3A charger?
If so, can you recommend one?
It's important to me because I always use 100% brightness.
No. In the past, mobile devices (mostly phones) shipped with cheap 500mA chargers, and bumping up to a higher-amperage charger would have an effect on charge time. Those days are gone: improvements in charging efficiency and production cost have led to included chargers being optimized for charging times. The charging circuitry in the device is going to take what it's rated to take and no more, so once a charger rated at what the device is designed to take is plugged in, there's little else that can be done to speed up charging.
Bottom line - the charger that came with the tablet if it's the official one (i.e. if you bought new, not used and someone included the wrong one) is optimized to charge the tablet at the fastest rate. Based upon the numbers you noted your charge times are not excessive, the tablet is designed to take around 2A and it won't take 3A even if the charger is rated for it.
If you want faster charging you need to sell your tablet and get a Snapdragon variant instead (LTE tablets from various carriers) or start practicing better battery management to reduce how depleted your tablet gets. For me that means not running at highest brightness unless I really need it and topping off the battery whenever I can. When I get really low and I have a reasonably long period that I can charge I'll sometimes shut the tablet completely down rather than put it to sleep so that charging is accomplished with near zero load on the battery.
Oh, bummer.
Well, I guess I'll have to learn to live with that.
Thanks for your reply.
I'm planning on buying a 2-port charger so I won't have to carry so much stuff with me.
How much slower will the device charge with a 5.0V charger?
Should I look for a 2-port 5.3V charger? Won't a normal device have trouble with that?
It's not the voltage, it's the amps. If you want to charge two devices simultaneously as quickly as possible, the power supply needs to be rated to output the wattage necessary to provide the amperage the devices will draw at their maximum charge rate.
My recommendation is to find something capable of over 20 watts (2 ports x 2A x 5V = 20 watts). I'd buy this for future Qualcomm Quick Charge use.
https://www.anker.com/products/A2031111
Sent from my SM-P900 using Tapatalk
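The wattage arithmetic above can be sketched as follows (function name is illustrative):

```python
def supply_watts_needed(ports, amps_per_port, volts=5.0):
    """Minimum supply rating to drive every port at full current at once."""
    return ports * amps_per_port * volts

# Two 5V/2A devices charging simultaneously need at least a 20W supply
print(supply_watts_needed(2, 2.0))  # 20.0
```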
My question was how much slower the Note Pro will charge with a 5.0V 2A charger as opposed to the OEM one, which is 5.3V 2A,
and whether there is any problem using a 5.3V charger with a normal smartphone.
Charging the Note Pro is more important to me than my other devices.
Yonany said:
my question was how much slower the note pro will charge with a 5.0v 2A charger as opposed to the OEM one which is 5.3v 2A.
and if there is any problem to use a 5.3v charger with a normal smartphone.
charging the note pro is more important to me than my other devices.
Click to expand...
Click to collapse
Yes, but you also noted that you want to buy a 2-port version, and I'm saying that the voltage is only part of the equation - unless you are already aware that you need one rated at 2A per port simultaneously (you didn't specify). I honestly never measured the difference between the two; I don't worry about 5V vs 5.3V since the charging voltage of lithium-ion cells is under 5V anyway. AFAIK the current is more critical. Maybe someone more knowledgeable in electrical engineering can chime in, since I'm unsure how the charging circuit within the phone steps down the voltage from the charger to the battery. All I know is that if you top off regularly or charge overnight, there's no night-and-day difference between the stock 5.3V charger and a 5V one, as long as the aftermarket one is rated 2A or more.
Sent from my SM-P900 using Tapatalk
So when you buy a Moto G Stylus (2021), it comes with a 10W charger, and 10W is more or less what is cagily listed on the Moto website for this device. But when I plug it into a QC3 charger, I get about a 14W charging rate (5V x 2.8A). Does anybody know the maximum charge rate for this device, and specific charger models that can provide it? Would a higher-wattage USB-C PD charger be able to charge at a faster rate? Thanks in advance for your comments.
To answer your question, yes, a USB-C PD adapter should provide faster charging, as long as it supports QC 3.0 or higher. 18W or 30W should be fine; I believe the phone's input charge maxes out at 15 watts (absolutely no permission to quote me on that lol), so if you don't mind a bit of heat, and the potential of degrading your battery slightly faster than with a standard 5V/1A charger, an 18W or 27W USB-C wall adapter compatible with QC 3.0 or 4.0 should be sufficient. Don't forget to grab a couple of good-grade USB-C to C cables, as they are often the first thing to go bad, which prevents turbocharging from kicking in.
Thanks for the comments, @mario0318 So if I am currently seeing 14W (5V x 2.8A) with a QC3 charger, it sounds like I may be near the max already if it is only 15W. I have no USB-C PD chargers yet that I can use to test, but there was a 25W Belkin model on sale today (for Black Friday) for just $10 so I ordered myself one. When it comes in, I'll test it versus the QC3 charger to see if there is any significant difference.
So I have a basic update here. The QC3 charger I mentioned has an LED readout on it, and that is where I got the estimated 14W charge rate (as 5V x 2.8A). The new 25W Belkin charger I got does not have an LED readout for V & A on it, however. So I turned to the Ampere app on the Play Store. Then I swapped back and forth between the two charging systems and watched the estimated charge rate on Ampere. The 25W Belkin charger definitely shows higher charge rates according to the Ampere app. But I've ordered myself a USB C charge meter (like the old USB "doctor" meters, but with USB C connections) from China to document it more closely.
I might be missing something, but one thing I see lacking in the Ampere app is logging capability - its strength seems to be showing rates in real time. It would be cool to find an app that can not only monitor in real time but also log charging events with V & A stats, etc. I see AccuBattery may provide this. Any suggestions for another battery charge monitoring app that might do the trick?
For those potentially interested in the 25W Belkin charger, the specific model is the "WCA004dqWH", and it is on sale now for $10. It is actually mentioned in a news snippet here at XDA:
https://www.xda-developers.com/belkin-usb-c-25w-charger-deal-november-2021/
I think the Battery Manager app by 3C allows recording logs for power charging events. But I forget if there's a limit with the free app compared to the paid/donate unlocked features.
Regarding the charger wattage, I'm fairly sure anything past 25W would be overkill for charging a single device like the 2021 Moto G. At that point it becomes more suitable for two devices, with anything far higher, like 60W or 85W, being totally unnecessary and potentially harmful.
Thanks again, @mario0318 , for your new comments. I agree that anything beyond 25W would be overkill for this phone.
As a further update, I decided to swap over to AccuBattery and upgrade to the Pro version. As my Stylus was already charged, I tried the two chargers with a Nord N10 5G that had around a 40% charge. The QC3 charger was charging at an average of 1993 mA with the screen off after I let it sit for a few minutes. When I swapped to the 25W Belkin system, it jumped to 2993 mA under the same scenario, so a full 1 amp difference. These are about the same differences I noticed between the two chargers when charging my G Stylus (2021), but I did not want to state that above because they were off-the-cuff observations. This time I took screenshots with AccuBattery, so no apprehension in stating values. I'll do the same with my G Stylus next time it needs a charge.
AccuBattery suggests only charging up to 80% capacity vs. 100% capacity given the wear and tear difference on the battery. I guess I'll try that, but in the long run, replacing the battery on the G Stylus (2021)--if it ever becomes necessary--looks pretty doable based on teardown videos.