Hi,
Need some advice here. Two phones are for sale, both the same brand/model but with two different specs, and I need the one that draws the least battery, as the phone's screen will be on for 8 straight hours a day (satnav). The lower the battery draw the better. I do not care about anything else!
1) Snapdragon 625 - 14nm, but a 5.5" 1080p screen
2) Snapdragon 435 - 28nm, but a 5.0" 720p screen
So the first phone has the newer 14nm CPU but a bigger screen (5.5") at Full HD, while the second phone has a smaller 5.0" screen running at a lower 720p resolution but the older 28nm CPU, which draws more power.
So which setup, in theory, would use less power?
NokiaBricks said:
Hi,
Need some advice here. Two phones are for sale, both the same brand/model but with two different specs, and I need the one that draws the least battery, as the phone's screen will be on for 8 straight hours a day (satnav). The lower the battery draw the better. I do not care about anything else!
1) Snapdragon 625 - 14nm, but a 5.5" 1080p screen
2) Snapdragon 435 - 28nm, but a 5.0" 720p screen
So the first phone has the newer 14nm CPU but a bigger screen (5.5") at Full HD, while the second phone has a smaller 5.0" screen running at a lower 720p resolution but the older 28nm CPU, which draws more power.
So which setup, in theory, would use less power?
See
What's your next smartphone / What should I buy by poseidon5213
and
**DEVICE SUGGESTION THREAD** -- Not sure what device to buy? Ask here! by KidCarter93
Sent from my XT1060 using XDA Labs
Hi, my question was a specific one regarding power draw; I'm not sure either of those threads is ideal for the question at hand.
NokiaBricks said:
Hi, my question was a specific one regarding power draw; I'm not sure either of those threads is ideal for the question at hand.
It comes down to usage, I suppose. If the task at hand requires a lot of processing power but is less taxing on the screen/GPU, then the SD 625 is better. If it's the other way around, then choose the device with the SD 435.
What about battery capacity, is that the same as well?
Freewander10 said:
It comes down to usage, I suppose. If the task at hand requires a lot of processing power but is less taxing on the screen/GPU, then the SD 625 is better. If it's the other way around, then choose the device with the SD 435.
What about battery capacity, is that the same as well?
Yup, both phones have the same battery capacity; they are identical in most respects. One is the Note version, the other is not.
The usage is 100% satnav (Waze), no other apps will be running. Any idea how demanding Waze is?
NokiaBricks said:
Yup, both phones have the same battery capacity; they are identical in most respects. One is the Note version, the other is not.
The usage is 100% satnav (Waze), no other apps will be running. Any idea how demanding Waze is?
I'd go for the newer/more efficient 625, since things such as GPS, mobile data and WiFi are handled by the SoC. Screen brightness and resolution can easily be adjusted to reduce power consumption, and you can also underclock the CPU (or turn on some sort of power-saver mode) to squeeze as much juice out of the battery as possible. The difference between a 720p and a 1080p panel isn't that big anyway, but the improvement of the 625 over the 435 is far greater.
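On the underclocking point, here is a minimal sketch of how a frequency cap could be applied on a rooted device through the standard Linux cpufreq sysfs nodes. The 1.4 GHz cap and the powersave governor are illustrative choices only, not recommended values, and exact paths vary by kernel:

```python
# Minimal sketch: cap the maximum CPU frequency via the Linux cpufreq
# sysfs interface. Assumes a rooted Android device that exposes these
# nodes; the cap value below is only an example.
import glob

CAP_KHZ = "1401600"  # example cap, roughly 1.4 GHz

for policy in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq"):
    try:
        with open(policy + "/scaling_max_freq", "w") as f:
            f.write(CAP_KHZ)           # lower the ceiling each core may reach
        with open(policy + "/scaling_governor", "w") as f:
            f.write("powersave")       # bias the governor toward low clocks
    except (PermissionError, OSError):
        print("Could not write to", policy, "- root required")
```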
Related
Hi,
Does anyone know if there will be any perceivable difference in battery life between the latest batch of smartphones based on single-core 45nm SoCs and the new 45nm dual-core versions?
All manufacturers state improved performance AND lower power consumption with the dual cores, but I am wondering how this will affect my day-to-day battery life. I actually read somewhere that e.g. Tegra 2 phones may drain the battery quicker instead of saving power.
I was planning to buy the Desire Z or Galaxy S, but I may also wait for their upgrades if this will mean better battery life. I am not too eager about the performance improvements as I am not a gamer and will likely not feel the benefits (I mostly browse 3G or WiFi/use GPS/occasionally may play a movie + a couple of calls a day).
I'm expecting the battery life to be a bit low, seeing that they have a "lot" of things in them... I reckon it may be about the same as current smartphones?
Apparently battery times will be better, let's see...
What I have learned is that dual cores will save energy (well, at least Tegra 2). That SoC has something like 7 different cores if I remember right, each and every one of them made for a specific purpose (audio playback, video encode/decode, GPU... and other things I don't remember), so the SoC itself will use only what YOU need at that very moment. Let's say you're watching a video: only the video decode core will work, whereas the others should be in standby or something.
(IMO battery life should increase because of this, not sure how much though.)
And excuse me if my knowledge isn't exactly correct on this matter.
There are two trains of thought here:
1) As the die gets smaller (65nm [1st gen Snapdragons] to 45nm [2nd gen Snapdragons] to 40nm [Tegra 2] to 28nm [3rd gen Snapdragons], etc.), the processor tends to draw less energy. So yes, most dual cores (having a smaller die) SHOULD be more energy efficient.
2) With dual core, executions get carried out a lot faster than they were on single core. As is evident from the benchmarks done on dual cores (2000 and up), it'll take less time to start/process a program, and the UI, in theory, should be a lot faster and smoother (that is, if manufacturers don't start ****ing around with a UI and make it sluggish). Being that it's faster and a lot more versatile than a single core, people will tend to YouTube a lot, play games a lot, and generally use a lot more multimedia applications. Being that we don't have a self-sustaining energy source that isn't radioactive, the battery will drain from excessive use.
So at the end of the day, it depends on your use. Yes, dual cores are more energy efficient than single cores, but in the hands of a 15yo teenage girl with more life than Paris Hilton, they'll probably get the same battery life as any other phone out there.
Hi guys,
thanks for the comments and predictions. My prediction would be that there would be hardly any noticeable change between the single & dual core 45 nm chips (given same usage of course), similar to how there was almost no change when switching from 65nm to 45nm chips - which are more energy efficient as well.
The better energy efficiency seems to be quickly soaked up by more power-hungry hardware and software, so it all boils down to whether one needs the better performance, as the battery life will likely stay the same.
But this of course is only a prediction based on past observation. I hope I am wrong, and I am still considering whether it is worth waiting for the dual-core devices to hit the market. In the meantime, if anyone has had a chance to play with such a device (tablet?), any additional info will be welcome.
So I bought the application
CPU usage and Frequency Monitor (dual core support)
It's on the market, and you can find its thread here at xda:
http://forum.xda-developers.com/showthread.php?t=1160319
The dev says there's a limitation for the Sensation, but I bought it anyway:
****** NOTE ******
HTC Sensation Owners
There is a device limitation with reading the CPU Frequency. I am looking into a workaround for this problem.
****** NOTE ******
I asked the question in the dev thread, hoping for a fast answer, but want to ask here too:
Can we trust the per-core CPU usage that is displayed by the app?
This app displays, in the notification bar at the top of the screen, which core is in use, with one column for each core.
So with our asynchronous dual core, one of the columns is often empty while the other can be half full, or full, in normal use; this seems to be OK (even if the frequency used by each core is not read, as the dev says, but is the usage itself read correctly?)
But I noticed the second column, i.e. the second core, very often starts filling too!
I used to think Android 2.3 was not supposed to handle dual core, so that almost all the time only one core is used?
That was, for me and from what I've read, the big reason why we get really bad scores on every benchmark.
If CPU monitor is right, I can see the second core easily waking up when the first one is already full, sometimes just a little, sometimes 50%, sometimes 100%.
Including during benchmarks, where CPU monitor displays both cores running at 100%!
So what is true here? Is CPU monitor fooled by Android 2.3 and showing the second core waking up when it's not?
Or does our second core indeed wake up easily, including in benchmarks, meaning our pitiful scores will never be greatly improved since both cores already deliver their full power?
I need more info on these asynchronous dual cores, the way they work and are supported by Android 2.3, what HTC did to implement this, etc.
Not a single answer from a dev or someone with more knowledge than me concerning dual-core architecture and the way Android can handle it?
I had an answer from the dev of CPU usage monitor:
The CPU usage information is abstracted in both cases at the App level. Apps just need to read the standard CPU usage information at the OS level to gather its data. Control of when and how the dual core magic works is not a worry at the app level since the OS handles it. Hope this helps.
So...........
=> In normal use, the app shows only one core running, at very low load if no app is running. Sometimes, once the first core is full, the second one starts working a little, 10%, 25%, etc., for apps requiring a little more power. Everything seems very logical for an asynchronous dual-core CPU (wasn't I told that Android 2.3 doesn't really manage async dual core? When we overclock, don't we overclock only one core?)
But when running a benchmark, or playing heavy OpenGL games, the app displays both cores running at 100%, the CPU at its max power on both cores!
So if that is the case, even with a better ROM once we have S-OFF, or better drivers, our benchmark scores will always stay very low.
I need this confirmed or refuted: if we already have both cores running at 100% during benchmarks or OpenGL games, we can't expect much more from our Sensation :-(
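For reference, the "standard CPU usage information at the OS level" the dev mentions is, on Linux-based Android, the set of per-core counters in /proc/stat. Here is a minimal sketch of how any monitoring tool could derive per-core utilization from two samples; the 0.5 s interval is an arbitrary choice, and this is not necessarily how the XDA app itself is written:

```python
# Minimal sketch: per-core CPU utilization computed from /proc/stat,
# the OS-level source that CPU monitoring apps ultimately read.
import time

def snapshot():
    """Return {core_name: (idle_time, total_time)} from /proc/stat."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            if line.startswith("cpu") and line[3].isdigit():
                name, *fields = line.split()
                vals = list(map(int, fields))
                idle = vals[3] + vals[4]        # idle + iowait jiffies
                cores[name] = (idle, sum(vals))
    return cores

before = snapshot()
time.sleep(0.5)                                 # sampling interval (arbitrary)
after = snapshot()

for core in sorted(after):
    if core not in before:                      # core appeared mid-sample
        continue
    d_idle = after[core][0] - before[core][0]
    d_total = after[core][1] - before[core][1]
    busy = 100.0 * (1 - d_idle / d_total) if d_total else 0.0
    print(f"{core}: {busy:.0f}% busy")
```

A core the kernel has taken fully offline typically disappears from /proc/stat altogether, which is one way a monitor can tell a parked core from a merely idle one.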
I'd be interested to understand this also.
It appears that Gingerbread doesn't support dual cores properly.
Have a look at: http://groups.google.com/group/android-platform/browse_thread/thread/b754408b9af17e55?pli=1
I guess we need an upgraded Kernel and associated libraries. I must admit I was surprised when I started looking.
I'm looking for an Android stick for mobile, battery-driven operation.
Can this be done?
Graphics hardware is not strictly needed, so it could be disabled to save energy.
You can use any TV-Stick and power it with a portable power bank like this one:
http://www.amazon.com/6000mAh-Portable-Rapid-Recharge-External-adapters/dp/B00EF1OGOG/ref=sr_1_4?ie=UTF8&qid=1387725779&sr=8-4&keywords=powerbank+5v
There are also ones that run on AA batteries if you prefer that. The only thing you have to keep in mind is that the output current has to be appropriate. For single-core sticks, 1A will do, but for dual- or quad-core you should look for a power bank which outputs at least 2A.
But I don't really get why someone would need a battery-powered TV stick. You need power for your TV, so why can't you use that for the stick as well?
I don't need a screen for this. I want to disable the graphics chip (or even rip it out). How much power do you think the typical GPU consumes, compared to the CPU? (Or to everything else?)
If such an Android stick is getting hot, is it rather the CPU or the GPU?
These devices (and I think all other devices that run Android) are all based on SoCs (systems on a chip). This means that the CPU and GPU are on the same chip, so sorry, no ripping out the GPU.
Power consumption depends on the task that is performed by the chip, and so does the heat generation.
There should be a difference in power consumption between a) not performing any graphics related tasks and b) putting the GPU, the video memory and the HDMI port all into suspend mode - similar to sending a smartphone into suspend mode, while keeping the CPU running.
How much power saving b) will amount to, I don't know. Will it be closer to 5%? Or closer to 30%?
The portable power bank you linked to above is a little big for my use case. I would prefer to use a smartphone battery to power, say, a single core device. Do you think this would be impossible?
Android HDMI sticks are USB powered, so they need 5V. If you have some knowledge of electronics you can use any battery you want. You just have to build a circuit that converts the voltage of the battery (smartphone batteries have 3.7V) to 5V.
The power consumption is very hard to estimate. I would say a single core with an idling GPU will consume about 0.3 - 0.5 A depending on the CPU load.
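To put that estimate into perspective, here is a rough runtime calculation; the battery capacity, load current and converter efficiency below are assumed, illustrative figures, not measurements:

```python
# Rough runtime estimate for a stick fed from a 3.7 V phone battery
# through a 5 V boost converter. All numbers below are assumptions.
capacity_mah = 3000      # assumed smartphone battery capacity
v_batt = 3.7             # nominal Li-ion cell voltage
v_out = 5.0              # USB supply voltage the stick expects
i_load = 0.4             # assumed average draw (0.3 - 0.5 A per the post)
efficiency = 0.85        # assumed boost-converter efficiency

energy_wh = capacity_mah / 1000 * v_batt        # ~11.1 Wh stored
draw_w = v_out * i_load / efficiency            # ~2.4 W pulled from the cell
print(f"Estimated runtime: {energy_wh / draw_w:.1f} h")  # roughly 4-5 hours
```

So under those assumptions a single smartphone cell would plausibly run a light load for a few hours, not a full day.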
But if you want a small Android device powered by a smartphone battery, why don't you use a smartphone? What's the advantage of an HDMI stick?
DaPhinc said:
What's the advantage of an HDMI stick?
I do not need a screen, so this is an advantage. I hope to find a battery that fits flush with the board; I like to carry the device around in a pocket. In addition to suspending the graphics hardware, I will try to throttle down the CPU in order to make it consume less energy.
Do you think I will be able to get the power consumption down to the level of an average smartphone? Or are there fundamental differences that make this impossible? In which case I would indeed need to look for a smartphone-SoC-based solution.
The SoCs in these sticks are also used in cheap smartphones, so the power consumption will be about the same. If you want to underclock your CPU you should look for a device that has custom kernels, because most stock kernels do not allow over- or underclocking.
DaPhinc said:
look for a device that has custom kernels
May I ask for HW suggestions, pls? ATM, I have no clear idea what would be the best hardware/vendor relative to my requirements.
Availability of custom kernels would be good. Availability of usable kernel sources would be even better. A big plus would be availability of 4.3.x - due to improvements in the Bluetooth stack, among other things. Any chance to run pure AOSP on any of these stick devices?
Thx.
I am quite sure that there is no device that meets all your requirements.
I know that there are custom kernels and even a (bad) CyanogenMod build for the MK808, but it is a dual-core, not a single-core as you wanted.
And Android 4.3 is not available for any of these sticks. Maybe it will come as an update for the current generation of quad-core sticks, or maybe not until the next generation (mid 2014) arrives.
How come my old HTC Desire HD running Gingerbread gets better battery life in airplane mode than my Nexus 5 running KitKat in airplane mode? Is that normal?
michaelopolis said:
How come my old HTC Desire HD running Gingerbread gets better battery life in airplane mode than my Nexus 5 running KitKat in airplane mode? Is that normal?
Well, it depends what you call "better battery life". How much time did you get on both phones?
That said, you're not in the right section; you should have asked this in Q&A.
Because it's a 1GHz single-core phone?
Sent from my Nexus 5 using Tapatalk
Ben36 said:
Because it's a 1GHz single-core phone?
Sent from my Nexus 5 using Tapatalk
Better yet, because the N5 has a huge screen?
Sent from my Nexus 5 using Tapatalk
michaelopolis said:
How come my old HTC Desire HD running Gingerbread gets better battery life in airplane mode than my Nexus 5 running KitKat in airplane mode? Is that normal?
My guide to checking which phone uses more battery than the other:
- CPU cores/threads: how many cores does each phone have? The fewer the cores, the lower the battery consumption. Add a point for every core the phone has.
- CPU frequency/clock: do both phones clock in at the same frequency? The lower the frequency, the lower the battery consumption. Add a point per GHz, so 1 GHz equals one point.
- GPU clock: do both phones have an equal clock/frequency? The lower the frequency, the lower the battery consumption. Add two points if a phone has the higher GPU frequency.
- Screen (overall screen size): do both phones have an equal screen size? Although this doesn't matter much, add a point if a phone has a larger screen.
- Screen resolution: do both phones have an equal resolution? If so, do they have an equal screen size? If both are true, the smaller the screen and resolution, the lower the battery consumption. This is a primary factor in battery drain. Add four points if a phone has a resolution higher than HD (1280 x 720), and two points if it's lower than that.
- Device sensors: does one phone have a sensor the other doesn't? Add a point for every sensor you find; use a sensor detector app to check this.
- Operating system: does one phone have a newer operating system version? Add a point if a phone has a newer OS version.
Sum up all of those values and compare them. A lower total means the phone is either worse in terms of features or performs worse than the other, but the good thing is it has better battery life!
A higher total means the phone is full of features and has a powerful set of hardware able to push more pixels, at the cost of battery life.
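Purely to illustrate the scoring idea above, here is a minimal sketch that totals those points for two hypothetical spec sheets; the parameter names, example phones and weights simply mirror the rough guide and are not based on any measured data:

```python
# Rough battery-drain score per the guide above: a higher total means more
# features and horsepower, a lower total suggests better battery life.
def drain_score(cores, ghz, faster_gpu, larger_screen,
                above_720p, sensors, newer_os):
    score = cores                        # 1 point per CPU core
    score += ghz                         # 1 point per GHz of CPU clock
    score += 2 if faster_gpu else 0      # 2 points for the faster GPU
    score += 1 if larger_screen else 0   # 1 point for the larger screen
    score += 4 if above_720p else 2      # 4 points above HD, else 2
    score += sensors                     # 1 point per sensor found
    score += 1 if newer_os else 0        # 1 point for the newer OS version
    return score

# Hypothetical comparison: a 1080p flagship vs a 720p budget phone.
flagship = drain_score(cores=8, ghz=2.5, faster_gpu=True, larger_screen=True,
                       above_720p=True, sensors=10, newer_os=True)
budget = drain_score(cores=4, ghz=1.4, faster_gpu=False, larger_screen=False,
                     above_720p=False, sensors=6, newer_os=False)
print(flagship, budget)   # per the guide, the lower score should last longer
```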
I am looking to buy a tablet, and I've come across two that have intrigued me: the Xperia Z3 Tablet Compact and the NVIDIA Shield K1. I like them both for their specs and low price. The tablet will be used for surfing the internet, YouTube, Facebook, games, watching movies, things like that, so I was wondering if someone who has both, or has experience with them, can recommend which one is better, or whether there is another tablet better suited for me.
lilzad said:
I am looking to buy a tablet, and I've come across two that have intrigued me: the Xperia Z3 Tablet Compact and the NVIDIA Shield K1. I like them both for their specs and low price. The tablet will be used for surfing the internet, YouTube, Facebook, games, watching movies, things like that, so I was wondering if someone who has both, or has experience with them, can recommend which one is better, or whether there is another tablet better suited for me.
I was asking myself the same question.
I don't know; it's so hard to choose.
Nvidia Shield Tablet (179€ in the EU)
Screen: 8"
Resolution: 1920 x 1200
Weight: 356 g
16 GB internal storage, 2 GB RAM
MicroSD expansion
Battery: 5200 mAh
Dimensions: 221 x 126 x 9.2 mm
CPU: Quad-core 2.2 GHz Cortex-A15
GPU: ULP GeForce Kepler (192 cores)
Released: November 2015
Sony Xperia Z3 Tablet Compact (229€ in the EU)
Screen: 8.1"
Resolution: 1920 x 1200
Weight: 270 g
16 GB internal storage, 3 GB RAM
MicroSD expansion
Battery: 4500 mAh
Dimensions: 213.4 x 123.6 x 6.4 mm
CPU: Quad-core 2.5 GHz Krait 400
GPU: Adreno 330
Released: November 2014
plopingo said:
I was asking myself the same question.
I don't know; it's so hard to choose.
Nvidia Shield Tablet (179€ in the EU)
Screen: 8"
Resolution: 1920 x 1200
Weight: 356 g
16 GB internal storage, 2 GB RAM
MicroSD expansion
Battery: 5200 mAh
Dimensions: 221 x 126 x 9.2 mm
CPU: Quad-core 2.2 GHz Cortex-A15
GPU: ULP GeForce Kepler (192 cores)
Released: November 2015
Sony Xperia Z3 Tablet Compact (229€ in the EU)
Screen: 8.1"
Resolution: 1920 x 1200
Weight: 270 g
16 GB internal storage, 3 GB RAM
MicroSD expansion
Battery: 4500 mAh
Dimensions: 213.4 x 123.6 x 6.4 mm
CPU: Quad-core 2.5 GHz Krait 400
GPU: Adreno 330
Released: November 2014
If they were both the same price, which one would you go for?
lilzad said:
If they were both the same price, which one would you go for?
I bought the Shield yesterday because I want to play games on it.
But if I were not in this situation (i.e. more video/internet/app usage), I would definitely go for the Sony, because despite the smaller battery it has better battery life, and not by a little: it's something like 10 hrs more under certain conditions, so a very well-optimized ROM from Sony.
But maybe I can fix the Shield's battery life with a custom ROM; we'll see.
The Z3's bottleneck is the GPU.
plopingo said:
the Sony, because despite the smaller battery it has better battery life
If you set the Shield power mode to just 2 CPU cores, you'll get much more battery life.
Niii4 said:
The Z3's bottleneck is the GPU.
If you set the Shield power mode to just 2 CPU cores, you'll get much more battery life.
And the Shield's interface remains smooth?
No problem for watching videos and stuff, or does it start lagging?
It still runs smoothly. You may sense a difference with highly demanding games, or when you transcode videos.
Niii4 said:
It still runs smoothly. You may sense a difference with highly demanding games, or when you transcode videos.
Thank you for all this information.
You cannot disable cores on the K1. You can verify this with CPU-Z. Also, the Z3 Compact tablet has a terrible processor combo by today's standards and doesn't come anywhere near the K1. It also does not have NVIDIA-specific games, console mode, or GameStream (although this can be done with a third-party app called Moonlight streaming). The speakers are also not as good, and it will probably never get another update, ever. There is no reason to buy the Sony tablet over the Shield.
seh6183 said:
You cannot disable cores on the K1. You can verify this with CPU-Z. Also, the Z3 Compact tablet has a terrible processor combo by today's standards and doesn't come anywhere near the K1. It also does not have NVIDIA-specific games, console mode, or GameStream (although this can be done with a third-party app called Moonlight streaming). The speakers are also not as good, and it will probably never get another update, ever. There is no reason to buy the Sony tablet over the Shield.
Why do you have the setting to manage cores if it's not working?
seh6183 said:
You cannot disable cores on the K1. You can verify this with CPU-Z.
Then CPU-Z is apparently showing wrong data, I'd say.
Plenty of apps have compatibility issues with the myriad of Android devices.
Niii4 said:
Then CPU-Z is apparently showing wrong data, I'd say.
Plenty of apps have compatibility issues with the myriad of Android devices.
Unfortunately, CPU-Z is accurate, and you cannot disable any cores on the Shield Tablet, despite what the UI might be telling you.
Benchmarks tell a different story.
Niii4 said:
Benchmarks tell a different story.
I'll run AnTuTu for you right now, 2 and then 4 cores, and post the results. I assure you the cores will all be active no matter what; they are always firing no matter what you do.
The benchmarks do indicate that there is a performance hit, surprisingly. I ran a total of 8 benchmarks to be sure. I am not sure exactly what is happening, but the cores are not being turned off, that's for sure. You can verify this with any CPU monitoring app; I tried 5 different ones and they all said the same thing. I think maybe when you force the CPU into 2-core mode, it may underclock two of the cores or something, but not disable them. It's helpful to know that the option at least does something and is not just there and fake lol. However, like I mentioned in another post of mine, setting the performance mode to custom to enable only two cores will perform worse for battery life than just using optimized. Forcing two cores for individual games might be helpful, but probably not by much. As you can see from the following benchmarks, the scores are not halved by disabling half the cores; they are only slightly reduced. Not exactly sure what's going on, but I may look into it further in the near future.
Benchmark 1 - two cores enabled:
CPU: 18378
Benchmark 2 - two cores enabled:
CPU: 18374
Benchmark 3 - four cores enabled:
CPU: 22660
Benchmark 4 - four cores enabled:
CPU: 22409
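For anyone who wants to double-check what the monitoring apps report, the per-core state can also be read directly from sysfs (for example over adb shell or a terminal app). A minimal sketch, assuming the standard Linux hotplug/cpufreq layout; exact paths vary by kernel:

```python
# Minimal sketch: list which cores the kernel reports as online and their
# current frequency, read straight from sysfs.
import glob, os

for cpu in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*")):
    name = os.path.basename(cpu)
    online_path = os.path.join(cpu, "online")   # cpu0 often has no 'online' node
    online = "1"
    if os.path.exists(online_path):
        with open(online_path) as f:
            online = f.read().strip()
    freq_path = os.path.join(cpu, "cpufreq", "scaling_cur_freq")
    freq = "n/a"
    if os.path.exists(freq_path):
        with open(freq_path) as f:
            freq = f.read().strip()
    print(f"{name}: online={online}, cur_freq={freq} kHz")
```

If the "2-core" mode merely reclocks cores instead of offlining them, all four should still report online=1 here, which would match the benchmark results above.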
plopingo said:
But if I were not in this situation (i.e. more video/internet/app usage), I would definitely go for the Sony, because despite the smaller battery it has better battery life, and not by a little: it's something like 10 hrs more under certain conditions, so a very well-optimized ROM from Sony.
Under what kind of conditions would that be? Engadget's review says 13.5 to 14 hours for their continuous video loop test, and they estimate 10 hours in "real world" use conditions.
https://www.engadget.com/2015/01/11/sony-xperia-z3-tablet-compact-review/
I get around 7 hours of screen-on time with my Shield, running stock and not messing with the cores.
The Sony tablet's battery life is impressive, for sure. But to say a 10-hour difference is probably an overstatement.
This thread makes me wonder (yet again): where are the tablets? I know the tablet market is pretty stagnant, but it's still a bit ridiculous that we are sitting here comparing a tablet from 2015 against one from 2014. NVIDIA shelved its plan to update the Shield Tablet this year, and there's not much else on the market. On the higher end, Samsung neglected to update its Galaxy Tab S2 this year.
seh6183 said:
The benchmarks do indicate that there is a performance hit, surprisingly. I ran a total of 8 benchmarks to be sure. I am not sure exactly what is happening, but the cores are not being turned off, that's for sure. You can verify this with any CPU monitoring app; I tried 5 different ones and they all said the same thing. I think maybe when you force the CPU into 2-core mode, it may underclock two of the cores or something, but not disable them. It's helpful to know that the option at least does something and is not just there and fake lol. However, like I mentioned in another post of mine, setting the performance mode to custom to enable only two cores will perform worse for battery life than just using optimized. Forcing two cores for individual games might be helpful, but probably not by much. As you can see from the following benchmarks, the scores are not halved by disabling half the cores; they are only slightly reduced. Not exactly sure what's going on, but I may look into it further in the near future.
Benchmark 1 - two cores enabled:
CPU: 18378
Benchmark 2 - two cores enabled:
CPU: 18374
Benchmark 3 - four cores enabled:
CPU: 22660
Benchmark 4 - four cores enabled:
CPU: 22409
Maybe they're not deactivated, but they consume less battery; that's what I need.
plopingo said:
Maybe they're not deactivated, but they consume less battery; that's what I need.
So under what circumstances will you switch to 2-core mode to save battery? The tablet has great battery life until it fires up a game, and there's not much you can do about gaming battery life.
seh6183 said:
So under what circumstances will you switch to 2-core mode to save battery? The tablet has great battery life until it fires up a game, and there's not much you can do about gaming battery life.
More battery for movies/Netflix.
plopingo said:
More battery for movies/Netflix.
Switching over to 2 cores during a movie will likely cost you battery, because the minimum clock speed in the custom mode is much higher than the minimum clock speed in the optimized mode.