Small Mobile Stick Computer - Android Stick & Console Computers General

I'm looking for an Android Stick for mobile, battery driven operation.
Can this be done?
Graphics hardware is not strictly needed - so could be disabled to save energy.

You can use any TV-Stick and power it with a portable power bank like this one:
http://www.amazon.com/6000mAh-Portable-Rapid-Recharge-External-adapters/dp/B00EF1OGOG/ref=sr_1_4?ie=UTF8&qid=1387725779&sr=8-4&keywords=powerbank+5v
There are also ones that run on AA batteries if you prefer that. The only thing you have to keep in mind is that the output current has to be appropriate. For single-core sticks, 1A will do, but for a dual or quad core you should look for a power bank that outputs at least 2A.
But I don't really get why someone would need a battery-powered TV stick. You need power for your TV anyway, so why can't you use that for the stick as well?
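To turn the current-sizing advice above into a runtime estimate: power banks are rated in mAh at the cell voltage (3.7 V nominal), not at the 5 V output, so you can't just divide the rating by the stick's 5 V draw. A rough sketch; the 85% conversion efficiency and the 400 mA draw are assumptions, not measured values:

```python
def stick_runtime_hours(bank_mah, draw_ma, efficiency=0.85):
    """Rough runtime of a USB stick from a power bank.

    bank_mah:   rated capacity at the cell voltage (3.7 V nominal)
    draw_ma:    average current the stick pulls at 5 V
    efficiency: assumed boost-converter loss
    """
    # Convert cell capacity (3.7 V) to usable charge at the 5 V output.
    usable_mah_at_5v = bank_mah * 3.7 / 5.0 * efficiency
    return usable_mah_at_5v / draw_ma

# A 6000 mAh bank feeding a single-core stick drawing ~400 mA:
print(round(stick_runtime_hours(6000, 400), 1))  # about 9.4 hours
```

Note that naively computing 6000/400 = 15 h would overestimate the runtime by roughly a third.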

I don't need a screen for this. I want to disable the graphics chip (or even rip it out). How much power do you think the typical GPU consumes, compared to the CPU (or everything else)?
If such an Android stick is getting hot, is it rather the CPU or the GPU?

These devices (and I think all other devices that run Android) are all based on SoCs (systems on a chip). This means that the CPU and GPU are on the same chip, so sorry, no ripping out the GPU.
Power consumption depends on the task being performed by the chip, and so does the heat generation.

There should be a difference in power consumption between a) not performing any graphics related tasks and b) putting the GPU, the video memory and the HDMI port all into suspend mode - similar to sending a smartphone into suspend mode, while keeping the CPU running.
How much power saving b) will amount to, I don't know. Will it be closer to 5%? Or closer to 30%?
The portable power bank you linked to above is a little big for my use case. I would prefer to use a smartphone battery to power, say, a single core device. Do you think this would be impossible?

Android HDMI sticks are USB powered, so they need 5V. If you have some knowledge of electronics you can use any battery you want. You just have to build a circuit that converts the voltage of the battery (smartphone batteries have 3.7V) to 5V.
The power consumption is very hard to estimate. I would say a single core with an idling GPU will draw about 0.3 to 0.5 A depending on the CPU load.
But if you want a small Android device powered by a smartphone battery, why don't you use a smartphone? What's the advantage of an HDMI stick?
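One thing to keep in mind with the 3.7 V-to-5 V conversion above: by conservation of power, the cell has to supply more current than the stick draws at 5 V. A sketch, assuming ~85% converter efficiency (an assumed figure):

```python
def battery_current_draw(out_current_a, v_bat=3.7, v_out=5.0, efficiency=0.85):
    """Current drawn from a 3.7 V cell by a boost converter supplying 5 V.

    Power in = power out / efficiency, so I_bat = I_out * V_out / (V_bat * eff).
    """
    return out_current_a * v_out / (v_bat * efficiency)

# A stick pulling 0.4 A at 5 V draws roughly this much from the cell (in A):
print(round(battery_current_draw(0.4), 2))  # about 0.64 A
```

So a typical ~1500 mAh smartphone cell would be drained in a couple of hours by a single-core stick.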

DaPhinc said:
What's the advantage of an HDMI stick?
I do not need a screen. So this is an advantage. I hope to find a battery that sits flush with the board; I'd like to carry the device around in a pocket. In addition to suspending the graphics hardware, I will try to throttle down the CPU in order to make it consume less energy.
Do you think I will be able to get the power consumption down to the levels of an average smartphone? Or are there fundamental differences that make this impossible? In that case I would indeed need to look for a smartphone-SoC-based solution.

The SoCs in these sticks are also used in cheap smartphones, so the power consumption will be about the same. If you want to underclock your CPU you should look for a device that has custom kernels, because most stock kernels do not allow over- or underclocking.
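As a rough illustration of why underclocking helps: dynamic CPU power scales approximately with frequency times voltage squared (P ≈ C·V²·f), and since lower frequencies allow lower voltages, dropping both pays off twice. A sketch with made-up DVFS operating points:

```python
def dynamic_power_ratio(f_new, f_old, v_new, v_old):
    """Relative dynamic CPU power under DVFS, using P ~ C * V^2 * f."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Underclocking from 1.2 GHz @ 1.2 V down to 600 MHz @ 1.0 V
# (invented voltage/frequency pairs, real tables are chip-specific):
print(round(dynamic_power_ratio(600, 1200, 1.0, 1.2), 2))  # about 0.35
```

So halving the clock with a modest voltage drop can cut dynamic power to roughly a third, though static leakage and the rest of the SoC don't scale the same way.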

DaPhinc said:
look for a device that has custom kernels
May I ask for hardware suggestions, please? At the moment I have no clear idea what the best hardware/vendor would be relative to my requirements.
Availability of custom kernels would be good. Availability of usable kernel sources would be even better. A big plus would be availability of 4.3.x - due to improvements in the Bluetooth stack, among other things. Any chance to run pure AOSP on any of these stick devices?
Thx.

I am quite sure that there is no device that meets all your requirements.
I know that there are custom kernels and even a (bad) CyanogenMod build for the MK808, but it is a dual core, not the single core you wanted.
And Android 4.3 is not available for any of these sticks. Maybe it will come as an update for the current generation of quad core sticks, or maybe not until the next generation (mid 2014) arrives.

Related

[Q] Battery Life - dual core vs single core

Hi,
Does anyone know if there will be any perceivable difference in battery life between the latest batch of smartphones based on single-core 45nm SoCs and the new 45nm dual-core versions?
All manufacturers state improved performance AND lower power consumption with the dual cores, but I am wondering how this will affect my day-to-day battery life. I actually read somewhere that e.g. Tegra 2 phones may drain the battery quicker instead of saving power.
I was planning to buy the Desire Z or Galaxy S, but I may also wait for their upgrades if this will mean better battery life. I am not too eager about the performance improvements as I am not a gamer and will likely not feel the benefits (I mostly browse 3G or WiFi/use GPS/occasionally may play a movie + a couple of calls a day).
I'm expecting the battery life to be a bit low, seeing that they have a "lot" of things in them... I reckon it may be the same as current smartphones?
Apparently battery times will be better, let's see...
What I have learned is that dual cores will save energy (well, at least Tegra 2). That SoC has something like 7 different cores, if I remember right, each and every one of them made for a specific purpose (audio playback, video encode/decode, GPU... and other things I don't remember), so the SoC itself will use only what YOU need at that very moment. Let's say you're watching a video: only the video-decode core will work, whereas the others should be in standby or something.
(IMO battery life should increase because of this, not sure how much though.)
And excuse me if my knowledge isn't exactly correct on this matter.
There are two trains of thought here:
1) as the die gets smaller (65nm[1st gen snaps] to 45nm[2nd gen snaps] to 40nm [tegra2] to 28nm[3rd gen snaps], etc), the processor tends to draw less energy. So yes, most dual cores (having a smaller die) SHOULD be more energy efficient.
2) With dual cores, executions get carried out a lot faster than they were on single cores. As evidenced by the benchmarks done on dual cores (2000 and up), it'll take less time to start/process a program, and the UI, in theory, should be a lot faster and smoother (that is, if manufacturers don't start ****ing around with a UI and make it sluggish). Being that it's faster and a lot more versatile than a single core, people will tend to YouTube a lot, play games a lot, and generally use a lot more multimedia applications. Being that we don't have a self-sustaining energy source that isn't radioactive, the battery will drain from excessive use.
So at the end of the day, it depends on your use. Yes, dual cores are more energy efficient than single cores, but in the hands of a 15-year-old teenage girl with more life than Paris Hilton, they'll probably get the same battery life as any other phone out there.
Hi guys,
thanks for the comments and predictions. My prediction would be that there would be hardly any noticeable change between the single & dual core 45 nm chips (given same usage of course), similar to how there was almost no change when switching from 65nm to 45nm chips - which are more energy efficient as well.
The better energy efficiency seems to be quickly soaked up by more power-hungry hardware and software, so it all boils down to whether one needs the better performance, as the battery life will likely stay the same.
But this of course is only a prediction based on past observation. I hope I am wrong, and I am still considering whether it is worth waiting for the dual-core devices to hit the market. In the meantime, if anyone has had a chance to play with such a device (tablet?), any additional info will be welcome.

[Q] CPU usage monitor app (with dual core support): can we trust it?

So I bought the application
CPU usage and Frequency Monitor (dual core support)
It's on the market, and you can find its thread here at xda:
http://forum.xda-developers.com/showthread.php?t=1160319
The dev says there is a limitation for the Sensation, but I bought it anyway:
****** NOTE ******
HTC Sensation Owners
There is a device limitation with reading the CPU Frequency. I am looking into a workaround for this problem.
****** NOTE ******
I asked the question in the dev thread, hoping for a fast answer, but want to ask here too:
Can we trust the CPU usage, individual for each core, that is displayed by the app?
This app displays, at the top of the screen in the notification bar, which core is used, with one column for each core.
So with our asynchronous dual core, one of the columns is often empty while the other can be half full or full; in normal use this seems to be OK (even if the frequency used by each core is not read, as the dev says, but is the usage correctly read?)
But I noticed the second column, so the second core, very often starts filling too!
I used to think Android 2.3 was not supposed to handle dual cores, so that almost all the time only one is used?
That was, for me and from what I've read, the big reason why we get really bad scores on every benchmark.
If CPU monitor is right, I can see the second core easily waking up when the first one is already full: sometimes just a little, sometimes 50%, sometimes 100%,
INCLUDING DURING BENCHMARKS, where CPU monitor displays both cores running at 100%!
So what is true here? Is CPU monitor fooled by Android 2.3, showing the second core waking up when it's not?
Or does our second core indeed wake up easily, including in benchmarks, meaning our pitiful scores will never be greatly improved since both cores already deliver their full power?
I need more info on these asynchronous dual cores, the way they work and are supported by Android 2.3, what HTC did to implement this, etc.
Not a single answer from a dev or someone with more knowledge than me concerning dual-core architecture and the way Android can handle it?
I had an answer from the dev of CPU usage monitor:
The CPU usage information is abstracted in both cases at the App level. Apps just need to read the standard CPU usage information at the OS level to gather its data. Control of when and how the dual core magic works is not a worry at the app level since the OS handles it. Hope this helps.
So...
=> In normal use, the app shows only one core running, very low if no app is running; sometimes, when the first one is full, the second one starts working a little (10%, 25%, etc.) for apps requiring a bit more power. Everything seems very logical for an asynchronous dual-core CPU (wasn't I told that Android 2.3 doesn't really manage async dual cores? When we overclock, don't we overclock only one core?)
But when running a benchmark or playing heavy OpenGL games, the app displays both cores running at 100%, the CPU at its max power on both cores!
So if that's right, even with a better ROM once S-OFF, or better drivers, our bench scores will always be very low.
I need this confirmed or denied: if we already have both cores running at 100% during benchmarks or OpenGL games, we can't expect much more from our Sensation :-(
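For what it's worth, the "standard CPU usage information at the OS level" the dev mentioned is exposed on Linux/Android through /proc/stat: per-core usage is derived by differencing two snapshots of the cpuN jiffy counters. A minimal sketch of that calculation (the sample numbers below are invented):

```python
def core_usage(prev, curr):
    """Per-core utilisation between two /proc/stat 'cpuN' samples.

    Each sample is the list of jiffy counters after the 'cpuN' label:
    user nice system idle iowait irq softirq ...
    """
    deltas = [c - p for p, c in zip(prev, curr)]
    total = sum(deltas)
    idle = deltas[3] + deltas[4]  # idle + iowait counted as not busy
    return (total - idle) / total if total else 0.0

# Two made-up samples for one core, 100 jiffies apart, 75 of them busy:
prev = [100, 0, 50, 800, 50, 0, 0]
curr = [160, 0, 65, 820, 55, 0, 0]
print(round(core_usage(prev, curr), 2))  # 0.75
```

Since the kernel maintains these counters itself, an app reading them can show a second core waking up even if the higher layers of Android 2.3 don't schedule across cores very cleverly.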
I'd be interested to understand this also.
It appears that Gingerbread doesn't support dual cores properly.
Have a look at: http://groups.google.com/group/android-platform/browse_thread/thread/b754408b9af17e55?pli=1
I guess we need an upgraded Kernel and associated libraries. I must admit I was surprised when I started looking.

Surface Pro 2 CPU Limited

Hi all,
I've had my Surface Pro 2 256/8 since release and all has been fine until (possibly) the firmware update.
Turbo Boost was working fine and the CPU was going up to its maximum of 2.6GHz, but it now seems capped at 2.23GHz.
I've checked in PC Settings, Task Manager and CPU-Z; the maximum the CPU ever reaches is 2.23GHz, as indicated.
Anyone else experienced this? I have tried all power profiles (Performance, Balanced, Power Saver) and there's no difference.
Thanks!
EDIT: Having used HWiNFO64 on the High Performance profile, I can see that the core is limited to x23, which produces the 2.3GHz clock speed. It occasionally indicates x26 (2.6GHz) for a millisecond before ThermMon shows it being throttled back to x23. So it appears it's not reaching maximum speed in order to keep the heat lower; why this has happened is still inconclusive.
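The arithmetic behind those multiplier readings, assuming the usual 100 MHz base clock on these Intel chips:

```python
def core_clock_mhz(multiplier, bclk_mhz=100.0):
    """Effective core clock from the multiplier HWiNFO reports.

    Assumes the standard 100 MHz base clock (bclk) of recent Intel CPUs.
    """
    return multiplier * bclk_mhz

print(core_clock_mhz(23))  # 2300.0 MHz, the capped speed observed
print(core_clock_mhz(26))  # 2600.0 MHz, the rated turbo speed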
Have you tried to change the CPU maximum utilization in power settings?
Sent from CM10.1 U9200
>I've had my Surface Pro 2 256/8 since release and all has been fine until (possibly) the firmware update.
You've answered your own question. One reason to cap speed is for battery life. That's what the latest firmware update provides. You've found the downside.
http://www.theverge.com/2013/11/4/5064026/microsoft-surface-pro-2-battery-life-firmware-update
So they cap turbo mode to increase battery life. Given that the SP2 is advertised as having an Intel i5 CPU with no mention of speed on the MS site, I think they'll get away with it...
e.mote said:
>I've had my Surface Pro 2 256/8 since release and all has been fine until (possibly) the firmware update.
You've answered your own question. One reason to cap speed is for battery life. That's what the latest firmware update provides. You've found the downside.
http://www.theverge.com/2013/11/4/5064026/microsoft-surface-pro-2-battery-life-firmware-update
Well, in that case, the CPU shouldn't be capped when NOT running on battery power, should it?
There is also cooling to consider, regardless of whether it is on battery or mains. Although, was heat particularly problematic pre-update?
>Well, in that case, the CPU shouldn't be capped when NOT running on battery power, should it?
As another noted, SP2 slicks never said CPU would be running at max spec.
PCs are normally more configurable. But if you enter SP2's UEFI setup, the only thing you can change is Secure Boot. MS is emulating Apple in more ways than one.
The cynics among us (guilty as charged) would say that MS handled this just right: Release the device with uncapped speed to get the best possible performance for reviews. Then afterward, cap the speed to claim "improved battery life" as well. If MS had capped the speed to start with, SP2 would be no faster than SP, and would get slammed hard. SP2 is already slammed as having minimal improvements over SP.
Reviewers aren't going to take the trouble to revise their reviews, and even if they did, not many people will re-read them. So, with this method, you can indeed have your cake and eat it too. Think of it as a more "legal" form of juicing performance tests without the explicit cheating that Samsung and others resorted to.
e.mote said:
> SP2 is already slammed as having minimal improvements over SP.
Anyone that knows the difference between the CPUs will know that there's not a huge speed increase though. The only thing they should have done was stick a decent SSD in there; I've got an unused PX-256M5M sat on my desk that reads/writes at near full-sized 2.5" SSD speeds, whereas the mSSDs have always had half-decent read and poor write speeds.
Other than that, what else is there to improve on? The camera perhaps, as per the Surface 2; battery life, check. The only other thing the Pro/Pro 2 needs imho is more accessories. Yes, they are coming, but they should have been ready at launch imho; I'm crying out for a dock...

x86 only ROM Building

I was wondering... Is it true that an x64_x86 (arm64) build uses more RAM than just x86 (arm)? Considering this phone can't take any advantage of a 64-bit OS (cmiiw), I wonder whether it's possible to build the x86 ROM version of any available custom ROM out there?
I know only a little of the theory of why it wouldn't be an easy task / might not be possible: perhaps its kernel is only available as an x64_x86 build, or it needs a lot of code rework (since C++ x64 has additional instructions/data types not available in x86), or its blobs are only available as x64_x86... So, does anyone have more convincing arguments about this task?
Hi x3r0.13urn
Just to clarify things: our Mi4c/s is built with an arm64 SoC (system on a chip), and x86 means a different processor architecture (like desktop computer CPUs), so there are 4 (common) CPU architectures:
"Normal" CPUs: x86 (32-bit) and x64 (64-bit)
"Mobile" CPUs: arm (32-bit) and arm64 (64-bit)
So your question is whether there would be any benefit in building an arm (32-bit) kernel/ROM/firmware for our arm64 Mi4c/s.
As the CPU itself is made for arm64, I think there are lots of optimizations (things I, the noob, don't understand) in the kernel/firmware/ROM, so it would be useless to go back to the old arm 32-bit environment (devs, correct me if I'm wrong).
An idea of mine: why not build an arm64 ROM with small textures and low resolution? The screen would look **** but I think there would be big battery savings and more performance left over for other things.
Greetings
Wertus
I don't see why you need better performance, Mi4c is an already powerful device.
Regarding OP, there is no 32bit kernel developed for any msm8992 phone, as Qualcomm made it a 64bit chip and optimized the kernel for 64bit usage. So no, we can't compile a 32bit kernel for msm8992, and even if we did, it would be bad.
Regarding the smaller textures, a lot of Android is now vector graphics, and you can't make them smaller, and having smaller textures than we need would be useless as they would have to be upscaled to display them. The phone might even consume more power to upscale them.
I see... So, if that's true, the problem is there's no x86 kernel source, which afflicts other aspects. It's quite a shame... since even up till today, 32-bit-only OSes (Windows, *nix) are still available.
I agree, yes, the SD 808 packs enough punch for today's needs. Regarding textures, I take it we're talking about resolution downscaling (from 1080p to 720p?); it would save a few MiB of RAM and also lower CPU usage, though in my personal experience it won't prolong battery life very much.
x3r0.13urn said:
I see... So, if that's true, the problem is there's no x86 kernel source, which afflicts other aspects. It's quite a shame... since even up till today, 32-bit-only OSes (Windows, *nix) are still available.
I agree, yes, the SD 808 packs enough punch for today's needs. Regarding textures, I take it we're talking about resolution downscaling (from 1080p to 720p?); it would save a few MiB of RAM and also lower CPU usage, though in my personal experience it won't prolong battery life very much.
Windows and Linux distros must run on a lot of machines and be compatible with all of them.
The Linux kernel found in an Android device is much different, as it targets one specific chipset, which in this case is 64-bit. Also, 64-bit is overall much better than 32-bit, the only drawback being RAM usage (but the differences are really small!).
No, I was talking with the other guy who mentioned using lower textures.
About lowering the resolution: it can actually get worse. In games where the FPS isn't locked, the CPU can become a bottleneck, because the GPU has less work to do and therefore finishes faster, then requests more data from the CPU, and so on, until the CPU can't keep up.
Overall you can't really improve this device much on the speed side of things, and ROM developers try to improve battery life already. The problem is that users use their devices in various ways and you can't make everyone happy.
So yeah, we do our best to optimise stuff and if it hasn't been done then it can't be done or it doesn't improve anything.
Cozzmy13 said:
No, I was talking with the other guy who mentioned using lower textures.
Overall you can't really improve this device much on the speed side of things, and ROM developers try to improve battery life already. The problem is that users use their devices in various ways and you can't make everyone happy.
So yeah, we do our best to optimise stuff and if it hasn't been done then it can't be done or it doesn't improve anything.
I meant to reduce the resolution (sorry, explained it wrong with "textures").
Well, I guess a lot of people want to use the hardware they have (why 720p if you can use 1080p "to show your friends"?)
I just wonder if one could compile, let's say, an HTC Desire HD ROM, which runs fluently at 700MHz on a single-core CPU (and an old one at that). Then you could adjust the governor settings of the Mi4c (with a better optimized CPU, GPU, bigger battery...) and get incredible battery life.
I understand that most users don't want this, because you'd have an old system, missing many functions, and you couldn't use your new device to its full extent because of the software.
But using such a ROM when I don't have a chance to charge my mobile for 3-5 days (with MultiROM) would be great (if that works how I imagine it).
Edit: And yes, the Mi4c is very nice performance-wise, just the battery could be better.
With regards to the battery: unfortunately for the short term, the solution is to buy an external battery (i.e. power bank), or do "extreme" things like switching on aeroplane mode, dim screen, powersave governor, etc.
In the medium term, there are things developers can do. However this work is very detailed and requires lots of testing. There's plenty of tweaks one can make with the CPU core load balancing, voltages, frequencies etc. It's difficult because there are so many knobs to twiddle, and so many ways to make things worse.
terence.tan said:
With regards to the battery: unfortunately for the short term, the solution is to buy an external battery (i.e. power bank), or do "extreme" things like switching on aeroplane mode, dim screen, powersave governor, etc.
Well, with all connections shut down (except the mobile network, without mobile data) it holds about 2 days (I'm still taking pictures and phoning), which is not bad. I also use my own governor values (they gave me the most).
But where do you charge the power bank (they empty fast at festivals..)? xD
Thanks for your explanation/help, but I think I'll just end up attaching a stronger battery to my One X (which holds around 1.5-2 days with the same usage).
Do you know if I need to adjust any kernel parameters for a new battery (I don't think so)?
wertus33333 said:
Well, with all connections shut down (except the mobile network, without mobile data) it holds about 2 days (I'm still taking pictures and phoning), which is not bad. I also use my own governor values (they gave me the most).
But where do you charge the power bank (they empty fast at festivals..)? xD
Thanks for your explanation/help, but I think I'll just end up attaching a stronger battery to my One X (which holds around 1.5-2 days with the same usage).
Yeah. I'm not sure that there's a lot more you can do, without a developer actually running a profile on your phone to see what's drawing power.
Along the same lines, performance is important. We have a concept called "race to sleep", which means that the phone runs faster for a shorter time, then can go to idle. This saves power.
One strategy is to use hardware acceleration where possible, for example using crypto hardware instead of doing crypto in software. This is one example of a medium-term project that requires lots of testing, because if you get crypto wrong, you can lose your data...
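The "race to sleep" idea above can be illustrated with a toy energy calculation. All the wattages and durations here are invented; the point is only that finishing fast and then idling can beat running slow the whole time:

```python
def task_energy_j(active_power_w, active_s, idle_power_w, window_s):
    """Energy over a fixed window: run the task, then idle for the rest."""
    return active_power_w * active_s + idle_power_w * (window_s - active_s)

# Hypothetical numbers: the same task at a high clock (2 W for 1 s)
# vs a low clock (1.2 W for 2 s), idling at 0.1 W for the rest of a 4 s window.
fast = task_energy_j(2.0, 1.0, 0.1, 4.0)  # 2.0 + 0.3 = 2.3 J
slow = task_energy_j(1.2, 2.0, 0.1, 4.0)  # 2.4 + 0.2 = 2.6 J
print(fast < slow)  # True: racing to sleep wins with these numbers
```

Whether it wins in practice depends on how steeply power rises with clock speed and how deeply the chip can sleep, which is exactly why this needs profiling rather than guesswork.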
wertus33333 said:
Do you know if I need to adjust any kernel parameters for a new battery (I don't think so)?
I'd just try it and see if it auto-detects.
Re: aeroplane mode and power saving. Here's the quote I was looking for:
During its research for Project Volta, Google took a Nexus 5 and put it in airplane mode and measured how long it took to die with the screen off. In normal use, the device struggles to last a full day, but while idling like this, it lasted a full month. The takeaway was that if you can just get the phone to stop doing stuff, your battery life will greatly increase. After this research, it's no surprise to see Google focusing on deeper sleep modes.
Source: https://arstechnica.com/gadgets/201...permission-controls-fingerprint-api-and-more/
This is what I mean by, with a persistent developer who looks for all the details, you can get results like above...
terence.tan said:
Re: aeroplane mode and power saving. Here's the quote I was looking for:
Source: https://arstechnica.com/gadgets/201...permission-controls-fingerprint-api-and-more/
This is what I mean by, with a persistent developer who looks for all the details, you can get results like above...
My OP2 holds around 3 weeks if I only use it as an alarm in the morning (with some minor tweaks, without GApps).
What really made me think is that the OP2, with its 5.5" screen and a 3300mAh battery, gets around 10h SoT (lowest brightness, just idling), while the Mi4s, with its 3260mAh and 5" screen, gets only about 4h SoT (lowest brightness, idling).
It's not that hard to get results like in your link if you adjust the governor to keep CPU load as low as possible. I also got 2 weeks with an SGS Plus just idling around, but when I use it to browse the web it's empty in 30 minutes xD.
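For comparison's sake, those SoT figures translate into an implied average drain; the 3.8 V nominal cell voltage used here is an assumption:

```python
def avg_drain_w(capacity_mah, sot_h, v_nom=3.8):
    """Average power draw implied by emptying a battery over the given screen-on time."""
    return capacity_mah / 1000 * v_nom / sot_h

# Figures from the post above (capacity in mAh, screen-on time in hours):
print(round(avg_drain_w(3300, 10), 2))  # OP2: about 1.25 W
print(round(avg_drain_w(3260, 4), 2))   # Mi4s: about 3.1 W
```

So the Mi4s would be burning roughly 2.5x the power while idling at minimum brightness, which does suggest something beyond the panel itself (governor, wakelocks, SoC idle states) is to blame.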

14nm/1080p vs 28nm/720p - Which Will Use Less Power?

Hi,
Need some advice here. Two phones for sale, both the same brand/model but with two separate specs, and I need the one that will draw the least battery, as I use the phone's screen for 8 straight hours a day (satnav), so the less battery draw the better. I do not care about anything else!
1) Snapdragon 625 - 14nm but a 5.5" 1080p screen
2) Snapdragon 435 - 28nm but a 5.0" 720p screen
So the first phone has the newer 14nm CPU but a bigger screen area (5.5") and Full HD.
However, the second phone has a smaller screen area at 5.0" and runs a lower resolution at 720p, but has the older 28nm CPU, which draws more power.
So which setup, in theory, would use less power?
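You can at least put rough numbers on the two screens: backlight power scales approximately with panel area, and GPU fill work with pixel count. A quick sketch (assuming both panels are 16:9):

```python
import math

def screen_stats(diag_in, w_px, h_px, aspect=(16, 9)):
    """Panel area (square inches) and pixel count from the diagonal size."""
    aw, ah = aspect
    diag_units = math.hypot(aw, ah)
    width = diag_in * aw / diag_units
    height = diag_in * ah / diag_units
    return width * height, w_px * h_px

area_55, px_1080 = screen_stats(5.5, 1920, 1080)
area_50, px_720 = screen_stats(5.0, 1280, 720)
print(round(area_55 / area_50, 2))   # 1.21x the backlit area
print(round(px_1080 / px_720, 2))    # 2.25x the pixels to fill
```

So the bigger phone has ~21% more area to light and 2.25x the pixels to render; whether the 14nm SoC's savings outweigh that is the real question, and screen backlight usually dominates at satnav brightness levels.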
NokiaBricks said:
Hi,
Need some advice here. Two phones for sale, both the same brand/model but with two separate specs, and I need the one that will draw the least battery, as I use the phone's screen for 8 straight hours a day (satnav), so the less battery draw the better. I do not care about anything else!
1) Snapdragon 625 - 14nm but a 5.5" 1080p screen
2) Snapdragon 435 - 28nm but a 5.0" 720p screen
So the first phone has the newer 14nm CPU but a bigger screen area (5.5") and Full HD.
However, the second phone has a smaller screen area at 5.0" and runs a lower resolution at 720p, but has the older 28nm CPU, which draws more power.
So which setup, in theory, would use less power?
see
What's your next smartphone / What should I buy by poseidon5213
and
**DEVICE SUGGESTION THREAD** -- Not sure what device to buy? Ask here! by KidCarter93
Sent from my XT1060 using XDA Labs
Hi, my question was a specific one regarding power draw; I'm not sure either of those threads is ideal for the question at hand.
NokiaBricks said:
Hi, my question was a specific one regarding power draw; I'm not sure either of those threads is ideal for the question at hand.
It comes down to usage, I suppose. If the task at hand requires a lot of processing power but is less taxing on the screen/GPU, then the SD 625 is better. If it's the other way around, choose the device with the SD 435.
What about battery capacity, are those the same as well?
Freewander10 said:
It comes down to usage, I suppose. If the task at hand requires a lot of processing power but is less taxing on the screen/GPU, then the SD 625 is better. If it's the other way around, choose the device with the SD 435.
What about battery capacity, are those the same as well?
Yup, both phones have the same battery capacity; they are identical in most respects, except one is the Note version and the other is not.
The usage is 100% satnav (Waze) no other apps will be running. Any ideas on the usage of Waze?
NokiaBricks said:
Yup, both phones have the same battery capacity; they are identical in most respects, except one is the Note version and the other is not.
The usage is 100% satnav (Waze) no other apps will be running. Any ideas on the usage of Waze?
I'd go for the newer/more efficient 625, as things such as GPS, mobile data and WiFi are embedded in the SoC. Screen brightness and density can easily be adjusted to reduce power consumption, and you can also underclock the CPU (or turn on some sort of power-saver mode) to squeeze as much juice out of the battery as possible. The difference between a 720p and 1080p device isn't that much anyway, but the improvement of the 625 over the 435 is far greater.
