GPU Overclock performance - ONE Q&A, Help & Troubleshooting

Hello 1+1 owners,
I just wanted to ask whether GPU overclocking on your OnePlus devices improves graphics performance. I've seen some kernels that support GPU OC, so I'm asking those who have already tried it.
I'm asking because I've added GPU OC to my custom kernel for the Lenovo Vibe Z2 Pro (which has the same Snapdragon 801), and although the GPU itself reaches the 657 MHz frequency step, I haven't noticed any improvement whatsoever, either in GPU benchmarks (3DMark, GFXBench) or in Android games.

Electry said:
Hello 1+1 owners,
I just wanted to ask whether GPU overclocking on your OnePlus devices improves graphics performance. I've seen some kernels that support GPU OC, so I'm asking those who have already tried it.
I'm asking because I've added GPU OC to my custom kernel for the Lenovo Vibe Z2 Pro (which has the same Snapdragon 801), and although the GPU itself reaches the 657 MHz frequency step, I haven't noticed any improvement whatsoever, either in GPU benchmarks (3DMark, GFXBench) or in Android games.
That's because GPU overclocking isn't possible on this device, or any non A-family device (our chipset is B-family, and almost all Qualcomm devices released in the last 2 years are B-family). The GPU clock table is stored in TrustZone, so we can't touch it.
Sent from my A0001 using XDA Free mobile app
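For anyone who wants to see what their own kernel exposes before believing a kernel's "GPU OC" claim, here is a minimal sketch to run in a root shell on the device. The sysfs paths are typical for Qualcomm Adreno (kgsl) kernels; they are an assumption, not guaranteed on every device or ROM, so the script just prints a fallback when they are absent:

```shell
# List the GPU frequency steps the kgsl driver publishes.
# Paths assume a Qualcomm Adreno kernel; adjust for your device.
out=""
for f in /sys/class/kgsl/kgsl-3d0/gpu_available_frequencies \
         /sys/class/kgsl/kgsl-3d0/max_gpuclk; do
  [ -r "$f" ] && out="${out}${f}: $(cat "$f")
"
done
[ -n "$out" ] || out="no kgsl sysfs nodes found (not an Adreno GPU, or no root?)"
printf '%s\n' "$out"
```

If the overclocked step never shows up in `gpu_available_frequencies`, the kernel's higher clock is not actually reachable, which matches the TrustZone explanation above.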

I wonder why they did this. Are they pushing people to buy devices with their newer SoCs? That's bad news, really.
I remember modifying the GPU frequency table on my Nexus 7 (Tegra 3), which helped squeeze some extra power from the device.
Anyway, thanks @Sultanxda for the explanation.

Electry said:
I wonder why they did this. The first thing that came to my mind was that they are pushing people to buy devices with their newer SoCs. That's bad news, really.
I remember modifying the GPU frequency table on my Nexus 7 (Tegra 3), which helped squeeze some extra power from the device.
Anyway, thanks @Sultanxda for the explanation.
I have a Nexus 7 2012 and all I can say is that Tegra is nowhere near the same level as Snapdragon. The T3 is super laggy, whereas even Snapdragon chips from 3 years ago are still running smoothly today. Our GPU and all of our hardware in general should be the least of your worries; the Snapdragon 801 is super overpowered, and the GPU on this thing won't be a cause for bottlenecks while gaming for probably another 2 years. 578MHz is more than plenty right now.

Sultanxda said:
I have a Nexus 7 2012 and all I can say is that Tegra is nowhere near the same level as Snapdragon. The T3 is super laggy, whereas even Snapdragon chips from 3 years ago are still running smoothly today. Our GPU and all of our hardware in general should be the least of your worries; the Snapdragon 801 is super overpowered, and the GPU on this thing won't be a cause for bottlenecks while gaming for probably another 2 years. 578MHz is more than plenty right now.
True about the T3, probably Nvidia's biggest disappointment.
Still, the Adreno 330 is already struggling with some more demanding games at 1440p (which is my phone's resolution): GPU usage constantly hits 100%, whereas at 1080p it only hits 60-70% (measured with GameBench). I'm afraid I'll soon have to switch permanently to 1080p just to maintain playable framerates (25+).
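A quick back-of-the-envelope check shows the pixel counts alone predict that saturation. This is just arithmetic on the two resolutions mentioned above:

```shell
# 1440p pushes ~1.78x the pixels of 1080p, so a GPU at 60-70% load at
# 1080p would need roughly 107-124% at 1440p, i.e. it saturates.
ratio=$(awk 'BEGIN { printf "%.2f", (2560*1440)/(1920*1080) }')
echo "1440p/1080p pixel ratio: $ratio"
awk -v r="$ratio" 'BEGIN { printf "60%% load at 1080p -> %.0f%% demanded at 1440p\n", 60*r }'
```

So the jump from 60-70% usage at 1080p to a pegged 100% at 1440p is exactly what the raw pixel ratio predicts.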

Electry said:
True about the T3, probably Nvidia's biggest disappointment.
Still, the Adreno 330 is already struggling with some more demanding games at 1440p (which is my phone's resolution): GPU usage constantly hits 100%, whereas at 1080p it only hits 60-70% (measured with GameBench). I'm afraid I'll soon have to switch permanently to 1080p just to maintain playable framerates (25+).
You can try changing the GPU governor to performance. Open a terminal emulator app and run these two commands:
Code:
su
echo performance > /sys/class/devfreq/f*/governor
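To confirm the write took effect, and to get back to stock behaviour afterwards, something like the following should work. Note that the exact devfreq node name and the default governor name ("msm-adreno-tz" on many CAF-based kernels) are assumptions; verify them against your own kernel first:

```shell
# List every devfreq governor so you can confirm the GPU one changed.
out=""
for g in /sys/class/devfreq/*/governor; do
  [ -r "$g" ] && out="${out}${g}: $(cat "$g")
"
done
[ -n "$out" ] || out="no devfreq nodes readable here (need root / on-device shell)"
printf '%s\n' "$out"
# To revert on-device (node path and governor name are assumptions):
# echo msm-adreno-tz > /sys/class/devfreq/fdb00000.qcom,kgsl-3d0/governor
```

Be aware the setting does not survive a reboot unless an init script reapplies it.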


Tegra 3 = Beast!

Wow, just reading this and watching the video got me really excited!
Quote: "...its benchmark puts Kal-El at a higher performance bracket than even Intel's Core 2 Duo full-on-PC processors."
Enjoy: http://pocketnow.com/android/nvidia-quad-core-kal-el-in-android-devices-this-summer
I guess my next phone will be somewhere on par with my [email protected], nah, not quite, but still impressive.
It's freakin' ridiculous, isn't it? I can't imagine how powerful Wayne, Logan, or even Stark will be.
By the way, those are the architectures coming after Kal-El, as seen in the roadmap here:
http://www.anandtech.com/show/4181/...-a9s-coming-to-smartphonestablets-this-year/1
Can't wait for my Q6600 to have a little brother as well.
dreadlord369 said:
It's freakin' ridiculous, isn't it? I can't imagine how powerful Wayne, Logan, or even Stark will be.
By the way, those are the architectures coming after Kal-El, as seen in the roadmap here:
http://www.anandtech.com/show/4181/...-a9s-coming-to-smartphonestablets-this-year/1
Wow, sick! I had a feeling the technology was gonna explode once dual core started being implemented into phones, but this is just ridiculous. I wonder which C2D they are comparing it to, though. Can't wait to play some Crysis on my phone!!
omg it looks so cool!
It's a lie; ARM can't beat Intel dual-core CPUs for the next three years.
It might be better than a dual-core Atom...
Sent from my LG-SU660 using XDA App
Uhh, three years is too long if they haven't already beaten some dual-core chips, at least that's what I think... especially since Kal-El, OMAP 5, and whatever Qualcomm has planned are gonna be freaking awesome!
OMG!!!! It's amazing.
Mobile phones are better than my first PC.
Well, since Nvidia is supposedly releasing quad core in Q4 of this year, I'd say computers will probably eventually die out. Especially since smartphone sales beat computers this year... just a thought
HTC HD2 w/ 2.3 : )
CTR01 said:
Well, since Nvidia is supposedly releasing quad core in Q4 of this year, I'd say computers will probably eventually die out. Especially since smartphone sales beat computers this year... just a thought
HTC HD2 w/ 2.3 : )
Funny you mention that; I was just at uni talking about networking (my major) and technology, and a classmate said the same thing. I would say it could happen in maybe 20+ years.
I would like to see a Tegra 3 rendering a complex 3D scene or something like that, which would really show its performance.
Is this the Q6600 club or what? <3
Sent from my HTC Vision using Tapatalk
I have an Athlon X3 435 at 3.6 GHz. It can go up to 3.8 GHz as well, but that needs too much vcore.
Although they are saying these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it's available" thing for the app devs, negating any positive improvements in battery life?
icecold23 said:
Although they are saying these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it's available" thing for the app devs, negating any positive improvements in battery life?
Yeah, I don't think battery technology is evolving on par with the processors. At this rate, we'll have "stationary" tablets with the current battery technology.
icecold23 said:
Although they are saying these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it's available" thing for the app devs, negating any positive improvements in battery life?
It really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP; it's the vcore and the frequency the cores run at. But really, I can't see CPUs going any way but multicore or multithreaded. It's more efficient for power and performance to have 2 cores running at 1 GHz each than a single 2 GHz CPU that needs a higher TDP and vcore to stay stable. If the cores are a smaller die size, it works out perfectly.
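That point can be made concrete with the usual first-order model for dynamic CPU power, P ~ C * V^2 * f. The voltages below are made-up illustrative values (slower cores can run at a lower vcore), not real chip specs:

```shell
# Relative dynamic power, P ~ V^2 * f (capacitance C cancels when
# comparing the same core design). Voltages are illustrative assumptions.
out=$(awk 'BEGIN {
  p_single = 1.30*1.30 * 2.0          # one core at 2.0 GHz, 1.30 V
  p_dual   = 2 * (1.05*1.05 * 1.0)    # two cores at 1.0 GHz, 1.05 V
  printf "single-core: %.3f  dual-core: %.3f", p_single, p_dual
}')
echo "$out"
```

With these example numbers the two slow cores deliver the same 2 GHz of aggregate clock for roughly two-thirds of the power, which is the whole argument for going multicore.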
vbetts said:
It really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP; it's the vcore and the frequency the cores run at. But really, I can't see CPUs going any way but multicore or multithreaded. It's more efficient for power and performance to have 2 cores running at 1 GHz each than a single 2 GHz CPU that needs a higher TDP and vcore to stay stable. If the cores are a smaller die size, it works out perfectly.
Agreed... while I'm no CPU expert, I do know the basics, and I did a little reading that agrees with vbetts. They said the 4-core Kal-El Nvidia CPU is supposed to get ~12 hours of playback for HD video... at least that's what someone posted in a thread of mine...
vbetts said:
It really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP; it's the vcore and the frequency the cores run at. But really, I can't see CPUs going any way but multicore or multithreaded. It's more efficient for power and performance to have 2 cores running at 1 GHz each than a single 2 GHz CPU that needs a higher TDP and vcore to stay stable. If the cores are a smaller die size, it works out perfectly.
Yeah, exactly. I personally thought the move to dual core would come sooner, with two 500 MHz cores or lower, since that would still be better than a single 1 GHz chip.
Battery is definitely an issue; with today's technology I wonder how much these chips will consume at 100% load, or when playing a game that uses most of the device's grunt. On my DHD I can take it up to 1.9 GHz stable, and if I'm playing FPse at that frequency, CurrentWidget shows consumption of around 425 mA, while at 1 GHz it's around 285 mA. That's quite a difference! So for these chips to be efficient, they shouldn't use much more battery than today's chips.
I love my Q6600; the max OC I could get was 4 GHz, but it required 1.6 V vcore and on air that was HOT. Still, I made it into Windows and did some benching: 13.110 s on a 1 MB SuperPi / 1.5 XS mod.
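Those CurrentWidget readings translate directly into run time. Taking the Desire HD's stock battery as roughly 1230 mAh (an assumption; check your own pack) and pretending the load stays constant:

```shell
# Idealised run time = capacity / current draw. Ignores screen, radio,
# voltage sag, etc. 1230 mAh is an assumed Desire HD pack capacity.
runtimes=$(awk 'BEGIN {
  cap = 1230   # mAh, assumption
  printf "1.9 GHz @ 425 mA: %.1f h, 1.0 GHz @ 285 mA: %.1f h", cap/425, cap/285
}')
echo "$runtimes"
```

So the 1.9 GHz overclock costs roughly a third of the gaming run time under that steady load, which is why efficiency matters as much as peak clocks.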
CTR01 said:
Agreed... while I'm no CPU expert, I do know the basics, and I did a little reading that agrees with vbetts. They said the 4-core Kal-El Nvidia CPU is supposed to get ~12 hours of playback for HD video... at least that's what someone posted in a thread of mine...
Yeah I read that as well or heard it somewhere.
Yeah, exactly. I personally thought the move to dual core would come sooner, with two 500 MHz cores or lower, since that would still be better than a single 1 GHz chip.
If the app is multithread-capable, then yes, easily better. But two 500 MHz cores would probably be best for multitasking.
Battery is definitely an issue; with today's technology I wonder how much these chips will consume at 100% load, or when playing a game that uses most of the device's grunt. On my DHD I can take it up to 1.9 GHz stable, and if I'm playing FPse at that frequency, CurrentWidget shows consumption of around 425 mA, while at 1 GHz it's around 285 mA. That's quite a difference! So for these chips to be efficient, they shouldn't use much more battery than today's chips.
Battery has always been an issue, though; even on my old Moment the battery sucked. But that's what you get with these, I guess. But man, 1.9 GHz stable from 1 GHz! For a small platform, that's pretty damn impressive.
I love my Q6600; the max OC I could get was 4 GHz, but it required 1.6 V vcore and on air that was HOT. Still, I made it into Windows and did some benching: 13.110 s on a 1 MB SuperPi / 1.5 XS mod.
Ouch, how long did the chip last? I've got my 435 at 1.52 V vcore. I can go higher, but I need this chip to last me a year or so.
I can't believe they set the time-frame for the release as early as they did. Hopefully they will live up to this standard.

2011 Tegra 3 (Kal-El) Tablets

Hey guys,
Could someone please let me know which tablets are being/have been released in 2011 sporting the Tegra 3 (Kal-El) processor? Especially if they're out or easily importable to the UK! Thanks
*EDIT* The one I'm aware of atm is the Asus Transformer Prime; does anyone have a definite release date for this?
The Prime is the only one that's been announced so far!
http://www.engadget.com/2011/11/09/transformer-prime-detailed-10-inch-super-ips-display-12-hour
"December" is still all I've heard of a release date. But it supposed to be a worldwide release so I'm assuming shortages will be a given.
mordbane said:
http://www.engadget.com/2011/11/09/transformer-prime-detailed-10-inch-super-ips-display-12-hour
"December" is still all I've heard of a release date. But it supposed to be a worldwide release so I'm assuming shortages will be a given.
Ah cool, so I'll be able to order it from the UK? Hoping to get it by Xmas, ya know.
Prime in december
Cyrano4 said:
Prime in december
Not for the UK though?
Nice Tegra 3 is very good
Ant38 said:
Nice Tegra 3 is very good
Why? Because of 4 cores?
You won't use their power... at the moment Android doesn't even use 2 cores properly.
2, 4, or, I don't know, 16 cores doesn't mean the system will be faster.
Look at the HTC 7 Mozart with WP7.5: a single-core Snapdragon with 512 MB, and the system is very smooth and fast.
I think manufacturers must focus on optimizing their versions of Android.
Making smartphones with better hardware is not a solution; it only creates more problems.
darasz89 said:
Why? Because of 4 cores?
You won't use their power... at the moment Android doesn't even use 2 cores properly.
2, 4, or, I don't know, 16 cores doesn't mean the system will be faster.
Look at the HTC 7 Mozart with WP7.5: a single-core Snapdragon with 512 MB, and the system is very smooth and fast.
I think manufacturers must focus on optimizing their versions of Android.
Making smartphones with better hardware is not a solution; it only creates more problems.
I agree with you about Android, which doesn't take full advantage of 4 cores.
But the main feature of quad-core technology is the battery saving.
Eventually, next year, when there are apps optimized for dual- and quad-core CPUs, this tablet (or any other 4-core phone) will rock for many, many months.
yukinok25 said:
But the main feature of quad-core technology is the battery saving.
A 4-core CPU uses less energy than 1 core? You're wrong.
Yes, it's true that the power use per core in Tegra 3 is low, but overall it's higher than a single or dual core.
darasz89 said:
A 4-core CPU uses less energy than 1 core? You're wrong.
Yes, it's true that the power use per core in Tegra 3 is low, but overall it's higher than a single or dual core.
Yes, it uses less power in normal usage.
It's obvious that if you use ALL 4 cores, the drain will be more than a single-core CPU:
http://www.techrepublic.com/blog/hi...n-actually-use-less-power-than-dual-core/7976
Qualcomm as well is claiming that its next-generation quad-core phones will save 65% of battery life compared to current ARM CPUs:
http://www.theinquirer.net/inquirer...debuts-single-dual-quad-core-snapdragon-chips
Speculation and manufacturers' promises.
The major factor in power usage is system resource management.
At the start, a new 4-core CPU will consume more energy because of the lack of efficient support for 4 cores, but someday it will be well optimized.
But the question is: do we really need 4 cores? I think we need better-optimized software.
Did I lose my mind? Look at devices running WP7.5 and tell me that Android is smoother [I'm not an M$ fanboy ]
Now with 2 cores we have big computing power, and it should be used efficiently.
darasz89 said:
Speculation and manufacturers' promises.
The major factor in power usage is system resource management.
At the start, a new 4-core CPU will consume more energy because of the lack of efficient support for 4 cores, but someday it will be well optimized.
But the question is: do we really need 4 cores? I think we need better-optimized software.
Did I lose my mind? Look at devices running WP7.5 and tell me that Android is smoother [I'm not an M$ fanboy ]
Now with 2 cores we have big computing power, and it should be used efficiently.
I really hope it's not speculation, because we need battery savings for our devices.
I agree about the lack of apps optimized for 4-core CPUs; however, my i7-740QM uses less battery (I repeat, in normal usage) than my old AMD Athlon XP, because of improvements in architecture and technology such as Turbo Boost and Hyper-Threading that Intel has implemented.
I believe the new 28 nm CPUs from Qualcomm and Nvidia will bring similarly helpful features.
You're comparing desktop CPUs; all along I was talking about smartphone CPUs.
The difference between the two CPUs you mentioned is comparable to the difference between a Ford Model T and a Ferrari Enzo.
On smartphones we have limited resources: you cannot easily add, say, 512 MB of RAM, or change the CPU or GPU.
We have more limited hardware compared to a PC.
The goal is optimization of software, not adding cores to the CPU or more RAM.
Edit
Tegra 2 and Tegra 3 are both built on a 40 nm process.
Source
@OP
2011 is not a good year for hunting Tegra 3.
With only one product out, the Asus TF Prime, it could go like the Xoom launch.
Although the Prime seems like a very nice upgrade, I'm already reading about Lenovo and Acer bringing 1920x1200 screens.
I know you may have your preferred manufacturer, I do too, but it's better to wait a bit to see all the players, then choose.
I waited a bit and got a Galaxy Tab 10.1, one of the best Tegra 2 tablets. Really interested in what Samsung can bring in 2012.
darasz89 said:
You're comparing desktop CPUs; all along I was talking about smartphone CPUs.
The difference between the two CPUs you mentioned is comparable to the difference between a Ford Model T and a Ferrari Enzo.
On smartphones we have limited resources: you cannot easily add, say, 512 MB of RAM, or change the CPU or GPU.
We have more limited hardware compared to a PC.
The goal is optimization of software, not adding cores to the CPU or more RAM.
Edit
Tegra 2 and Tegra 3 are both built on a 40 nm process.
Source
Yup, that was just an example, darasz89.
Why do you think we have limited resources? Would you have expected, 2 years ago, that one day we'd use a smartphone as big as 2 packs of cigarettes put together, with a full OS installed, 4 cores, and 1 GB of RAM?
Mobile technology is the future. We will see much more, hopefully...
Yup, only Qualcomm for now will use 28 nm technology.
The priority for Android devices should be optimized software before hardware.
I don't think the way you see the future of Android is bad, but first we should use all the "given power" of 2-core CPUs and after that extend to 4 cores.
Funny summary: "With great power comes great responsibility."
Just pre-ordered prime from my local Future Shop. *Anxiously awaiting*
darasz89 said:
The priority for Android devices should be optimized software before hardware.
I don't think the way you see the future of Android is bad, but first we should use all the "given power" of 2-core CPUs and after that extend to 4 cores.
Funny summary: "With great power comes great responsibility."
Actually, that should be the priority of software in general, but writing good code is hard enough. Ever since the software crisis, hardware has always been two or three steps ahead of software, and right now that is in a way a good thing. You don't want software lagging on your computer because it's not powerful enough, right? You'd rather have the extra juice to power any software with ease.
Hey guys!! According to Best Buy Canada, the expected warehouse delivery date is December 5, 2011. Not going to check elsewhere, but just an FYI to all who asked.
Link

GPUs

I'm planning to buy a new Android phone and my budget is 200 to 250 EUR.
The component that's bugging me a lot is the GPU. I am seeing old Adreno 200 GPUs on new phones like the Desire V.
#1 - So is it really a factor that affects the overall performance of the phone?
#2 - And which is the best?
I have seen phones equipped with the Mali-400 MP, Adreno 200, 205, 220 and 225, SGX 540... and those Tegra chips from the LG Optimus series.
Which one is the best?
#3 - And the phone on my mind is the Desire X (to be released soon); many pages say it comes with an Adreno 203 chip. Now, what's the Adreno 203?
And how's its performance?
Guys...
Sent from my GT-S5670 using xda app-developers app
yzak58 said:
I'm planning to buy a new Android phone and my budget is 200 to 250 EUR.
The component that's bugging me a lot is the GPU. I am seeing old Adreno 200 GPUs on new phones like the Desire V.
#1 - So is it really a factor that affects the overall performance of the phone?
#2 - And which is the best?
I have seen phones equipped with the Mali-400 MP, Adreno 200, 205, 220 and 225, SGX 540... and those Tegra chips from the LG Optimus series.
Which one is the best?
#3 - And the phone on my mind is the Desire X (to be released soon); many pages say it comes with an Adreno 203 chip. Now, what's the Adreno 203?
And how's its performance?
GPUs are not the biggest factor, no; as long as the CPU and RAM are adequate, overall performance will not be affected much by the GPU.
Some games that are very 3D-intensive would benefit from a more powerful GPU, yes, and for some games the Tegra 3 chip allows better shading, water effects, etc.
thanks zac
The GPU has separate RAM allocated for gaming.
The better the GPU, the better the gaming performance...
That means the Mali-400 is better than the Adreno 200.
Another thing: the GPU doesn't affect overall performance, but it affects the clarity of graphics and display visuals.
So, for 250 EUR,
I think the Galaxy S2 is a good choice:
Good processor
Good GPU
Good screen resolution.
We should all be polite enough to press Thanks for anyone who helped us.
I think RAM comes first.
More RAM can make your phone work smoother (except in games).
thanks
ok guys :good:
rainbow9 said:
I think RAM comes first.
More RAM can make your phone work smoother (except in games).
Actually, it's both. RAM also has a major impact on games. The better the GPU, the lower the load on the RAM, since the device won't need to be put under too much "strain" to process the graphics (a good CPU is also required).
THE GPU IS IMPORTANT FOR SMOOTH OS PERFORMANCE. Current OS versions use GPU acceleration to smooth things out, e.g. ICS and JB. Many ROMs also enable GPU acceleration to increase performance throughout the OS. If you have a Snapdragon, it borrows the phone's RAM for the GPU, whereas Tegra has its own dedicated RAM for its GPU.
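If you want to see which GPU is actually driving the hardware-accelerated UI, `dumpsys SurfaceFlinger` usually reports the GL vendor/renderer string. A hedged sketch, assuming a host with adb and a connected device (the exact output format varies by Android version, and the script degrades gracefully when no device is attached):

```shell
# Print the device's GL vendor/renderer line, e.g. "GLES: Qualcomm, Adreno ...".
if command -v adb >/dev/null 2>&1 && adb get-state >/dev/null 2>&1; then
  info=$(adb shell dumpsys SurfaceFlinger | grep -m1 'GLES')
  info=${info:-"GLES line not found in dumpsys output"}
else
  info="no adb device available"
fi
echo "$info"
```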
AJ88 said:
The GPU has separate RAM allocated for gaming.
The better the GPU, the better the gaming performance...
That means the Mali-400 is better than the Adreno 200.
Another thing: the GPU doesn't affect overall performance, but it affects the clarity of graphics and display visuals.
So, for 250 EUR,
I think the Galaxy S2 is a good choice:
Good processor
Good GPU
Good screen resolution.
We should all be polite enough to press Thanks for anyone who helped us.
The number in the chip's name does not reflect the RAM it has; it reflects the model number. And of course the Mali-400 MP is better than the Adreno 200; it performs better than the Tegra 2.
Here's the performance order of previous-generation chips:
Mali-400 MP > PowerVR SGX 540 > Adreno 205 >> Tegra 2.
Maybe the Adreno 205 isn't THAT much better than the Tegra 2, but the Tegra 2 is highly overrated, and the Mali-400 MP pulls cleanly ahead of it.
RoboWarriorSr said:
THE GPU IS IMPORTANT FOR SMOOTH OS PERFORMANCE. Current OS versions use GPU acceleration to smooth things out, e.g. ICS and JB. Many ROMs also enable GPU acceleration to increase performance throughout the OS. If you have a Snapdragon, it borrows the phone's RAM for the GPU, whereas Tegra has its own dedicated RAM for its GPU.
Oh... thanks for the info...
So you are saying that Tegra chips come with their own built-in RAM?
So... how many MB of RAM (or RAM equivalent, or whatever) is in a Tegra chip?
yzak58 said:
Oh... thanks for the info...
So you are saying that Tegra chips come with their own built-in RAM?
So... how many MB of RAM (or RAM equivalent, or whatever) is in a Tegra chip?
I think it's 64 MB or something like that for the Tegra 2. Doesn't really matter, though. If you get anything better than the Adreno 200, it's good.
Does the Samsung Galaxy Mini have a GPU?
Go for Tegra 3, mate
beakolang said:
Does the Samsung Galaxy Mini have a GPU?
Yes, it does have a GPU. It has an Adreno 200.
You know, I have been using for some time a (pretty old by now) LG Optimus One. It has an Adreno 200 GPU and an ARMv6 600 MHz CPU.
Even if I overclock it to 800 MHz and maximize the ROM's performance in every way possible, GTA 3, for example, is pretty much unplayable (very low FPS).
The Optimus One uses a Qualcomm MSM7227 SoC (2009). But in 2011 Qualcomm released the MSM7227A (used, for example, in the Galaxy Mini 2), which also has an Adreno 200 GPU but uses a much better ARMv7 800 MHz Cortex-A5 CPU. The same GPU coupled with this much more capable CPU handles GTA 3 really well, playable without problems.
That's really interesting to me, to say the least. It's like having a good video card in your PC that is bottlenecked by the CPU. And the Adreno 200 is quite old.
-
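That overclocking comparison hinges on what frequency range the kernel actually allows, and you can read that straight from cpufreq sysfs. A sketch (the paths are the standard Linux cpufreq layout, values in kHz; it prints a fallback if cpufreq isn't exposed where you run it):

```shell
# Show the CPU frequency limits the kernel currently allows for cpu0.
out=""
for f in /sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq \
         /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq; do
  [ -r "$f" ] && out="${out}${f}: $(cat "$f") kHz
"
done
[ -n "$out" ] || out="cpufreq not exposed here"
printf '%s\n' "$out"
```

On a stock MSM7227 kernel you would expect a 600000 kHz ceiling; an overclock kernel raises `scaling_max_freq` accordingly.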
nundoo said:
You know, I have been using for some time a (pretty old by now) LG Optimus One. It has an Adreno 200 GPU and an ARMv6 600 MHz CPU.
Even if I overclock it to 800 MHz and maximize the ROM's performance in every way possible, GTA 3, for example, is pretty much unplayable (very low FPS).
The Optimus One uses a Qualcomm MSM7227 SoC (2009). But in 2011 Qualcomm released the MSM7227A (used, for example, in the Galaxy Mini 2), which also has an Adreno 200 GPU but uses a much better ARMv7 800 MHz Cortex-A5 CPU. The same GPU coupled with this much more capable CPU handles GTA 3 really well, playable without problems.
That's really interesting to me, to say the least. It's like having a good video card in your PC that is bottlenecked by the CPU. And the Adreno 200 is quite old.
-
It has an enhanced Adreno 200; that's how it gets a better graphics score in AnTuTu. And I'm surprised you can't run GTA 3. I can play Dead Space with no lag on my Wildfire S, even at stock, and that looks just as intensive as GTA 3.
Although I agree the CPU might be a bottleneck, it shouldn't affect 3D gaming much. The UI becomes really smooth @ 825 MHz, which surprises me, as it lags in comparison at even 806 MHz.
Dead Space also runs very well on the Optimus One; GTA 3 is much more demanding.
It has to do with the fact that GTA is an open-world game, which requires more background processing than the on-screen processing the majority of Android games use. I believe the CPU does the background processing, which is why it lags. This also explains why the Galaxy Mini can play GTA while having a similarly clocked CPU: the architecture.
So, is the Desire X better than Tegra 2?
Or in more detail:
Is the HTC Desire X better than my LG Optimus 2X?
The HTC has more RAM, but I don't like that it has the Adreno 203. Is it ****?
Help, please

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want is cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC on a Linux kernel mailing list, a part name I'd never heard of. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch: he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the 2 board names: Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' posted on 3rd January 2013. The results had already been deleted, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
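Putting those two offscreen numbers side by side (this is just the arithmetic, and it assumes the Dalmore board really is Tegra 4 on early drivers):

```shell
# iPad 4 vs the Dalmore board in GLBenchmark 2.5 Egypt HD 1080p offscreen.
gap=$(awk 'BEGIN { printf "%.2f", 49.6/32.6 }')
echo "iPad 4 scores ${gap}x the Dalmore result (i.e. ~52% faster)"
```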
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand-new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, though the evidence is strong that it is a T4 development board. If this is correct, we have to figure it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems the SGX 554 MP4 in the A6X is still the faster GPU, with Tegra 4 and Mali-T604 an almost equal 2nd. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. Buildbrain, according to an Nvidia job posting, is "a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320, Mali T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been announced too; the only thing left is, I guess, the Mali T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320, Mali T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been announced too; the only thing left is, I guess, the Mali T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in future too.
Click to expand...
Click to collapse
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, giving a theoretical SoC TDP of 8 W. However, when they stressed the GPU by running a game and then ran a CPU benchmark in the background, the SoC quickly went up to 8 W, and the CPU was quickly throttled from 1.7 GHz down to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we'd hoped.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that the system will often be throttled because of power and maybe heat concerns. It looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2, so Apple can pack in more transistors and reap the GPU performance lead. Their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit has been that their Swift core is almost as powerful as ARM's A15 but seems less power hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
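The throttling behaviour described above, where a shared power budget forces the CPU down as soon as the GPU is loaded, can be sketched as a toy power-budget governor. This is only an illustration of the idea, not Samsung's actual DVFS code: the budget figure echoes the Anandtech numbers quoted here, while the frequency steps and the watts-per-step table are invented for the example.

```python
# Toy power-budget governor: when the GPU eats into the shared budget,
# the CPU is stepped down until the total fits. The DVFS steps and the
# watts-per-step table below are assumptions for illustration.

CPU_STEPS_MHZ = [1700, 1400, 1100, 800]
CPU_POWER_W = {1700: 4.0, 1400: 3.0, 1100: 2.0, 800: 1.2}
SOC_BUDGET_W = 4.0  # roughly the cap the article describes

def throttle_cpu(gpu_power_w: float) -> int:
    """Pick the highest CPU step whose power fits what the GPU left over."""
    remaining = max(SOC_BUDGET_W - gpu_power_w, 0.0)
    for mhz in CPU_STEPS_MHZ:          # steps are ordered fastest-first
        if CPU_POWER_W[mhz] <= remaining:
            return mhz
    return CPU_STEPS_MHZ[-1]           # floor at the lowest step

print(throttle_cpu(0.0))  # 1700 -> GPU idle, CPU runs flat out
print(throttle_cpu(4.0))  # 800  -> GPU at full load, CPU pushed to the floor
```

The point of the sketch is that the CPU's ceiling depends entirely on what the GPU happens to be drawing at that moment, which is exactly why a GPU-heavy game tanks a background CPU benchmark.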
32fps is no go...lets hope it's not final
hamdir said:
32fps is no go...lets hope it's not final
Click to expand...
Click to collapse
It needs to improve, but it would be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
Click to expand...
Click to collapse
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
Click to expand...
Click to collapse
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimisation through Tegra Zone. Not even the A6X is as fully optimised.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimisation through Tegra Zone. Not even the A6X is as fully optimised.
Sent from my SAMSUNG-SGH-I727 using xda premium
Click to expand...
Click to collapse
What sort of 'optimisation' do you mean? Unoptimised games lag, which is a big letdown, and Tegra effects can also be used on other phones with Chainfire3D. I use it and Tegra games work with effects and without lag, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant GPU vendor.
The first, Tegra 2: its GPU was a little better in benchmarks than the SGX540 of the Galaxy S, but lacked NEON support.
The second, Tegra 3: its GPU was nearly the same as the older Mali-400MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will soon be outperformed (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (SGX543, and maybe SGX554 as well; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to work on other devices, and equally Chainfire3D is now outdated and no longer updated.
Now about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita and eats a large die area, i.e. actual multiple cores, each with its own sub-cores/shaders. If Tegra were used in a real multi-core configuration it would destroy all.
Finally, this is really funny: all this doom and gloom because of an early, discarded development-board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous how much negativity it is collecting before any final-device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe only 2x the T3 scores with 600% more shaders?!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! This board seems full of it nowadays, with so little reasoning...
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing: what he stated isn't true. Samsung has licensed PowerVR; it isn't exclusive to Apple, it's just that Samsung prefers ARM's GPU solution. Another thing I dislike is how everyone compares a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 that will. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra because Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, giving a theoretical SoC TDP of 8 W. However, when they stressed the GPU by running a game and then ran a CPU benchmark in the background, the SoC quickly went up to 8 W, and the CPU was quickly throttled from 1.7 GHz down to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we'd hoped.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that the system will often be throttled because of power and maybe heat concerns. It looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2, so Apple can pack in more transistors and reap the GPU performance lead. Their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit has been that their Swift core is almost as powerful as ARM's A15 but seems less power hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Click to expand...
Click to collapse
Well said mate!
I understand how you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and continue with this strategy, they will fail.
The GPU will become a bottleneck and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2 with a 1.2 GHz Exynos 4 and the Mali GPU overclocked to 400 MHz.
In my analysis, most modern games like MC4 and NFS: Most Wanted aren't running at 60 FPS at all. That's because the GPU always has a 100% workload while the CPU is relaxing, outputting 50-70% of its total workload.
I know some games aren't optimised for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T604 with some tweaks instead of the mighty T658, which is itself still slower than the iPad 4.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, giving a theoretical SoC TDP of 8 W. However, when they stressed the GPU by running a game and then ran a CPU benchmark in the background, the SoC quickly went up to 8 W, and the CPU was quickly throttled from 1.7 GHz down to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we'd hoped.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that the system will often be throttled because of power and maybe heat concerns. It looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2, so Apple can pack in more transistors and reap the GPU performance lead. Their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit has been that their Swift core is almost as powerful as ARM's A15 but seems less power hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Click to expand...
Click to collapse
Typical "isheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 ppi, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimised to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing: what he stated isn't true. Samsung has licensed PowerVR; it isn't exclusive to Apple, it's just that Samsung prefers ARM's GPU solution. Another thing I dislike is how everyone compares a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 that will. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra because Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Click to expand...
Click to collapse
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA, is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations of a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), OMG Kepler is like the GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice: I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle and physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem; as Chainfire's app showed, most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games and benchmarks will scale in linear fashion, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's technical marketing manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate it wouldn't stop me buying; overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
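The non-linear scaling point made above (more shader units only help until TMU/ROP throughput or bandwidth becomes the limiter) can be shown with a toy bottleneck model. All the throughput numbers here are invented purely for illustration; they are not real Tegra figures.

```python
# Toy bottleneck model: frame rate is capped by the slowest pipeline
# stage, so multiplying shader ALUs does not multiply fps once another
# unit (fill rate, memory bandwidth) becomes the limit.
# Every number below is made up for illustration.

def fps(shader_units: int, shader_rate=1.0, fill_limit=40.0, bw_limit=60.0):
    """Return the frame rate allowed by the slowest stage."""
    shader_limit = shader_units * shader_rate  # only this scales with ALUs
    return min(shader_limit, fill_limit, bw_limit)

print(fps(12))  # 12.0 -> shader-bound: adding ALUs helps here
print(fps(72))  # 40.0 -> fill-rate-bound: 6x the ALUs, only ~3.3x the fps
```

With these made-up limits, a 6x shader increase yields roughly a 3.3x frame-rate gain, which is in the same ballpark as the "3 to 4 times T3" figure quoted from Nvidia's marketing manager.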
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Click to expand...
Click to collapse
Sorry Steve, this is an Android forum; or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
hamdir said:
tegra optimization porting no longer works using chainfire, this is now a myth
did u manage to try shadowgun thd, zombie driver or horn? the answer is no, games that use t3 sdk for physx and other cpu graphics works can not be forced to work on other devices, equally chainfire is now outdated and no longer updated
Click to expand...
Click to collapse
Looks like they haven't updated Chainfire3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I may have been unfortunate too, but some Gameloft games lagged on the Tegra device I had, though root solved it to an extent.
I am not saying one thing is superior to another; this is just my personal experience. I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum; or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
Click to expand...
Click to collapse
No, I actually own a Nexus 4 and an iPad Mini, so I'm pretty neutral between Google's and Apple's ecosystems, and I'm not wiping any scratches off my devices.

Exynos Note 3 possibly won't get HMP update from Samsung

Bad news for fellow Exynos users.
http://www.phonearena.com/news/Note...t-core-performance-patch-says-Samsung_id47977
It seems that the Exynos 5420 is capable of HMP, but the chip would get too hot to handle it.
However, the Samsung engineer does mention that they would comment on the HMP update only after a complete testing process to ensure trouble-free operation.
Note 3 and Galaxy S4 are unlikely to receive the full octa-core power in their Exynos chipset versions, advised a chief technical expert from Samsung's Mobile Solutions department. Recently the company said that it is able to unleash all the eight cores working at once, which can bump performance significantly, compared to the maximum of four cores restriction we have now with Exynos 5 Octa in these handsets.
The thing is, the engineer comments, that even though Samsung can release a software patch that will allow both the quad-core Cortex-A15 set, and the frugal Cortex-A7 cores, to get together for a task, the thermal envelope of these Exynos chips hasn't been cut for the job.
Click to expand...
Click to collapse
I think this is a good move; they can't force this and overheat the phones.
I'm tempted to buy the Exynos here in Saudi Arabia, as they are cheaper than the Snapdragon ones even without the 4K part...
system.img said:
Bad news for fellow Exynos users.
http://www.phonearena.com/news/Note...t-core-performance-patch-says-Samsung_id47977
It seems that the Exynos 5420 is capable of HMP, but the chip would get too hot to handle it.
Click to expand...
Click to collapse
Out of the 3 aspects of big.LITTLE (core migration, cluster migration, HMP), core migration is the most important. Since the hardware is fixed, any custom kernel for the Exynos Note 3 might implement core migration, which would surely increase battery efficiency.
bala_gamer said:
Out of the 3 aspects of big.LITTLE (core migration, cluster migration, HMP), core migration is the most important. Since the hardware is fixed, any custom kernel for the Exynos Note 3 might implement core migration, which would surely increase battery efficiency.
Click to expand...
Click to collapse
So the Exynos 5420 has core migration instead of cluster migration?
bala_gamer said:
Out of the 3 aspects of big.LITTLE (core migration, cluster migration, HMP), core migration is the most important. Since the hardware is fixed, any custom kernel for the Exynos Note 3 might implement core migration, which would surely increase battery efficiency.
Click to expand...
Click to collapse
Nah, HMP is the real thing, said AndreiLux:
HMP is extremely useful for power efficiency because you can migrate stupidly faster than DVFS allows you to.
Click to expand...
Click to collapse
Sent from My GT i9300
If it's true, then it's a very disappointing step from Samsung.
My question is: if there are so many problems in that chip, then why are they selling these faulty chips... and why in India and other Asian and African countries, where governments don't do anything against this kind of behaviour... and customers also ignore these issues... which is not good for the future.
1. Many problems in the S4
2. SIM locking
3. KnockOn
...many more, and some may be coming soon.
And now no HMP update for the S4 or Note 3... it seems like Samsung is overconfident about its market share and profit...
Isn't this a kind of monopoly???
ipsuvedi said:
If it's true, then it's a very disappointing step from Samsung.
My question is: if there are so many problems in that chip, then why are they selling these faulty chips... and why in India and other Asian and African countries, where governments don't do anything against this kind of behaviour... and customers also ignore these issues... which is not good for the future.
1. Many problems in the S4
2. SIM locking
3. KnockOn
...many more, and some may be coming soon.
And now no HMP update for the S4 or Note 3... it seems like Samsung is overconfident about its market share and profit...
Isn't this a kind of monopoly???
Click to expand...
Click to collapse
Samsung wants to maximise profits by selling phones made with its in-house chip, but it is not able to integrate LTE into that chip. So all it can do is sell the phone with the Exynos chip in non-LTE areas.
But still, I think the Exynos is good too. Its CPU is damn powerful (considering a clock speed of only 1.9 GHz).
Thermal problems are a poor excuse considering cluster migration is going to be much worse for thermals.
It's also a poor excuse considering HMP is best for power efficiency, meaning overall temperatures should be lower.
Unless they're worried that people will benchmark each thread and burn the chip out, but all they need to do is put thermal throttles and speed throttles with large A15 use.
I don't want to say Samsung are lazy because that's simply a stupid thing to say. Obviously all of this is very difficult and Samsung don't have the right combination and amount of time, talent and money to make it happen.
Core migration is going to be fine, anyway. If they ever bother with that.
Still disappointing anyway :/
Sent from my GT-N7100 using xda premium
SirCanealot said:
Thermal problems are a poor excuse considering cluster migration is going to be much worse for thermals.
It's also a poor excuse considering HMP is best for power efficiency, meaning overall temperatures should be lower.
Unless they're worried that people will benchmark each thread and burn the chip out, but all they need to do is put thermal throttles and speed throttles with large A15 use.
I don't want to say Samsung are lazy because that's simply a stupid thing to say. Obviously all of this is very difficult and Samsung don't have the right combination and amount of time, talent and money to make it happen.
Core migration is going to be fine, anyway. If they ever bother with that.
Still disappointing anyway :/
Sent from my GT-N7100 using xda premium
Click to expand...
Click to collapse
They have the Linaro team to do the work. They have already accomplished it, but it has only been shown off on a prototype tablet.
ipsuvedi said:
If it's true, then it's a very disappointing step from Samsung.
My question is: if there are so many problems in that chip, then why are they selling these faulty chips... and why in India and other Asian and African countries, where governments don't do anything against this kind of behaviour... and customers also ignore these issues... which is not good for the future.
1. Many problems in the S4
2. SIM locking
3. KnockOn
...many more, and some may be coming soon.
And now no HMP update for the S4 or Note 3... it seems like Samsung is overconfident about its market share and profit...
Isn't this a kind of monopoly???
Click to expand...
Click to collapse
Well, someone writes a news story without any confirmation and users say it's good.
So... now... I'm going to write some news: Santa is real! And everyone believes it...
Too many sheep in the world.
big.LITTLE is above any other arch; it has the best efficiency ever, and it will keep it with HMP.
Yup, if you turn on all 8 cores at max frequency for 10 minutes the phone burns, but the logic of big.LITTLE is: use the low-power cores (A7) for light tasks, use the high-power cores (A15) for heavy tasks.
Right now you are using cluster migration, which has 2 "bugs":
1) the whole cluster switches, so the A15 cluster is powered even when it isn't all needed; this happens as soon as one core goes to an A15
2) the switch between the 2 clusters blocks computation for a short time
Core migration fixes the first problem of cluster migration.
HMP fixes both "problems".
So if cluster migration is good, core migration is better, and HMP is better still.
HMP has high overheating? Well, then the N3 with cluster migration would overheat even more; does everyone with an N3 have this issue?
For me, you all use your brains too little; the brain is a muscle, use it!
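The three big.LITTLE operating modes discussed in this thread can be sketched side by side for a 4+4 part: given per-task loads, which core types end up active? The load threshold and the load values are invented for illustration; this is a simplification of the real schedulers, not Samsung's or Linaro's code.

```python
# Sketch of the three big.LITTLE modes for a 4+4 chip like the 5420.
# Loads are fractions of a core; the 0.6 threshold is an assumption.

BIG_THRESHOLD = 0.6  # assumed: loads above this "want" an A15

def cluster_migration(loads):
    """One heavy task flips the WHOLE cluster to A15 ("bug 1" above)."""
    cluster = "A15" if any(l > BIG_THRESHOLD for l in loads) else "A7"
    return [cluster] * len(loads)

def core_migration(loads):
    """Each A7/A15 pair switches independently; still max 4 cores live."""
    return ["A15" if l > BIG_THRESHOLD else "A7" for l in loads]

def hmp(loads):
    """All 8 cores visible at once; big and LITTLE run concurrently."""
    big = sum(1 for l in loads if l > BIG_THRESHOLD)
    return {"A15_active": big, "A7_active": len(loads) - big}

loads = [0.9, 0.1, 0.1, 0.1]        # one heavy task, three light ones
print(cluster_migration(loads))     # ['A15', 'A15', 'A15', 'A15']
print(core_migration(loads))        # ['A15', 'A7', 'A7', 'A7']
print(hmp(loads))                   # {'A15_active': 1, 'A7_active': 3}
```

The output makes the argument concrete: with one heavy task, cluster migration wastefully powers the whole big cluster, while core migration and HMP keep the light tasks on the efficient A7s.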
I do not find the thermal-envelope explanation for HMP logical. Cluster migration switches all the cores to A15 even when only one thread requires power, hence that design is more battery hungry. If Samsung is really worried about crossing the thermal envelope, then they could implement something like Intel's Turbo Boost: effectively reduce the maximum clock speed to 1.5 GHz when all the A15s are running, but allow 1.9 GHz when only one or two threads are running.
If Samsung refuses to do so, I hope developers find ways to unlock HMP. It is not that I need 8 cores running simultaneously when my laptop hums along at 1.3 GHz in dual-core mode, but when Samsung teases us and there is treasure hidden, ready for unlocking, it is just human nature to want MORE.
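The Turbo Boost-style cap suggested above can be sketched in a few lines: allow the full clock only when one or two big cores are loaded, and clamp when the whole cluster runs. The two-step table mirrors the 1.9/1.5 GHz figures proposed in the post; a real governor would use finer steps and temperature feedback, so treat this purely as an illustration.

```python
# Sketch of a per-cluster frequency cap keyed off how many A15 cores
# are busy, as suggested in the post above. The two-entry policy is an
# assumption for illustration, not Samsung's actual DVFS table.

def a15_max_freq_ghz(active_big_cores: int) -> float:
    """Return the allowed A15 clock for the given number of busy big cores."""
    if active_big_cores <= 2:
        return 1.9  # light threading: full clock, like a boost state
    return 1.5      # 3-4 cores busy: clamp to stay in the thermal envelope

print(a15_max_freq_ghz(1))  # 1.9
print(a15_max_freq_ghz(4))  # 1.5
```

The design point is that worst-case power scales with (cores x frequency), so trading a little peak clock when all cores are busy keeps the same thermal ceiling as cluster migration while still allowing HMP.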
voice of my heart
Absolutely right, brother. The die-hard Snapdragon fans cannot digest the Exynos big.LITTLE processing, so they just throw out rumours, and I am really shocked how people believe them. I saw in many discussions that readers, and mostly writers, were not even software or hardware literate; they were just repeating and forwarding rumours. Actually, due to the lack of an SDK from Samsung for Exynos, third-party custom ROM writers cannot do as much on Exynos as they can on Snapdragon, so this makes them angry and they spread rumours. I have the Exynos Note 3 (SM-N900) and previously I had the Exynos S4. I never faced any problem with the S4, it's my 4th day with the Note 3, and it's going great so far. In my thinking, big.LITTLE processing is better than all 8 cores working at the same time, and people should see that the big A15 quad-core at 1.9 GHz gave benchmark results very close to the Snapdragon 800 at 2.3 GHz, and some were higher. People don't understand the chip architecture and just play on rumours.
iba21 said:
Well, someone writes a news story without any confirmation and users say it's good.
So... now... I'm going to write some news: Santa is real! And everyone believes it...
Too many sheep in the world.
big.LITTLE is above any other arch; it has the best efficiency ever, and it will keep it with HMP.
Yup, if you turn on all 8 cores at max frequency for 10 minutes the phone burns, but the logic of big.LITTLE is: use the low-power cores (A7) for light tasks, use the high-power cores (A15) for heavy tasks.
Right now you are using cluster migration, which has 2 "bugs":
1) the whole cluster switches, so the A15 cluster is powered even when it isn't all needed; this happens as soon as one core goes to an A15
2) the switch between the 2 clusters blocks computation for a short time
Core migration fixes the first problem of cluster migration.
HMP fixes both "problems".
So if cluster migration is good, core migration is better, and HMP is better still.
HMP has high overheating? Well, then the N3 with cluster migration would overheat even more; does everyone with an N3 have this issue?
For me, you all use your brains too little; the brain is a muscle, use it!
Click to expand...
Click to collapse
willstay said:
I do not find the thermal-envelope explanation for HMP logical. Cluster migration switches all the cores to A15 even when only one thread requires power, hence that design is more battery hungry. If Samsung is really worried about crossing the thermal envelope, then they could implement something like Intel's Turbo Boost: effectively reduce the maximum clock speed to 1.5 GHz when all the A15s are running, but allow 1.9 GHz when only one or two threads are running.
If Samsung refuses to do so, I hope developers find ways to unlock HMP. It is not that I need 8 cores running simultaneously when my laptop hums along at 1.3 GHz in dual-core mode, but when Samsung teases us and there is treasure hidden, ready for unlocking, it is just human nature to want MORE.
Click to expand...
Click to collapse
People don't understand what hotplug is and how it works.
The real goal of that arch is called "power gating": simply, it's a technique, developed by Intel, which automatically shuts down transistors when they are not in use.
Hotplug uses a software decision to shut down cores; it's not hardware.
The difference?
Simply, to Linux all the cores always appear to be turned on, even if a core's transistors are shut down; that avoids time spent rescheduling tasks. And of course Linux is a multithreading kernel, which means more cores = more parallelisation = lower frequency = less power usage.
That's the real goal of big.LITTLE!
And you understand, if there are 8 cores, tasks will be shared across more cores, which means it has the best efficiency ever.
An example: a phone ALWAYS has light tasks to execute just to keep running, like the audio task, video task, or wireless task. Well, why use a high-performance arch for low-performance tasks? That's why ARM created the A7.
The Cortex-A7 has the best efficiency ever, so the same task on a Cortex-A7 is executed with LESS ENERGY than on another arch; it's a perfect way to preserve energy.
Sure, the A7 is not a performance arch; that's why ARM chose to create the A15. When tasks are too heavy to execute on an A7, the system automatically switches the task to an A15. So the goal is: light tasks on the A7 (best efficiency) and heavy tasks on the A15 (best performance).
Mix best efficiency + best performance = best arch :thumbup:
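The "light tasks on the A7" logic above comes down to energy per unit of work: the efficient core takes longer but finishes the same work for fewer joules, which is fine for background tasks that aren't latency-critical. The performance and power numbers below are invented for illustration only; they are not measured A7/A15 figures.

```python
# Rough energy-per-work comparison behind big.LITTLE task placement.
# Relative performance and power values are assumptions, not real data.

CORES = {
    # name: (relative performance, power in watts) -- illustrative only
    "A7":  (1.0, 0.4),
    "A15": (3.0, 2.0),
}

def energy_joules(core: str, work_units: float) -> float:
    """Energy to finish a fixed amount of work on the given core."""
    perf, power = CORES[core]
    seconds = work_units / perf   # slower core takes longer...
    return power * seconds        # ...but may still burn less energy

work = 10.0
print(energy_joules("A7", work))            # 4.0  J (takes 10.0 s)
print(round(energy_joules("A15", work), 2)) # 6.67 J (takes ~3.3 s)
```

With these assumed numbers, the A15 is 3x faster but costs roughly 1.7x the energy for the same work, which is exactly the trade-off a big.LITTLE scheduler exploits: latency-sensitive work goes big, everything else stays LITTLE.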
Got a feeling they're testing the HMP tech / heat / performance / power consumption on the Note 3 already...
They were brave enough to have it on display at an exhibition... the update seems promising from what I can see!
On the "tune.pk" site there's a "video 1058551" showing "Samsung-Exynos5420-HMP-bigLITTLE-demo" with all 8 cores running!
Hoping to see some actual benchmarks so we know what the before / after results are like!
gudodayn said:
Got a feeling they're testing the HMP tech / heat / performance / power consumption on the Note 3 already...
They were brave enough to have it on display at an exhibition... the update seems promising from what I can see!
On the "tune.pk" site there's a "video 1058551" showing "Samsung-Exynos5420-HMP-bigLITTLE-demo" with all 8 cores running!
Hoping to see some actual benchmarks so we know what the before / after results are like!
Click to expand...
Click to collapse
I'd be inclined to agree with you. The Lite/Neo Note 3 apparently has it enabled for all 6 cores, according to SamMobile.
radicalisto said:
I'd be inclined to agree with you. The Lite/Neo Note 3 apparently has it enabled for all 6 cores, according to SamMobile.
Click to expand...
Click to collapse
Neither the lady from Samsung nor Samsung itself was hiding the Note 3 device ……
My contract is up soon ……
I hope benchmarks show up soon so I can decide whether to get the LTE or the 3G one!
If you look at some of SamsungExynos's tweets on Twitter (https://twitter.com/SamsungExynos), it's easy to guess that HMP will be released soon. Just a wild guess, though, based on their tweets.
radicalisto said:
I'd be inclined to agree with you. The Lite/Neo Note 3 apparently has it enabled for all 6 cores, according to SamMobile.
Click to expand...
Click to collapse
The 5260 in the Note 3 Lite is obliged to be used with HMP. Why?
Cluster migration and core migration require an equal number of cores in each cluster, or the kernel code won't work.
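That constraint can be illustrated with a small sketch: in-kernel core migration pairs each big core one-to-one with a little core, so the pairing only works when the clusters have the same number of cores. The core counts below match the real chips, but the pairing logic itself is a simplification for illustration.

```python
# Illustrative sketch: core migration pairs each big core with exactly one
# little core, so it requires symmetric clusters. Asymmetric designs must
# use HMP instead.

def migration_pairs(big_cores, little_cores):
    """Return big<->little core pairs, or None if the clusters are asymmetric."""
    if len(big_cores) != len(little_cores):
        return None  # no 1:1 pairing possible -> HMP is the only option
    return list(zip(big_cores, little_cores))

# Exynos 5420: 4x A15 + 4x A7 -> pairing works, migration is possible
print(migration_pairs([0, 1, 2, 3], [4, 5, 6, 7]))
# Exynos 5260 (Note 3 Lite/Neo): 2x A15 + 4x A7 -> no pairing, needs HMP
print(migration_pairs([0, 1], [2, 3, 4, 5]))
```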
I don't know whether Samsung will release HMP for the N900 or not. The only thing I can say is: if Samsung releases good source code, I will extract the HMP code from the Note 3 Lite sources, modify it, and add it to my own kernel build, and every kernel modder can do the same.
Sure, I'm hoping for a direct HMP upgrade from Samsung, which would be better, but we'll do everything we can to try to run it.
I tried to compile some code taken from the internet which tries to load HMP, but there was no way due to incompatibilities with too many dependencies in the kernel code. Translation: the current source code is RUBBISH!
We are running universal drivers without any CCI code or the real base of HMP.
The core and cluster migration source code was written TWO YEARS AGO!
We are running code written for the 5410, not the 5420!!
That's why our Exynos sucks compared to the HMP feature!
**** you samsung
Why do we even want HMP activated? The Android OS is not advanced enough to manage heat and conserve battery, even on KitKat. Enabling HMP would only wear out the phone's hardware and shorten its life. The decision must have been made with careful reasoning.