2011 Tegra 3 (Kal-El) Tablets - General Topics

Hey guys,
Could someone please let me know which tablets are being/have been released in 2011 sporting the Tegra 3 (Kal-El) processor? Especially if they're out/easily importable to the UK! Thanks
*EDIT* The one I'm aware of atm is the Asus Transformer Prime - does anyone have a definite release date for this?

Prime is the only one that's been announced yet!

http://www.engadget.com/2011/11/09/transformer-prime-detailed-10-inch-super-ips-display-12-hour
"December" is still all I've heard of a release date. But it supposed to be a worldwide release so I'm assuming shortages will be a given.

mordbane said:
http://www.engadget.com/2011/11/09/transformer-prime-detailed-10-inch-super-ips-display-12-hour
"December" is still all I've heard of a release date. But it supposed to be a worldwide release so I'm assuming shortages will be a given.
Click to expand...
Click to collapse
Ah cool, so I'll be able to order it from the UK? Hoping to get it by xmas ya know

Prime in december

Cyrano4 said:
Prime in december
Click to expand...
Click to collapse
Not for the UK though?

Nice Tegra 3 is very good

Ant38 said:
Nice Tegra 3 is very good
Click to expand...
Click to collapse
Why? Because of 4 cores?
You won't use all that power... at the moment Android doesn't even use 2 cores properly.
2, 4 or even 16 cores doesn't mean the system will be faster.
Look at the HTC 7 Mozart with WP7.5 - a single-core Snapdragon with 512 MB of RAM, and the system is very smooth and fast.
I think manufacturers must focus on optimizing their versions of Android.
Making smartphones with better hardware is not a solution - it only creates more problems.

darasz89 said:
Why? Because of 4 cores?
You won't use all that power... at the moment Android doesn't even use 2 cores properly.
2, 4 or even 16 cores doesn't mean the system will be faster.
Look at the HTC 7 Mozart with WP7.5 - a single-core Snapdragon with 512 MB of RAM, and the system is very smooth and fast.
I think manufacturers must focus on optimizing their versions of Android.
Making smartphones with better hardware is not a solution - it only creates more problems.
Click to expand...
Click to collapse
I agree with you about Android, which doesn't take full advantage of 4 cores.
But the main feature of the quad-core technology is the battery life saving.
Eventually, next year, when there are apps optimized for dual- and quad-core CPUs, this tablet (or any other quad-core phone) will rock for many, many months.

yukinok25 said:
But the main feature in the quad-core technology is the battery life saving.
Click to expand...
Click to collapse
A 4-core CPU uses less energy than 1 core? You are wrong.
Yes, it's true that the power use per core in Tegra 3 is low, but overall it's higher than a single or dual core.

darasz89 said:
A 4-core CPU uses less energy than 1 core? You are wrong.
Yes, it's true that the power use per core in Tegra 3 is low, but overall it's higher than a single or dual core.
Click to expand...
Click to collapse
Yes, it uses less power in normal usage.
It's obvious that if you use ALL 4 cores, the drain will be more than a single-core CPU:
http://www.techrepublic.com/blog/hi...n-actually-use-less-power-than-dual-core/7976
Qualcomm is also claiming that its next-generation quad-core phones will save 65% of battery life compared to current ARM CPUs:
http://www.theinquirer.net/inquirer...debuts-single-dual-quad-core-snapdragon-chips

Speculations and promises from manufacturers.
The major factor in power usage is system resource management.
At first, a new 4-core CPU will consume more energy because of the lack of efficient support for 4 cores - but someday it will be optimized well.
But the question is: do we really need 4 cores? I think we need better optimized software.
Have I lost my mind? Look at devices running WP7.5 and tell me that Android is smoother [I'm not a M$ fanboy ]
Now, with 2 cores, we have plenty of computing power and it should be used efficiently.

darasz89 said:
Speculations and promises from manufacturers.
The major factor in power usage is system resource management.
At first, a new 4-core CPU will consume more energy because of the lack of efficient support for 4 cores - but someday it will be optimized well.
But the question is: do we really need 4 cores? I think we need better optimized software.
Have I lost my mind? Look at devices running WP7.5 and tell me that Android is smoother [I'm not a M$ fanboy ]
Now, with 2 cores, we have plenty of computing power and it should be used efficiently.
Click to expand...
Click to collapse
I really hope it's not speculation, because we need battery life savings for our devices.
I agree about the lack of apps optimized for 4-core CPUs; however, my i7-740QM uses less battery (I repeat, on normal usage) than my old AMD Athlon XP, because of the improvements in architecture and technology, such as Turbo Boost and Hyper-Threading, that Intel has implemented.
I believe the new 28nm CPUs from Qualcomm and Nvidia will bring similarly helpful features.

You're comparing desktop CPUs.
All along I was talking about CPUs in smartphones.
The difference between the two CPUs you mentioned is comparable to the difference between a Ford Model T and a Ferrari Enzo.
On smartphones we have limited resources - you cannot easily add, for example, 512 MB of RAM, or change the CPU or GPU.
We have more limited hardware compared to a PC.
The goal is optimization of software, not adding cores to the CPU or more RAM.
Edit
Tegra 2 and Tegra 3 are both built on a 40 nm process.
Source

@OP
2011 is not a good year for hunting a Tegra 3.
With only one product, the Asus TF Prime, out, it could end up like the Xoom launch.
Although the Prime seems like a very nice upgrade, I'm already reading about Lenovo and Acer bringing 1920x1200 resolution screens.
I know you may have a preference for your manufacturer, I do too, but it's better to wait a bit to see all the players and then choose.
I waited a bit and got a Galaxy Tab 10.1, one of the best Tegra 2 tablets. Really interested in what Samsung can bring in 2012.

darasz89 said:
You're comparing desktop CPUs.
All along I was talking about CPUs in smartphones.
The difference between the two CPUs you mentioned is comparable to the difference between a Ford Model T and a Ferrari Enzo.
On smartphones we have limited resources - you cannot easily add, for example, 512 MB of RAM, or change the CPU or GPU.
We have more limited hardware compared to a PC.
The goal is optimization of software, not adding cores to the CPU or more RAM.
Edit
Tegra 2 and Tegra 3 are both built on a 40 nm process.
Source
Click to expand...
Click to collapse
Yup, that was just an example, darasz89.
Why do you think we have limited resources? Would you have expected 2 years ago that one day we would use a smartphone the size of 2 packs of cigarettes combined, with a full OS installed, 4 cores, and 1 GB of RAM?
Mobile technology is the future. We will see much more, hopefully...
Yup, only Qualcomm for now will use 28nm technology.

The priority for Android devices should be optimized software before hardware.
I don't think the way you see the future of Android is bad, but first we should use all of the "given power" of 2-core CPUs and after that extend to 4 cores.
Funny summary: "With great power comes great responsibility."

Just pre-ordered prime from my local Future Shop. *Anxiously awaiting*

darasz89 said:
The priority for Android devices should be optimized software before hardware.
I don't think the way you see the future of Android is bad, but first we should use all of the "given power" of 2-core CPUs and after that extend to 4 cores.
Funny summary: "With great power comes great responsibility."
Click to expand...
Click to collapse
Actually, that should be the priority of software in general, but writing good code is hard enough. Ever since the software crisis, hardware has always been two or three steps ahead of software, and right now that is in a way a good thing. You don't want software lagging on your computer because it's not powerful enough, right? You'd rather have the extra juice to power any software with ease.

Hey guys!! According to Best Buy Canada, the expected warehouse delivery date is December 5, 2011. Not going to check elsewhere, but just an FYI to all who asked.
Link


Tegra 3 = Beast!

Wow, just reading this and watching the video got me really excited!
Quote: "...its benchmark puts Kal-El at a higher performance bracket than even Intel's Core 2 Duo full-on-PC processors."
Enjoy: http://pocketnow.com/android/nvidia-quad-core-kal-el-in-android-devices-this-summer
I guess my next phone will be somewhere on par with my [email protected] - nah, not quite, but still impressive.
It's freakin' ridiculous, isn't it? I can't imagine how powerful Wayne, Logan, or even Stark will be.
By the way, those are the architectures coming after Kal-El as seen in the roadmap here
http://www.anandtech.com/show/4181/...-a9s-coming-to-smartphonestablets-this-year/1
Can't wait for my Q6600 to have a little brother as well.
dreadlord369 said:
Its freakin ridiculous isn't it, I can't imagine how powerful wayne, logan, or even stark will be.
By the way, those are the architectures coming after Kal-El as seen in the roadmap here
http://www.anandtech.com/show/4181/...-a9s-coming-to-smartphonestablets-this-year/1
Click to expand...
Click to collapse
Wow, sick! I had a feeling the technology was gonna explode once dual core started being implemented into phones, but this is just ridiculous. I wonder which C2D they are comparing it to though. Can't wait to play some Crysis on my phone!!
omg it looks so cool!
It's a lie, ARM cannot beat Intel dual-core CPUs for the next three years.
It might be better than a dual-core Atom...
Sent from my LG-SU660 using XDA App
Uhh, three years is too long if they haven't already beaten some dual-core chips, at least that's what I think... especially since the Kal-El and OMAP 5 CPUs and whatever Qualcomm has planned are gonna be freaking awesome!
OMG!!!! It's amazing.
Mobile phones are better than my first PC.
Well, since Nvidia is supposedly releasing quad core in Q4 of this year, I say that computers will probably eventually die out. Especially since this year smartphone sales beat computer sales... just a thought.
HTC HD2 w/ 2.3 : )
CTR01 said:
Well since nvidia is supposedly releasing quad core in q4 of this year I say that computers will prolly eventually die out. Especially since this year smartphone sales beat computers...just a thought
HTC HD2 w/ 2.3 : )
Click to expand...
Click to collapse
Funny you mention that, I was just in uni talking about networking (my major) and technology and a classmate said the same thing. I would say it could happen in maybe 20+ years.
I would like to see a Tegra 3 rendering a complex 3D scene or something like that, which would really show its performance.
Is this the Q6600 club or what? <3
Sent from my HTC Vision using Tapatalk
I have an Athlon X3 435 at 3.6 GHz. It can go up to 3.8 GHz as well, but that needs too much vcore.
Although they are saying these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it is available" thing for the app devs, therefore negating any positive improvements in battery life?
icecold23 said:
Although they are saying these newer processors are supposed to be much more efficient, are these dual and quad core processors going to be a viable option with the today's battery technology?
Or is it going to be more of a "use it if it is available" for the app devs, and therefore negating any positive improvements in battery life?
Click to expand...
Click to collapse
Yeah, I don't think the battery technology is on par, or evolving on par, with the processors. At this rate, we'll have "stationary" tablets with the current battery technology.
icecold23 said:
Although they are saying these newer processors are supposed to be much more efficient, are these dual and quad core processors going to be a viable option with the today's battery technology?
Or is it going to be more of a "use it if it is available" for the app devs, and therefore negating any positive improvements in battery life?
Click to expand...
Click to collapse
It just really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP, it's the vcore of the core and the frequency the cores run at. But really, I can't see CPUs going any other way but multicore or multithread. It's more efficient for power and performance to have 2 cores running at 1 GHz each, instead of having a CPU at 2 GHz that will have a higher TDP and vcore to keep it stable. If the cores are a smaller die size, then it works out perfectly.
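To put rough numbers on that "2 cores at 1 GHz vs 1 core at 2 GHz" argument, here is a quick C sketch using the textbook dynamic-power relation P = C x V^2 x f. The capacitance and voltage figures are invented purely for illustration, not measured values from any real SoC:

Code:
#include <stdio.h>

/* Dynamic CPU power approximation: P = C * V^2 * f.
 * Energy for a fixed amount of work = P * time.
 * All numbers below are illustrative assumptions, not real chip data. */
int main(void) {
    const double C = 1.0e-9;      /* effective switched capacitance (F), assumed */
    const double work = 2.0e9;    /* total cycles the task needs */

    /* One core at 2 GHz needs a higher voltage to stay stable. */
    double f1 = 2.0e9, v1 = 1.2;
    double t1 = work / f1;
    double e1 = C * v1 * v1 * f1 * t1;

    /* Two cores at 1 GHz each, lower voltage, work split evenly. */
    double f2 = 1.0e9, v2 = 0.9;
    double t2 = (work / 2.0) / f2;            /* same wall-clock time */
    double e2 = 2.0 * C * v2 * v2 * f2 * t2;  /* both cores together */

    printf("1 x 2 GHz : %.2f s, %.3f J\n", t1, e1);
    printf("2 x 1 GHz : %.2f s, %.3f J\n", t2, e2);
    return 0;
}

With these assumed voltages the split workload finishes in the same wall-clock time but uses roughly 40% less energy, which is the point being made above; in practice the gain depends on how well the work actually splits across threads.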
vbetts said:
It just really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP, it's the vcore of the core and the frequency the cores run at. But really, I can't see cpus going any other way but multicore or multithread. It's more efficient for power and performance to have 2 cores running at 1 ghz each, instead of having a cpu at 2ghz that will have a higher tdp and vcore to keep it stable. If the cores are a smaller die size, then it works out perfectly.
Click to expand...
Click to collapse
Agreed... while I'm no CPU expert, I do know the basics and did a little reading that agrees with vbetts. They said the 4-core Kal-El Nvidia CPU is supposed to have ~12 hours of playback for HD video... at least that's what someone on a thread of mine posted...
vbetts said:
It just really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP, it's the vcore of the core and the frequency the cores run at. But really, I can't see cpus going any other way but multicore or multithread. It's more efficient for power and performance to have 2 cores running at 1 ghz each, instead of having a cpu at 2ghz that will have a higher tdp and vcore to keep it stable. If the cores are a smaller die size, then it works out perfectly.
Click to expand...
Click to collapse
Yeah exactly, I personally thought the move to dual core would come sooner, with two 500 MHz cores or lower, since that would still be better than a single 1 GHz chip.
Battery is definitely an issue; with today's technology I wonder how much these chips will consume at 100% load, or when playing a game which uses most of the device's grunt. On my DHD I can take it up to 1.9 GHz stable, and if I'm playing FPse at that frequency CurrentWidget shows consumption of around 425 mA, while at 1 GHz it's around 285 mA. That's quite a difference! So in order for these chips to be efficient, they shouldn't use much more battery than today's chips.
I love my Q6600, the max OC I could get was 4 GHz but it required 1.6 V vcore and on air that was HOT. Still, I made it into Windows and did some benching: 13.110s on a 1MB SuperPi / 1.5 XS mod
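Those CurrentWidget readings can also be looked at as charge per job rather than instantaneous draw. The small sketch below assumes a purely CPU-bound task whose runtime scales linearly with clock speed and ignores screen/baseline drain, so treat it as an illustration only:

Code:
#include <stdio.h>

int main(void) {
    /* Figures quoted in the post above (Desire HD running FPse). */
    double i_high = 425.0, f_high = 1.9;   /* mA at 1.9 GHz */
    double i_low  = 285.0, f_low  = 1.0;   /* mA at 1.0 GHz */

    /* Assume a fixed job that takes 1 hour at 1.0 GHz and scales with clock. */
    double t_low  = 1.0;                   /* hours */
    double t_high = t_low * (f_low / f_high);

    printf("charge at 1.0 GHz: %.0f mAh\n", i_low * t_low);
    printf("charge at 1.9 GHz: %.0f mAh\n", i_high * t_high);
    return 0;
}

On those assumptions the overclocked run actually drains less charge per job because it finishes sooner ("race to idle"), though for a game that runs for a fixed play session the higher draw simply costs more battery.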
CTR01 said:
agreed...while im no cpu expert i do know the slight basics and did a little reading that agrees with vbetts. they said the 4 core kal-el nvidia cpu is supposed to have ~12 hours of play back for hd video...least thats what someone on a thread of mine posted...
Click to expand...
Click to collapse
Yeah I read that as well or heard it somewhere.
Yeah exactly, I personally thought the move to dual core would be sooner with a 2 500mhz core cpu or lower since that would still be better then a single 1Ghz chip.
Click to expand...
Click to collapse
If the app is multithread-capable, then yes, easily better. But two 500 MHz cores would probably be best for multitasking.
Battery is definitely a issue, with today's technology I wonder how much these chips will consume at 100% load or when playing a game which use most of the devices grunt. On my DHD I can take it up to 1.9Ghz stable and if i'm playing FPse while at that frequency current widget show consumption of around 425mA, while at 1Ghz it's around 285mA. That's quiet a difference! So in order for these chips to be efficient they shouldn't use much more battery then todays chips.
Click to expand...
Click to collapse
Battery has always been an issue though, even on my old Moment the battery sucked. But that's what you get with these I guess. But man, 1.9 GHz stable from 1 GHz! For a small platform, that's pretty damn impressive.
I love my Q6600, max OC I could get was 4Ghz but it required 1.6v Vcore and on air that was HOT. Still I made it into Windows and did some benching, 13.110s on a 1MB SuperPI/1.5 XS mod
Click to expand...
Click to collapse
Ouch, how long did the chip last? I've got my 435 at 1.52vcore. I can go higher but I need this chip to last me for a year or so.
I can't believe that they set the time-frame for the release as early as they did. Hopefully they will live up to this standard

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel mailing list. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP - it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the 2 board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. However, the original submitter had already deleted the entries, but thanks to Google Cache I found the results.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, although the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554MP4 in the A6X is still the faster GPU, with Tegra 4 and Mali-T604 being an almost equal 2nd. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
UPDATE - Just to add to the feeling that this is legit: the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. Buildbrain, according to an Nvidia job posting, is "a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I've posted the webcache links to the GLBenchmark pages below; if they disappear from the cache, I've saved a copy of the webpages, which I can upload. Enjoy
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been shown too; the only thing that is left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when that would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any Gpu that could outperform iPad4 before iPad5 comes out? adreno 320, t Mali 604 now tegra 4 aren't near it. Qualcomm won't release anything till q4 I guess, and tegra 4 has released too only thing that is left is I guess is t Mali 658 coming with exynos 5450 (doubtfully when it would release, not sure it will be better )
Looks like apple will hold the crown in future too .
Click to expand...
Click to collapse
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X: the trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so they can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination; as an Android fan it makes me sad:crying:
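The throttling behaviour described above can be pictured as a tiny governor loop that keeps the SoC inside a fixed power budget by dropping the CPU clock whenever CPU + GPU power would exceed it. A toy model in C, with invented power constants rather than anything measured on the Exynos 5250:

Code:
#include <stdio.h>

#define BUDGET_W 4.0   /* assumed SoC power budget, as in the Anandtech test */

/* Very rough model: CPU power scales with frequency. Invented constants. */
static double cpu_power(double freq_ghz) { return 2.35 * freq_ghz; }

int main(void) {
    double freqs[] = {1.7, 1.4, 1.2, 1.0, 0.8};   /* available CPU steps, GHz */
    double gpu_power = 2.1;                       /* GPU busy with a game, assumed */

    /* Pick the highest CPU step that still fits under the budget. */
    for (int i = 0; i < 5; i++) {
        double total = cpu_power(freqs[i]) + gpu_power;
        if (total <= BUDGET_W) {
            printf("GPU at %.1f W -> CPU throttled to %.1f GHz (%.1f W total)\n",
                   gpu_power, freqs[i], total);
            return 0;
        }
    }
    printf("no CPU step fits the budget\n");
    return 0;
}

With the GPU loaded, the only step that fits this made-up budget is 0.8 GHz, which mirrors the 1.7 GHz to 800 MHz drop described in the post.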
32 fps is a no-go... let's hope it's not final.
hamdir said:
32fps is no go...lets hope it's not final
Click to expand...
Click to collapse
It needs to be better, but it would be OK for a new Nexus 7.
Still fast enough for me, I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm taking about phones here ... But the iPhone 5 GPU and adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
Click to expand...
Click to collapse
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
Click to expand...
Click to collapse
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91, the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
Its interesting how even though nvidia chips arent the best we still get the best game graphics because of superior optimization through tegra zone. Not even the a6x is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
Click to expand...
Click to collapse
What sort of 'optimisation' do you mean? Unoptimised games lag, which is a big letdown, and Tegra effects can also be used on other phones with Chainfire 3D. I use it and Tegra games work without lag, with effects, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly, for the world's dominant graphics card provider.
In the first Tegra 2, the GPU was a little bit better than the SGX540 of the Galaxy S in benchmarks, but lacked NEON support.
In the second one, Tegra 3, the GPU is nearly the same as the old Mali-400MP4 in the Galaxy S2/original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 also; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimization porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no; games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to work on other devices. Equally, Chainfire is now outdated and no longer updated.
Now, about PowerVR: they are only better in a real multicore configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multicore with each core having its own subcores/shaders. If Tegra were used in a real multicore configuration it would destroy all.
Finally, this is really funny, all this doom and gloom because of an early, discarded development board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any type of final device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe only 2x the T3 scores with 600% more shaders!!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true: Samsung has licensed PowerVR, it isn't just stuck to Apple; it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone against a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs; if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory rather than the single channel on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage?, well basically money, for a start iSheep will pay more for their devices, so they afford to design a big SoC and big batteries that may not be profitable to other companies. Tegra 4 is listed as a 80mm2 chip, iPhone 5 is 96mm2 and A6X is 123mm2, Apple can pack more transistors and reap the GPU performance lead, also they chosen graphics supplier Imagination Technologies have excellent products, Power VR Rogue will only increase Apple's GPU lead. They now have their own chip design team, the benefit for them has been their Swift core is almost as powerful as ARM A15, but seems less power hungry, anyway Apple seems to be happy running slower CPUs compared to Android. Until an Android or WP8 or somebody can achieve Apple's margins they will be able to 'buy' their way to GPU domination, as an Android fan it makes me sad:crying:
Click to expand...
Click to collapse
Well said, mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on CPU than GPU.
If they don't stop soon and continue with this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2, Exynos 4 at 1.2 GHz and a 400 MHz overclocked Mali GPU.
In my analysis, most modern games like MC4 and NFS:MW aren't running at 60 FPS at all; that's because the GPU always has a 100% workload while the CPU is relaxing there, outputting 50-70% of total CPU workload.
I know some games aren't optimized for all Android devices as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T-604 with some tweaks instead of the mighty T-658, which is still slower than the iPad 4.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage?, well basically money, for a start iSheep will pay more for their devices, so they afford to design a big SoC and big batteries that may not be profitable to other companies. Tegra 4 is listed as a 80mm2 chip, iPhone 5 is 96mm2 and A6X is 123mm2, Apple can pack more transistors and reap the GPU performance lead, also they chosen graphics supplier Imagination Technologies have excellent products, Power VR Rogue will only increase Apple's GPU lead. They now have their own chip design team, the benefit for them has been their Swift core is almost as powerful as ARM A15, but seems less power hungry, anyway Apple seems to be happy running slower CPUs compared to Android. Until an Android or WP8 or somebody can achieve Apple's margins they will be able to 'buy' their way to GPU domination, as an Android fan it makes me sad:crying:
Click to expand...
Click to collapse
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware, anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple, just that Samsung prefers using ARMs GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone compared a Tegra 4 which will arrive in a phone. If you check OP link the benchmark was posted on the 3rd of January with different results (18fps then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra as Nvidia is pushing developers to make more better games for Android compared to the 'geeks' *cough* who prefers benchmark results, whats the point of having a powerful GPU if the OEM isn't pushing developers to create enhance effect games for there chip.
Hamdir is correct about the GPUs, if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine it only being 2x faster than Tegra 3. Plus its a 28nm (at around 80mm2 just a bit bigger than Tegra 3, smaller than A6 90mm2) along with the dual memory than single on Tegra 2/3.
Click to expand...
Click to collapse
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA, is that even an insult? Next you'll be asking what I deadlift:laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations for a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), omg Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationship. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle & physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games / benchmarks will scale in linear fashion, as other factors such as TMU / ROP throughput can bottleneck performance. Nvidia's Technical Marketing Manager, when interviewed at CES, said that the overall improvement in games / benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards:
https://www.youtube.com/watch?v=iC7A5AmTPi0
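The point about TMU/ROP throughput capping the benefit of 6x the shaders can be shown with a back-of-the-envelope model where frame rate is limited by whichever unit is slowest. The ratios below are invented for illustration, not real Tegra figures:

Code:
#include <stdio.h>

static double min2(double a, double b) { return a < b ? a : b; }

int main(void) {
    /* Hypothetical per-frame capability, normalised so Tegra 3 = 1.0. */
    double t3_shader = 1.0, t3_raster = 1.0;   /* shader ALU vs TMU/ROP throughput */
    double t4_shader = 6.0, t4_raster = 2.5;   /* 6x the shaders, smaller raster gain */

    /* The slowest stage sets the frame rate. */
    double t3_fps = 20.0 * min2(t3_shader, t3_raster);
    double t4_fps = 20.0 * min2(t4_shader, t4_raster);

    printf("Tegra 3 model: %.0f fps\n", t3_fps);
    printf("Tegra 4 model: %.0f fps  (%.1fx, not 6x)\n", t4_fps, t4_fps / t3_fps);
    return 0;
}

Under these made-up ratios the 6x shader count only buys a 2.5x frame rate, which is roughly the sort of non-linear scaling being argued about above.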
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Click to expand...
Click to collapse
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
hamdir said:
tegra optimization porting no longer works using chainfire, this is now a myth
did u manage to try shadowgun thd, zombie driver or horn? the answer is no, games that use t3 sdk for physx and other cpu graphics works can not be forced to work on other devices, equally chainfire is now outdated and no longer updated
Click to expand...
Click to collapse
Looks like they haven't updated Chainfire 3D for a while; as a result only T3 games don't work, but others do work: Riptide GP, Dead Trigger etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I think I might have been unfortunate too, but some Gameloft games lagged on the Tegra device that I had, though root solved it to an extent.
I am not saying one thing is superior to another, just relating my personal experience; I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or where you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
Click to expand...
Click to collapse
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral on Google's/Apple's ecosystems, and not wiping any scratches off my devices.

8-cores really needed?

8 cores in a mobile device is clearly a first and Samsung were likely to be the company to bring it to us BUT what's the point?
Is it just me who sees that as excessive?
There's not a chance that we'll even TRULY feel the power of the extra 4 cores. 4 cores aren't generally needed in other phones, as there's only a small handful of games which actually use all that power. Even in those games, the most you're gonna get is a very slight amount of extra performance.
Technology is still a few years away from needing 8 cores in a phone so really there's no point as the end user won't feel a difference in the slightest.
I'll admit it's a clever gimmick to sell more but, in my opinion, that's all it is.
What do others think?
This has been discussed to death: the point is power efficiency without losing peak performance. That's the one and only reason this exists. You will have excellent battery life in everyday tasks because of it.
Samsung is mainly concentrating on specifications... in order to match the Xperia Z's waterproofing and the HTC One's features, the only thing they can do is stick with ultra-high specs. Looks-wise it's basically a Galaxy Note 2, no doubt, so maybe specs will be their primary concern...
The S3 does a more than good enough job in that regard though, and there's only gonna be a certain amount of room for improvement. A whole 4 extra cores will leave the vast majority of its power wasted. 6 cores, fair enough (to an extent), but not that many.
Samsung always tries to set a new standard regarding specs.
So they try to push other mobile manufacturers to do the same.
Also, I think they use an 8-core so the S4 doesn't become outdated that quickly.
Plus, it can actually be battery-saving to have an 8-core over a 4-core CPU.
But this is just my opinion.
KidCarter93 said:
The S3 does a more than good enough job in that regard though and there's only gonna be a certain amount of space for improvement. A whole 4 cores extra will leave purge majority of its power wasted. 6 cores, fair enough (to an extent) but not that many.
Click to expand...
Click to collapse
You can't compare A9 cores with A15 cores.
A15 cores use slightly more power (that's the biggest problem with A15 cores).
It is not a "real" 8-core CPU. It consists of two quad cores that alternate. One is a high end CPU that blows away previous hardware. The other is a low powered but efficient CPU that literally sips battery life. The phone alternates between the two based on the task at hand. Basically, you will never be using more than 4-cores and that is by design.
Kresge said:
It is not a "real" 8-core CPU. It consists of two quad cores that alternate. One is a high end CPU that blows away previous hardware. The other is a low powered but efficient CPU that literally sips battery life. The phone alternates between the two based on the task at hand. Basically, you will never be using more than 4-cores and that is by design.
Click to expand...
Click to collapse
Thanks Kresge - that's good to know. I wasn't sure exactly how they were going to be utilizing the 8 cores, but what you just said makes complete sense. :good:
jaykresge said:
It is not a "real" 8-core CPU. It consists of two quad cores that alternate. One is a high end CPU that blows away previous hardware. The other is a low powered but efficient CPU that literally sips battery life. The phone alternates between the two based on the task at hand. Basically, you will never be using more than 4-cores and that is by design.
Click to expand...
Click to collapse
So you will use the power-efficient cores most of the time (which is actually more battery-friendly than the CPU in the S3), and use the super-ultra high-end CPU when you need an incredible amount of power?
Could 8 cores finally push the PPSSPP (PSP) emulator to work better? I know the app is still in beta, but I tried it on a tablet with a quad-core Tegra 3 clocked at 1.4 GHz and the frames froze so much the game was unplayable. If anyone has tried God of War in the emulator you know what I mean. lol.
Sent from my R800i using xda premium
cyrusalmighty said:
Could 8 cores finally push the ppsspp(PSP) emulator to work better? I know the app is still beta but I tried it on Tablet with Quad core Tegra 3 clocked at 1.4ghz & the frames froze so much the game was unplayable. If anyone has tried God of War in the emulator you know what I mean.lol.
Sent from my R800i using xda premium
Click to expand...
Click to collapse
Why don't you get that the phone is not 8 cores? People need to stop with this. It's like having two phones in one: one for browsing and low CPU usage, the other one for heavy gaming etc. It's not 8 cores running at the same time.
Sent from my GT-I9300 using xda premium
lol, so many people in here just refuse to read.
@OP The 8 cores are not meant for power, they are meant to save it. The power-hungry A15s will drain more battery whereas the A7s use considerably less; the A15s will only be used when necessary, meaning you will be saving a lot of battery.
I have a feeling we're gonna be addressing this a lot
PHONE SLOW CLICK ME?
1 days 2 s4
---------- Post added at 10:35 AM ---------- Previous post was at 10:35 AM ----------
See the part on arm big.LITTLE
http://forum.xda-developers.com/showthread.php?t=2191690
PHONE SLOW CLICK ME?
1 days 2 s4​
TingTingin said:
@OP The 8 cores are not meant for power they are meant to save it the power hungry a15s will drain more battery whereas the a7s use considerably less the a15s will only be used when necessary meaning u will be saving alot battery
I have a feeling were gonna be addressing this alot
PHONE SLOW CLICK ME?
1 days 2 s4
---------- Post added at 10:35 AM ---------- Previous post was at 10:35 AM ----------
See the part on arm big.LITTLE
http://forum.xda-developers.com/showthread.php?t=2191690
PHONE SLOW CLICK ME?
1 days 2 s4​
Click to expand...
Click to collapse
Indeed you are. It's basically like having an MTK MT6589 chipset for low-demanding applications and a Snapdragon 600 for high-demanding applications on one die.
No doubt they're going to make this a selling point though.​
Even quad core for a phone is insane overkill lol, but for the latest demanding 3D games it's sure needed.
I did a write-up on this so we can hopefully avoid these discussions for the rest of the year.
mortuus82 said:
even quad cores for a phone is insane overkill lol, but for the latest demanding 3d games its sure needed.
Click to expand...
Click to collapse
Read through the thread
PHONE SLOW CLICK ME?
1 days 2 s4​
It is not an octa-core phone by any means if you call the Tegra 3 a quad-core. If Samsung dares to call this thing (4+4) octa-core, then the Tegra 3/4 (4+1) is penta-core.
The S4 (international) has 4 A7 cores clocked at (1.2 GHz?) and another 4 A15 cores at 1.6 GHz. The two sets of cores can never run at the same time. This is the big.LITTLE idea - the A7 cores are more efficient and are used for light tasks, while the A15 cores are power-hungry but have more horsepower. So it is designed to save energy compared to running only 4 A15 cores.
However, whether this or Snapdragon's Krait saves more energy is still unknown as of now. Benchmarks have shown the S4 having very similar performance to the HTC One (w/ Snapdragon 600).
For me, Samsung took this approach for two reasons: power saving (as said above) and a marketing gimmick. Uninformed people may well scream "Wow, 8 cores on a phone? That must be powerful!" after watching their ads, so the S4 may sell more.
jaykresge said:
Basically, you will never be using more than 4-cores and that is by design.
Click to expand...
Click to collapse
That is the type of processing they have chosen.
But with custom kernels which have HMP support you will be able to use any number of cores (up to 8) in any order.
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
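If a kernel really did expose all eight cores like that, userspace could already steer work onto a chosen core with the standard Linux affinity call. A minimal sketch (core number 4 is just an arbitrary example; whether cores 4-7 are even visible depends entirely on the kernel build):

Code:
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(4, &set);   /* ask to run on CPU 4 only (hypothetically an A15 core) */

    /* pid 0 = the calling thread; fails if that CPU is offline or hidden. */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }
    printf("now restricted to CPU %d\n", sched_getcpu());
    return 0;
}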
i9100g user said:
That is the type of processing they have chosen.
But with custom kernels which have hmp support you will be able to use any number of (upto8) in any order
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
Click to expand...
Click to collapse
I was thinking that
PHONE SLOW CLICK ME?
1 days 2 s4​

[NEWS]Samsung announced the Heterogeneous Multi-Processing (HMP) for Exynos 5 Octa

http://www.samsung.com/global/busin...eterogeneous_Multi_Processing_Capability.html
This is great breaking news for us, though it's still unclear if it's for the 5420 only or can be brought to the 5410 through an update as well.
If you've been paying attention, you know that the Exynos 5 Octa packs a serious punch when it comes to processing power and energy efficiency. Now, the team at Samsung has made the Exynos 5 Octa even better with the introduction of a new Heterogeneous Multi-Processing (HMP) solution.
Overview of big.LITTLE Technology
Before we jump into the benefits of HMP, let's take a step back and go over the basics of ARM® big.LITTLE technology. In the Exynos 5 Octa, eight CPU cores are responsible for everything from browsing the web to playing your favorite game on your 5 Octa-powered mobile device. Four "big" 1.8GHz ARM® Cortex™-A15 cores handle intensive tasks like graphically rich gaming or HD video playback. Less intensive tasks like e-mail or text functions are tackled by four "LITTLE" 1.3GHz Cortex™-A7 cores. By dividing and conquering tasks and assigning them to the proper CPU cores, big.LITTLE technology maximizes performance while minimizing power loss.
HMP Makes big.LITTLE Technology Even Better
Now this is where HMP comes into play. Like a sports team, big.LITTLE technology relies on a software "coach" to call the plays and assign tasks to each core. In a basic implementation of big.LITTLE technology, this "coach" would alternate between using "big" and "LITTLE" CPU cores based on the computational intensity of any given task, and one core or cluster of cores would remain inactive while its counterpart was engaged.
With HMP, all eight of the CPU cores in the Exynos 5 Octa can be utilized at the same time. This provides users with an unlimited mobile experience in the current mobile environment and also paves the way for more advanced and complex functionality in the future. HMP is extremely versatile. Using a global load balancing scheduler, HMP can assign a single core to handle a task with low computational intensity in order to maximize power efficiency. On the flipside, HMP can also simultaneously utilize each of the eight individual cores in the 5 Octa to run multiple tasks in real time. The global load balancing scheduler pays attention to user workloads and will pull in the necessary available resources for the system to run flexibly and efficiently. By analyzing and assigning tasks, this highly complex software system maximizes efficiency by balancing CPU workload.
The result is the most advanced use of big.LITTLE technology to date and a huge leap forward for multi-processing capability in mobile devices. By allowing for the simultaneous operation of both "big" and "LITTLE" cores in the Exynos 5 Octa, Samsung offers an optimized HMP solution to the balancing act of maximizing mobile device capability while minimizing power loss.
Samsung has always been a leader in big.LITTLE technology, and this new Octa-core HMP solution is an industry first. HMP sets the stage for the future as mobile devices are increasingly called upon to handle complex and graphically rich tasks. Through this innovative solution, the benefits of big.LITTLE technology are maximized to their full potential. Get ready, because the future of mobile processing is evolving, and the Exynos 5 Octa with HMP is leading the way.
Click to expand...
Click to collapse
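The "global load balancing scheduler" in that announcement boils down to: every task can land on any of the eight cores, with heavy tasks preferring A15s and light ones A7s, and all cores usable at once. A toy version of that idea in C (thresholds and task weights are invented, and this is nothing like the real Linux scheduler):

Code:
#include <stdio.h>

#define NCORES 8   /* cores 0-3 = A7 (LITTLE), cores 4-7 = A15 (big) */

int main(void) {
    double core_load[NCORES] = {0};
    /* Task "intensity" values, invented for illustration. */
    double tasks[] = {0.1, 0.9, 0.2, 0.8, 0.15, 0.7, 0.05, 0.6, 0.3, 0.95};
    int ntasks = sizeof(tasks) / sizeof(tasks[0]);

    for (int t = 0; t < ntasks; t++) {
        /* Heavy tasks go to the big half, light tasks to the LITTLE half... */
        int lo = tasks[t] > 0.5 ? 4 : 0;
        int best = lo;
        for (int c = lo; c < lo + 4; c++)       /* ...onto the least-loaded core. */
            if (core_load[c] < core_load[best]) best = c;
        core_load[best] += tasks[t];
        printf("task %.2f -> core %d\n", tasks[t], best);
    }
    return 0;
}

Unlike the cluster-switching model earlier in the thread, both halves of the core array fill up at the same time here, which is exactly what HMP adds.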
Well, it's definitely about the 5420, because a 1.8 GHz A15 / 1.3 GHz A7 setup is not the 5410 but the 5420. Sad.
hypergamer1231 said:
well its definately about the 5420 because 1.8ghz a15 and 1.3 ghz a7 is not 5410 but 5420 sad
Click to expand...
Click to collapse
You've got a point there; however, I don't think that's conclusive, as when Sammy does their public announcements they will (of course) talk about the better things. Everything they talk about is "up to".
Anyway, I'm not sure about this. I know @AndreiLux mentioned that core migration is impossible because of flawed hardware, but HMP should be another story; I guess mostly it's about the Linux kernel. Let's just wait and see.
Hopefully they will release a refreshed version of the i9500? Probably not but here's hoping.
@AndreiLux sorry to disturb you, but I just want to clarify a few things, because as we all know you're the master of this Exynos thingy...
"Samsung will enable the HMP solution for its Exynos 5 Octa in Q4 2013."
source: " http://www.sammobile.com/2013/09/10...ocessing-will-be-integrated-in-exynos-5-octa/ "
Is it for the i9500, or is it just for the new Note 3 powered by the Exynos 5420?
gdonanthony said:
@AndreiLuxis it for i9500 or it is just for the new Note 3 powered by Exynos 5420 ?
Click to expand...
Click to collapse
Most likely neither of them, since they said they will start using it in Q4 2013... and because having all 8 cores online and being able to use them all simultaneously means a considerable increase in power consumption at full load, my best guess is that the first product using this implementation will be a tablet...
AvelonTs said:
most likely none of them since they said they will start using it in Q4 2013 ...and because having online all 8 cores and being able to use them all simultaneously mean considerable increase in power consumption at full load my best guess is that the first product using this implementation would be a tablet ...
Click to expand...
Click to collapse
exactly.
hypergamer1231 said:
well its definately about the 5420 because 1.8ghz a15 and 1.3 ghz a7 is not 5410 but 5420 sad
Click to expand...
Click to collapse
The 5420 runs at 1.9 GHz & 1.3 GHz in the Note 3; on the other hand, the 5410 can run up to 1.8 GHz & 1.3 GHz, as it does in the Korean variant GS4.
Hmmm...
AvelonTs said:
...and because having online all 8 cores and being able to use them all simultaneously mean considerable increase in power consumption at full load my best guess is that the first product using this implementation would be a tablet ...
Click to expand...
Click to collapse
You don't understand power management.
POLO_i780 said:
5420 runs at 1.9GHz&1.3GHz in the Note 3, on the other hand, the 5410 can run up to 1.8GHz&1.3GHz as it does in the Korean variant GS4.
Hmmm...
Click to expand...
Click to collapse
The Korean version has the same clocks as the international.
And no, this HMP story only concerns 5420 products. The 5410 is dead.
Oh?
Unless SamMobile is mistaken too:
http://www.sammobile.com/2013/04/10...exynos-5-octa-kills-qualcomms-snapdragon-600/
I was always under the impression the Korean Exynos GS4 ran at a max of 1.8 GHz, hence why it was capable of scoring 31,000+ in AnTuTu.
AndreiLux said:
You don't understand power management.
The Korean version has the same clocks as the international.
And, no, this is HMP story only concerns the 5420 products. The 5410 is dead.
Click to expand...
Click to collapse
Thanks for the clarification.
You mean the 5410 is dead, so there is no hope for it?
I'm slowly accepting this very bad news now.
AndreiLux said:
You don't understand power management.
Click to expand...
Click to collapse
I do not... but pure logic says having 8 cores working at max frequency all together requires more power than having 4 of them... or any other combination of them... or is that not possible even with an HMP-enabled SoC?
AvelonTs said:
I do not ... but pure logic says having 8 cores working at max freq all together requires more power than having 4 of them ... or any other compination of them ... or that's not possible even with an HMP enabled SoC?
Click to expand...
Click to collapse
Ohh. And why do you think normal stuff would require anything more than the A7 cores? You understand that most of the apps on the Play Store would not require even 2 A5 cores at max, logically, or 4 A15s at half frequency. So it doesn't matter that all 8 cores are online; it only depends on how you use it.
For a power user it's the cherry on the cake; for others it is great news for battery life.
Degrated Shadow said:
Ohh. And why do you think some normal stuff would require anything more than a7 cores.. u understand that most off the apps on playstore would not require even 2 a5 cores at max logically or 4 a15s at half frequency.. so it doesnt matter that all 8 cores are online.. it only depends on how you use it.
For a power user its cherry over cake for others it is a great news for battery life..
Click to expand...
Click to collapse
Low multi-threadedness is a myth or a product of stupid programming. For one, I'm making an application right now which parallelizes the whole UI loading and can easily make use of 7+ threads; it scales pretty much infinitely if it's allowed to.
More cores don't cost any power, only die area. You gain tremendous dynamic range and power efficiency. Of course your theoretical peak power consumption goes up, but you can just cap that with power management.
The Moto X is a good example of how a dual-core SoC can be less power efficient than a quad-core.
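The parallel-UI-loading point is easy to picture: if the items to load are independent, you can hand one to each worker thread and the wall-clock time shrinks with however many cores the kernel gives you. A bare-bones pthreads sketch, where the busy loop is just a stand-in for real loading work:

Code:
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 8   /* one worker per available core, e.g. an HMP octa-core */

static void *load_item(void *arg) {
    int id = *(int *)arg;
    /* Stand-in for decoding an image, inflating a view, parsing data, ... */
    volatile double x = 0;
    for (long i = 0; i < 10 * 1000 * 1000; i++) x += i * 0.5;
    printf("item %d loaded\n", id);
    return NULL;
}

int main(void) {
    pthread_t workers[NTHREADS];
    int ids[NTHREADS];

    /* Dispatch all independent loading jobs, then wait for them. */
    for (int i = 0; i < NTHREADS; i++) {
        ids[i] = i;
        pthread_create(&workers[i], NULL, load_item, &ids[i]);
    }
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(workers[i], NULL);
    return 0;
}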
AndreiLux said:
Low multi-threadedness is a myth or a product of stupid programming. For one I'm making an application right now which parallelizes the whole UI loading and can make easily use of of 7+ threads scales pretty much infinitely if it's allowed to.
More cores don't cost any power, only die area. You gain tremendous dynamic range and power efficiency. Of course your theoretical peak power consumption goes up but you can just cap that with power management.
The Moto X is a good example of how a dual-core SoC can be less power efficient than a quad-core.
Click to expand...
Click to collapse
I understand. But as you quoted me, did I say the stuff wrong or right?
Sorry for the dumb question. Not so good at English.
Degrated Shadow said:
I understand. But going by the part you quoted, did I say it wrong or right?
Sorry for the dumb question, I'm not so good at English.
Click to expand...
Click to collapse
Wrong; see for yourself:
yahyoh said:
Woah
Click to expand...
Click to collapse
AndreiLux said:
Wrong; see for yourself :
Click to expand...
Click to collapse
Okay, understood. But honestly, if all 8 cores are used, how would anyone manage power efficiency?
The A15's lowest frequency consumes more power than the A7's peak power consumption. Isn't that so?
edit: I understood. More cores only cost die area; they won't cost any power unless they are actually used. What happens when more die area is occupied?
I don't think this will make a big difference in battery life. I think the display is the real power hog in every Android device, along with the Android system itself.
Smartphone makers should focus on making lighter and smaller batteries with better capacity.
Just my opinion...
Sent from my Nokia 3210 LTE
marc_ecko28 said:
Smartphone makers should focus on making lighter and smaller batteries with better capacity.
Click to expand...
Click to collapse
Even better would be no battery at all, with infinite power capacity.
Don't limit your wishes!
From Sammobile:
Chinese manufacturer Meizu has some good news. According to Meizu, HMP can be implemented through a software upgrade, and will be made available to the MX3, its current flagship that uses the same Exynos chip as the Galaxy S4. This means that we should see Samsung doing the same for its own devices, and it could be the reason why the Korean manufacturer is only showing off the new eight-core capability without naming a new Exynos chipset.
Click to expand...
Click to collapse
In short, Meizu is saying that HMP is possible on the 5410 Octa, since their MX3 uses that chip. So, is Meizu wrong?
http://www.sammobile.com/2013/09/11...-eight-core-devices-through-software-upgrade/
http://www.unwiredview.com/2013/09/...oftware-upgrade-will-get-to-current-products/

Exynos Note 3 possibly won't get HMP update from Samsung

Bad news for fellow Exynos users.
http://www.phonearena.com/news/Note...t-core-performance-patch-says-Samsung_id47977
It seems that the Exynos 5420 is capable of HMP, but the chip would get too hot to handle it.
The Samsung engineer does mention, though, that they would only comment on the HMP update after a complete testing process to ensure trouble-free operation.
Note 3 and Galaxy S4 are unlikely to receive the full octa-core power in their Exynos chipset versions, advised a chief technical expert from Samsung's Mobile Solutions department. Recently the company said that it is able to unleash all the eight cores working at once, which can bump performance significantly, compared to the maximum of four cores restriction we have now with Exynos 5 Octa in these handsets.
The thing is, the engineer comments, that even though Samsung can release a software patch that will allow both the quad-core Cortex-A15 set, and the frugal Cortex-A7 cores, to get together for a task, the thermal envelope of these Exynos chips hasn't been cut for the job.
Click to expand...
Click to collapse
I think this is a good move; they can't force this and overheat the phones.
I'm tempted to buy the Exynos one here in Saudi, as they are cheaper than the Snapdragon ones, even without the 4K part....
system.img said:
Bad news for fellow Exynos users.
http://www.phonearena.com/news/Note...t-core-performance-patch-says-Samsung_id47977
It seems that the Exynos 5420 is capable of HMP, but the chip would get too hot to handle it.
Click to expand...
Click to collapse
Out of the three aspects of big.LITTLE (cluster migration, core migration, HMP), core migration is the most important one. Since the hardware is fixed, any custom kernel for the Exynos Note 3 might implement core migration, which would surely improve battery efficiency.
bala_gamer said:
Out of the three aspects of big.LITTLE (cluster migration, core migration, HMP), core migration is the most important one. Since the hardware is fixed, any custom kernel for the Exynos Note 3 might implement core migration, which would surely improve battery efficiency.
Click to expand...
Click to collapse
So the Exynos 5420 has core migration instead of cluster migration?
bala_gamer said:
Out of the three aspects of big.LITTLE (cluster migration, core migration, HMP), core migration is the most important one. Since the hardware is fixed, any custom kernel for the Exynos Note 3 might implement core migration, which would surely improve battery efficiency.
Click to expand...
Click to collapse
Nah, HMP is the real thing. As AndreiLux said:
HMP is extremely useful for power efficiency because you can migrate stupidly faster than DVFS allows you to.
Click to expand...
Click to collapse
Sent from My GT i9300
If it's true, then it's a very disappointing step from Samsung.
My question is: if there are so many problems with that chip, then why is Samsung selling these faulty chips... and why in India and in smaller Asian and African countries, where governments don't do anything against this kind of behaviour... and customers also ignore these issues, which is not good for the future.
1. many problems in the S4
2. SIM locking
3. knockon
....many more, and some may be coming soon
And now no HMP update for the S4 or Note 3... it seems Samsung has become overconfident because of its market share and profits...
Isn't this a kind of monopoly???
ipsuvedi said:
If it's true, then it's a very disappointing step from Samsung.
My question is: if there are so many problems with that chip, then why is Samsung selling these faulty chips... and why in India and in smaller Asian and African countries, where governments don't do anything against this kind of behaviour... and customers also ignore these issues, which is not good for the future.
1. many problems in the S4
2. SIM locking
3. knockon
....many more, and some may be coming soon
And now no HMP update for the S4 or Note 3... it seems Samsung has become overconfident because of its market share and profits...
Isn't this a kind of monopoly???
Click to expand...
Click to collapse
Samsung wants to maximise profits by selling phones made with its in-house chip, but it is not able to integrate LTE into the chip. So all it can do is sell the phone with the Exynos chip in non-LTE areas.
But still, I think the Exynos is good too. Its CPU is damn powerful (considering a clock speed of only 1.9 GHz).
Thermal problems are a poor excuse considering cluster migration is going to be much worse for thermals.
It's also a poor excuse considering HMP is best for power efficiency, meaning overall temperatures should be lower.
Unless they're worried that people will benchmark every thread and burn the chip out, but all they need to do is add thermal throttling and frequency throttling under heavy A15 use.
I don't want to say Samsung are lazy, because that's simply a stupid thing to say. Obviously all of this is very difficult, and Samsung don't have the right combination and amount of time, talent and money to make it happen.
Core migration is going to be fine, anyway. If they ever bother with that.
Still disappointing anyway :/
Sent from my GT-N7100 using xda premium
SirCanealot said:
Thermal problems are a poor excuse considering cluster migration is going to be much worse for thermals.
It's also a poor excuse considering HMP is best for power efficiency, meaning overall temperatures should be lower.
Unless they're worried that people will benchmark every thread and burn the chip out, but all they need to do is add thermal throttling and frequency throttling under heavy A15 use.
I don't want to say Samsung are lazy, because that's simply a stupid thing to say. Obviously all of this is very difficult, and Samsung don't have the right combination and amount of time, talent and money to make it happen.
Core migration is going to be fine, anyway. If they ever bother with that.
Still disappointing anyway :/
Sent from my GT-N7100 using xda premium
Click to expand...
Click to collapse
They have the Linaro team to do the work. They have already accomplished it, but it has only been shown off on a prototype tablet.
ipsuvedi said:
If it's true, then it's a very disappointing step from Samsung.
My question is: if there are so many problems with that chip, then why is Samsung selling these faulty chips... and why in India and in smaller Asian and African countries, where governments don't do anything against this kind of behaviour... and customers also ignore these issues, which is not good for the future.
1. many problems in the S4
2. SIM locking
3. knockon
....many more, and some may be coming soon
And now no HMP update for the S4 or Note 3... it seems Samsung has become overconfident because of its market share and profits...
Isn't this a kind of monopoly???
Click to expand...
Click to collapse
Well, someone writes a news story without any confirmation and users take it as true.
So... now... I'm going to write some news... Santa is real! And everyone will believe it...
Too many sheep in the world.
big.LITTLE is above any other architecture... it has the best efficiency ever... and will keep it with HMP...
Yep, if you turn on all 8 cores at max frequency for 10 minutes... the phone burns... but the logic of big.LITTLE is: use the low-power cores (A7) for light tasks, use the high-power cores (A15) for heavy tasks.
Right now you are using cluster migration, which has 2 "bugs":
1) the whole cluster switches, so the A15s get powered up even when they're not necessary; it happens as soon as a single core needs an A15
2) the switch between the 2 clusters blocks computation for a short time
Core migration fixes the first problem of cluster migration.
HMP fixes both "problems".
So... if cluster migration is good, core migration is better, and HMP is better still.
HMP causes more overheating? Well, then the Note 3 with cluster migration would overheat even more... does everyone with a Note 3 have this issue?
For me... all of you use your brains too little... the brain is a muscle... use it!
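A toy illustration of those two policies (the little simulation and its numbers are invented; this is not Samsung's scheduler, just a way to show why "bug 1" matters):
Code:
public class BigLittleToy {
    public static void main(String[] args) {
        // Four runnable tasks: 0 = light (fine on an A7), 1 = heavy (wants an A15)
        int[] tasks = {0, 0, 0, 1};

        boolean anyHeavy = false;
        int heavyCount = 0;
        for (int t : tasks) {
            if (t == 1) { anyHeavy = true; heavyCount++; }
        }

        // Cluster migration: one heavy task drags the whole cluster over,
        // so all four A15s come up.
        int a15WithClusterMigration = anyHeavy ? 4 : 0;

        // HMP: only the heavy tasks land on A15s, the light ones stay on A7s.
        int a15WithHmp = heavyCount;

        System.out.println("A15 cores powered, cluster migration: " + a15WithClusterMigration);
        System.out.println("A15 cores powered, HMP:               " + a15WithHmp);
    }
}
Output is 4 versus 1 A15 cores for the same workload, which is exactly the first "bug" above.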
I do not find the thermal-envelope explanation for HMP logical. Cluster migration switches all the cores to the A15s even when only one thread requires power, hence that design is more battery hungry. If Samsung is really worried about crossing the thermal envelope, then they can implement something like what Intel does with Turbo Boost: effectively reduce the max clock speed to 1.5 GHz when all the A15s are running, but allow it to go to 1.9 GHz when only one or two threads are running.
If Samsung refuses to do so, I hope developers find ways to unlock HMP. It is not that I need 8 cores running simultaneously when my laptop hums along at 1.3GHz in dual-core mode, but when Samsung teases us and there is treasure hidden, ready for unlocking, it is just human nature to want MORE.
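A rough sketch of what such a rule could look like (the frequency steps below are invented for illustration, not Samsung's actual DVFS table):
Code:
public class BigCoreFrequencyCap {
    // The more A15 cores are busy, the lower the allowed ceiling.
    static int maxBigClockKHz(int activeA15Cores) {
        if (activeA15Cores <= 2) return 1_900_000; // 1 or 2 threads: full 1.9 GHz
        if (activeA15Cores == 3) return 1_700_000; // made-up intermediate step
        return 1_500_000;                          // all four A15s busy: cap at 1.5 GHz
    }

    public static void main(String[] args) {
        for (int cores = 1; cores <= 4; cores++) {
            System.out.println(cores + " A15 core(s) active -> cap "
                    + maxBigClockKHz(cores) / 1000 + " MHz");
        }
    }
}
Same idea as Intel's Turbo Boost, just inverted: instead of boosting a single busy core, you pull the ceiling down as more big cores wake up, so peak thermals stay inside the envelope.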
voice of my heart
Absolutely right, brother. The die-hard Snapdragon fans can't digest the Exynos big.LITTLE processing and just throw out rumors, and I am really shocked how people believe them. I saw in many discussions that the readers, and mostly the writers, were not even software- or hardware-literate; they were just repeating and forwarding the rumors. Actually, due to the lack of an SDK from Samsung for Exynos, third-party custom ROM writers cannot do as much on Exynos as they can on Snapdragon, so this makes them angry and they spread rumors. I have the Exynos Note 3 (SM-N900) and previously I had the Exynos S4. I never faced any problem on the S4, and it's my 4th day with the Note 3 and it's going great so far. In my thinking, the big.LITTLE processing is better than all 8 cores working at the same time, and people should note that the big A15 quad core at 1.9 GHz gave very, very close benchmark results to the Snapdragon 800 at 2.3 GHz, and some were higher. People don't understand the chip architecture and just play on rumors.
iba21 said:
Well, someone writes a news story without any confirmation and users take it as true.
So... now... I'm going to write some news... Santa is real! And everyone will believe it...
Too many sheep in the world.
big.LITTLE is above any other architecture... it has the best efficiency ever... and will keep it with HMP...
Yep, if you turn on all 8 cores at max frequency for 10 minutes... the phone burns... but the logic of big.LITTLE is: use the low-power cores (A7) for light tasks, use the high-power cores (A15) for heavy tasks.
Right now you are using cluster migration, which has 2 "bugs":
1) the whole cluster switches, so the A15s get powered up even when they're not necessary; it happens as soon as a single core needs an A15
2) the switch between the 2 clusters blocks computation for a short time
Core migration fixes the first problem of cluster migration.
HMP fixes both "problems".
So... if cluster migration is good, core migration is better, and HMP is better still.
HMP causes more overheating? Well, then the Note 3 with cluster migration would overheat even more... does everyone with a Note 3 have this issue?
For me... all of you use your brains too little... the brain is a muscle... use it!
Click to expand...
Click to collapse
willstay said:
I do not find the thermal-envelope explanation for HMP logical. Cluster migration switches all the cores to the A15s even when only one thread requires power, hence that design is more battery hungry. If Samsung is really worried about crossing the thermal envelope, then they can implement something like what Intel does with Turbo Boost: effectively reduce the max clock speed to 1.5 GHz when all the A15s are running, but allow it to go to 1.9 GHz when only one or two threads are running.
If Samsung refuses to do so, I hope developers find ways to unlock HMP. It is not that I need 8 cores running simultaneously when my laptop hums along at 1.3GHz in dual-core mode, but when Samsung teases us and there is treasure hidden, ready for unlocking, it is just human nature to want MORE.
Click to expand...
Click to collapse
People don't understand what hotplugging is and how it works.
The real goal of this architecture is called "power gating"... simply put, it's a technique developed by Intel which AUTOMATICALLY shuts down transistors when they are not in use.
Hotplugging uses a software decision to shut down cores... it's not hardware.
The difference?
Simply, to Linux all cores are always turned on, even if the transistors of a core are shut down... that avoids the time spent re-scheduling tasks... and of course Linux is a multithreading kernel, which means more cores = more parallelization = lower frequency = less power usage.
That's the real goal of big.LITTLE!
And you understand, if there are 8 cores... tasks will be spread over more cores, which means it has the best efficiency ever.
An example... a phone ALWAYS has light tasks that must be executed just to keep it usable... like the audio task, the video task, or the wireless task... well, why use a high-performance architecture for low-performance tasks? That's why ARM created the A7.
The Cortex-A7 has the best efficiency ever... so the same task on a Cortex-A7 is executed with LESS ENERGY than on another architecture... so it's a perfect way to preserve energy.
Sure, the A7 is not a performance architecture; that's why ARM chose to create the A15... when tasks are too heavy to be handled by an A7, the system automatically switches the task to an A15... so the goal is: light tasks on the A7 (best efficiency) and heavy tasks on the A15 (best performance).
Mix best efficiency + best performance = best architecture :thumbup:
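A back-of-the-envelope illustration of the "more cores at lower frequency" argument, using the textbook dynamic-power relation P ~ C * V^2 * f and an assumed linear voltage/frequency curve (real silicon has leakage and fixed floors, so treat the numbers as a sketch, not a measurement):
Code:
public class ParallelPowerSketch {
    // Relative dynamic power for a cluster: cores * V^2 * f (arbitrary units).
    static double relativePower(int cores, double freqGHz) {
        double volts = 0.6 + 0.3 * freqGHz; // assumed V-f curve, made up for the example
        return cores * volts * volts * freqGHz;
    }

    public static void main(String[] args) {
        // Same total throughput either way: one core at 1.6 GHz
        // versus four cores at 0.4 GHz each.
        System.out.printf("1 core  @ 1.6 GHz: %.2f%n", relativePower(1, 1.6));
        System.out.printf("4 cores @ 0.4 GHz: %.2f%n", relativePower(4, 0.4));
    }
}
The four slow cores come out at well under half the power of the single fast core in this toy model, which is the whole "more cores = lower frequency = less power" point.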
Got a feeling they're testing the HMP tech / heat / performance / power consumption in the NOTE 3 already .........
They were brave enough to have it on display at an exhibition ..... the update seems promising from what I can see!
On the "tune.pk" site there's a "video 1058551" showing "Samsung-Exynos5420-HMP-bigLITTLE-demo" with all 8 cores running!
Hoping to see some actual benchmarks so we know what the before / after results are like!!!
gudodayn said:
Got a feeling they're testing the HMP tech / heat / performance / power consumption in the NOTE 3 already .........
They were brave enough to have it on display at an exhibition ..... the update seems promising from what I can see!
On the "tune.pk" site there's a "video 1058551" showing "Samsung-Exynos5420-HMP-bigLITTLE-demo" with all 8 cores running!
Hoping to see some actual benchmarks so we know what the before / after results are like!!!
Click to expand...
Click to collapse
I'd be inclined to agree with you. The Lite/Neo Note 3 has it enabled for all 6 cores apparently according to Sammobile
radicalisto said:
I'd be inclined to agree with you. The Lite/Neo Note 3 has it enabled for all 6 cores apparently according to Sammobile
Click to expand...
Click to collapse
The lady from Samsung and Samsung itself weren't hiding the Note 3 device either ……
My contract is up soon ……
I hope benchmarks show up soon so I can get an idea of whether to get the LTE or the 3G one!
If you look at some of SamsungExynos' tweets on Twitter here https://twitter.com/SamsungExynos, it's easy to make a wild guess that HMP will be released soon; just a wild guess though, going by their tweets.
radicalisto said:
I'd be inclined to agree with you. The Lite/Neo Note 3 has it enabled for all 6 cores apparently according to Sammobile
Click to expand...
Click to collapse
The 5260 in the Note 3 Lite is obliged to use HMP... why?
Cluster migration and core migration need an equal number of cores in each cluster, or the kernel code won't work.
I don't know if Samsung will release HMP for the N900 or not... the only thing I can say is: if Samsung releases decent source code, I will extract the HMP code from the Note 3 Lite sources, modify it, and add it to my own kernel build... and so will all the kernel modders...
Sure, I hope for a direct HMP upgrade from Samsung, that would be better, but we'll do everything we can to try to run it...
I tried to compile some code taken from the internet which tries to load HMP... but no chance, due to incompatibilities with too many dependencies in the kernel code; translated: the current source code is RUBBISH!
We are running generic drivers without any CCI code nor the real basis of HMP...
The core and cluster migration source code was written TWO YEARS AGO!
We are running code written for the 5410!! Not the 5420!!
That's why our Exynos sucks compared to the HMP features!
**** you Samsung
Why do we want HMP activated? The Android OS is not advanced enough to manage heat and conserve the battery, even on KitKat. Enabling HMP will only ruin the phone's hardware and shorten its life. The decision must have been made with careful reasoning.
