More powerful Exynos chipset coming soon from Samsung - General Topics

Samsung have the luxury of making their own chips, and they are quick to put out new and better versions of them. The Exynos chipset, which debuted with the Samsung Galaxy S II at a 'mere' 1.2GHz, is getting a 1.5GHz version, called the Exynos 4212.
Samsung also has a pair of high-end mobile cameras headed for the production line. One is a 16MP main shooter with a back-illuminated sensor for better low-light performance (expected to ship in November) and the other is a 1.2MP module with 720p@30fps video capture for front-facing cameras.
We can't quite make out the Google-translated press release, but it seems the front-facing camera will have a 1/8.2" sensor (that sounds pretty small, but we'll see) and the main shooter's ISO goes up to 1,600.
Going back to the Exynos, it's built on a 32nm process and was designed with 3D performance in mind. Gameloft is apparently showing interest and will offer several titles to put the new SoC to good use.
The Korea-bound Galaxy S II LTE and Galaxy S II HD LTE will sport Exynos chipsets with the CPU clocked at 1.5GHz, which makes them the most likely candidates for being the first phones with the new chipset.
Samsung already has a 1.4GHz version of the Exynos powering the Galaxy Note and the Galaxy Tab 7.7, but there's no info on what kind of change in performance we can expect in the 3D department (beyond the obvious gain from the faster clock speed).
Click to expand...
Click to collapse
http://www.gsmarena.com/samsung_announces_15ghz_exynos_chipset_16mp_camera-news-3200.php
FWAP FWAP FWAP at the highlighted bits. One thing Samsung is perfectly good at making is chips.
I think this might mean I'll give the Nexus Prime a miss and wait for the Galaxy S III. Probably jizz in my pants when I hold it.

Should be epic; loving the Samsung processors. The iPhone 5 should be feeling scared. Hopefully ICS will be a hit as well.
Let's see what Apple can do and whether it tempts away current galaxy users.
Sent from my GT-I9100 using XDA App

http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.

Killer Bee said:
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
Click to expand...
Click to collapse
The SGX543MP2 is a lot more than 50% faster than the Mali-400. Really hoping for Kal-El in the SGS3.

Killer Bee said:
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
Click to expand...
Click to collapse
Something to crush the A5, for sure. The Galaxy S II's Exynos 4210 already rivals the A5.
Toss3 said:
The SGX543MP2 is a lot more than 50% faster than the Mali-400. Really hoping for Kal-El in the SGS3.
Click to expand...
Click to collapse
W-wa-wait. What? ****in' ****. Where did you see this? Go check the AnandTech review. It shows the high-end mobile GPUs' performance in cold numbers and lays out the technical specs clearly. The 543MP2 might be 50% faster than the Adreno 220, not the Mali-400 MP.

Killer Bee said:
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
Click to expand...
Click to collapse
I believe this is a simple die shrink from 45 to 32nm. So it still has the Mali-400 GPU, but clocked at 400MHz instead of 267MHz (i.e. a 50% clock increase). The very subtle name change from Exynos 4210 to Exynos 4212 almost confirms this.
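If the 4212 really is the 4210's Mali-400 clocked up from 267MHz to 400MHz, the quoted "50% increase in 3D performance" falls straight out of the clock ratio. A quick sanity check (a sketch in Python; linear scaling with GPU clock is an assumption, since memory bandwidth can cap real gains):

```python
# Back-of-the-envelope GPU speedup from a pure clock bump.
# Assumes performance scales linearly with clock, which ignores
# memory-bandwidth and driver limits.
old_clock_mhz = 267  # Mali-400 in the Exynos 4210 (per the post above)
new_clock_mhz = 400  # rumoured Mali-400 clock in the Exynos 4212

speedup = new_clock_mhz / old_clock_mhz
print(f"Expected 3D gain: ~{(speedup - 1) * 100:.0f}%")  # ~50%
```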

Until the Mali-400 can compete with the SGX543MP2 in GLBenchmark at the same resolution, I'm going to wait for a Kal-El phone.

tolis626 said:
Something to crush the A5, for sure. The Galaxy S II's Exynos 4210 already rivals the A5.
W-wa-wait. What? ****in' ****. Where did you see this? Go check the AnandTech review. It shows the high-end mobile GPUs' performance in cold numbers and lays out the technical specs clearly. The 543MP2 might be 50% faster than the Adreno 220, not the Mali-400 MP.
Click to expand...
Click to collapse
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Check the GLBenchmark numbers again; the SGX543MP2 is over 100% faster than the Mali-400 MP.

Toss3 said:
Check the GLBenchmark numbers again; the SGX543MP2 is over 100% faster than the Mali-400 MP.
Click to expand...
Click to collapse
And it's driving over twice the resolution. The GPU in the iPhone 5 is going to blow away the competition, since it'll be pushing a lower resolution than the iPad 2. I'm actually kind of jealous. I refuse to use a device with a lower-performing GPU than an Apple product! Hope the GPU in Kal-El will actually be competitive.

YOUCANNOTDENY said:
And it's driving over twice the resolution. The GPU in the iPhone 5 is going to blow away the competition, since it'll be pushing a lower resolution than the iPad 2. I'm actually kind of jealous. I refuse to use a device with a lower-performing GPU than an Apple product! Hope the GPU in Kal-El will actually be competitive.
Click to expand...
Click to collapse
No it is not; both were running the benchmarks at 1280x720. Kal-El is a lot faster than both the Mali-400 MP and the SGX543MP2 while having a lot more features, not to mention five cores.
"GLBenchmark 2.1 now includes the ability to render the test offscreen at a resolution of 1280 x 720. This is not as desirable as being able to set custom resolutions since it's a bit too high for smartphones but it's better than nothing." Anandtech
EDIT: Apple might decide to cut the A5 inside the iPhone 5 back to just one 543, or a lower-clocked version, as having two doesn't really make much sense on a mobile phone, at least when battery life is taken into consideration.
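Since the argument above turns on resolution, it's worth spelling out what the offscreen mode fixes: rendering every device at the same 1280x720 removes the native-resolution advantage. For on-screen scores you can roughly normalise by pixel count yourself (a sketch with made-up fps values; fill-rate-bound scaling is an assumption, as real games are also geometry- and bandwidth-limited):

```python
# Convert an on-screen fps score into an approximate pixel throughput,
# so devices with different native resolutions can be compared.
# Treat this as a rough first-order correction only.
def pixel_throughput_mpix(fps: float, width: int, height: int) -> float:
    return fps * width * height / 1e6

sgs2 = pixel_throughput_mpix(55.0, 800, 480)    # illustrative fps, WVGA
ipad2 = pixel_throughput_mpix(55.0, 1024, 768)  # illustrative fps, XGA
print(f"SGS2 ~{sgs2:.0f} Mpix/s vs iPad 2 ~{ipad2:.0f} Mpix/s")
```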

tjtj4444 said:
I believe this is a simple die shrink from 45 to 32nm. So it still has the Mali-400 GPU, but clocked at 400MHz instead of 267MHz (i.e. a 50% clock increase). The very subtle name change from Exynos 4210 to Exynos 4212 almost confirms this.
Click to expand...
Click to collapse
This sounds plausible.
YOUCANNOTDENY said:
And it's driving over twice the resolution. The GPU in the iPhone 5 is going to blow away the competition, since it'll be pushing a lower resolution than the iPad 2. I'm actually kind of jealous. I refuse to use a device with a lower-performing GPU than an Apple product! Hope the GPU in Kal-El will actually be competitive.
Click to expand...
Click to collapse
There is a developer tablet called ODROID-A that uses the Exynos 4210 SoC and it benches pretty well with an even higher resolution than the iPad 2 (1366x768 vs 1024x768).
Comparison for reference.
With a 50% increase, the Mali-400 (assuming they keep this GPU) will be comparable to the SGX543MP2.

Toss3 said:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Check the GLBenchmark numbers again; the SGX543MP2 is over 100% faster than the Mali-400 MP.
Click to expand...
Click to collapse
Well, for all I know, the chip in the iPad 2 is physically larger, which translates into more transistors in the same package. I can only suspect the same goes for the GPU. But even if it doesn't, it's most likely that the version in the iPhone 5/4S/whatever won't be the same. Will it be underclocked? Smaller? I don't know.
By the way, I personally believe manufacturers **** all over their tablets when they put phone SoCs in them. There should be a different, more powerful (albeit more power-hungry) variant of each SoC for tablets. But that's just me.
As for the iPhone part, don't mistake me for a hater (I hate it, btw, but I won't flame it or anything; if it's better than what I have, I'll just admit it. It's not the hardware I hate). In fact I hope it's THAT powerful, so that the competition drives performance up for everyone. And that, for us, is a win.


[INFO/Q] HTC Sensation only 1900 points with

smartbench 2011 Productivity test
http://smartphonebenchmarks.com/ind...11:Productivity&filter_cpu=all&filter_gpu=all
The GPU score I might understand why it's low, because of the high res, but why is the Productivity score so low?
I guess HTC didn't put in faster NAND ROM.
The Evo 3D did 2000.
Does someone maybe know what the problem or cause is?
Proz00 said:
smartbench 2011 Productivity test
http://smartphonebenchmarks.com/ind...11:Productivity&filter_cpu=all&filter_gpu=all
The GPU score I might understand why it's low, because of the high res, but why is the Productivity score so low?
I guess HTC didn't put in faster NAND ROM.
The Evo 3D did 2000.
Does someone maybe know what the problem or cause is?
Click to expand...
Click to collapse
The reason is...
The CPU is a Cortex-A8.
Tegra 2 and the new Samsung processors are Cortex-A9.
Cortex-A9 is a PRETTY big improvement over the A8.
Once again HTC is going for garbage hardware.
What is in the Sensation is two Desire HD CPUs OC'd to 1.2GHz + a better GPU.
What is in the SGS2 is two MUCH better Hummingbird CPUs OC'd to 1.2GHz + a MUCH better GPU.
the cpu is neither a cortex a8 nor a cortex a9. it will provide plenty of performance and will be competitive with other dual cores.
the adreno 220 gpu that comes with the sensation is faster than the mali gpu that comes with the sgs2 when looking at preliminary tests done by anandtech.
whether it will be the fastest or slowest dual core soc will have to wait until it's released, and benchmarks often only tell part of the story. but certainly it will provide far more performance than any of the single core socs we have right now and will give its owners much satisfaction.
kaiserkannon said:
the cpu is neither a cortex a8 nor a cortex a9. it will provide plenty of performance and will be competitive with other dual cores.
the adreno 220 gpu that comes with the sensation is faster than the mali gpu that comes with the sgs2 when looking at preliminary tests done by anandtech.
whether it will be the fastest or slowest dual core soc will have to wait until it's released, and benchmarks often only tell part of the story. but certainly it will provide far more performance than any of the single core socs we have right now and will give its owners much satisfaction.
Click to expand...
Click to collapse
Huh? I'm confused.
Is the CPU not based on ARM's Cortex-A8? Just a slightly modified version? It is identical to the single-core Snapdragon in the Desire HD.
The benchmarks so far don't make it seem to be as competitive as the Tegra 2 OR Orion.
Samsung has said that the Mali-400 is MUCH faster than the current Hummingbird GPU. Current benchmarks say that it is in fact SLOWER...
I doubt Samsung would release the Orion with a GPU SLOWER than its previous gen... that just makes no sense. If that is the case then Tegra might be king. If the Mali-400 IS much better though, Samsung will have the best SoC.
The CPU in the Sensation is ROUGHLY... 2.4GHz combined. Compare that to the Desire HD's stable OC of 1.8GHz.
What is left to be seen is how much the CPU can be OC'd.
I would think that it would be less than 1.8GHz each core. But that's yet to be seen.
Regardless of what you think... the HTC Sensation's CPU will be slower than the competition's.
EDIT: Forgot to mention that the Sensation's CPU should have the same battery life as the current single-core Snapdragon... however it is pushing more pixels so...
Samsung should have mated its Orion to the Hummingbird's GPU. Hummingbird was great.
Sent from my MB860 using XDA App
Maedhros said:
The benchmarks so far don't make it seem to be as competitive as the Tegra 2 OR Orion.
Click to expand...
Click to collapse
Dunno where you got your information from, but it's very competitive with the Tegra 2. (8660 is the CDMA version of the Sensation's 8260). From these benchmarks, we also know that an overclock of at least 1.5GHz will be perfectly viable--the chip was designed for that anyhow.
Debating A8 vs A9 is a trivial matter, because it's a tiny fraction of the entire picture.
Wondering if CM7 can help the score.
First, that AnandTech benchmark is not a good measuring stick. AnandTech benched the MDP, which had the 8660 running at 1.5GHz and 800x480, so the results are higher than what the Sensation can achieve, because the Sensation runs at a lower clock and a higher resolution.
Second, the Qualcomm 8260/8660 is Cortex-A8 based. Tegra 2, OMAP4 and Exynos are Cortex-A9 based. Claims that Qualcomm doesn't use the ARM architecture are a lie.
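To make the first point concrete, here is a rough way to project the MDP's numbers onto the Sensation's conditions (a sketch: it assumes CPU-bound scores scale linearly with clock and GPU-bound fps scales inversely with pixel count, both simplifications, and the fps figure is illustrative):

```python
# Project MDP (1.5 GHz, 800x480) results onto Sensation conditions
# (1.2 GHz, 960x540 qHD). Linear clock scaling and inverse pixel
# scaling are assumptions, not measurements.
mdp_clock, sensation_clock = 1.5, 1.2       # GHz
mdp_pixels = 800 * 480                      # WVGA
sensation_pixels = 960 * 540                # qHD

cpu_scale = sensation_clock / mdp_clock     # 0.80
gpu_scale = mdp_pixels / sensation_pixels   # ~0.74

mdp_fps = 38.0                              # illustrative MDP-style result
print(f"Sensation estimate: ~{mdp_fps * gpu_scale:.0f} fps "
      f"(CPU scores ~{cpu_scale:.0%} of MDP)")
```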
Never trust smartbench. Period.
GLbenchmark is more trustworthy.
Sent via psychic transmission.
t-mizzle said:
First, that Anandtech benchmark is not a good measuring stick. Anandtech benched the MDP that had the 8660 running at 1.5 GHz and 800x480 so the results are higher than what Sensation can achieve because Sensations runs at a lower clock and higher resolution.
Second, Qualcomm 8260/8660 is A8 Cortex. Tegra 2, OMAP4 and Exynos are A9 Cortex based. Claims that Qualcomm doesn't use the ARM architecture is a lie.
Click to expand...
Click to collapse
The Scorpion core in Snapdragon SoCs uses the ARMv7 instruction set that both the A8 and A9 use, but it is not an A8 or an A9; it is Qualcomm's own design.
And personally I like comparing the different chips in these phones at the same resolution to see which chip has better performance on a level playing field. But yeah, the Sensation will have a bit worse performance thanks to the higher resolution, like the Atrix vs the Optimus 2X. But to me the higher resolution is completely worth the hit in performance.
TeroZ said:
Never trust smartbench. Period.
Click to expand...
Click to collapse
Would you care to elaborate on this please?
GLbenchmark is more trustworthy.
Sent via psychic transmission.
Click to expand...
Click to collapse
GLBench is a decent 3D benchmark app, but it is just that - it tests only the GPU. Smartbench was designed to test both the CPU (inc. dual-core ones) and the GPU, hence reporting two numbers. IMO, you are not comparing apples to apples unless you're only referring to the GPU portion of the test.
kaiserkannon said:
The Scorpion core in Snapdragon SoCs uses the ARMv7 instruction set that both the A8 and A9 use, but it is not an A8 or an A9; it is Qualcomm's own design.
And personally I like comparing the different chips in these phones at the same resolution to see which chip has better performance on a level playing field. But yeah, the Sensation will have a bit worse performance thanks to the higher resolution, like the Atrix vs the Optimus 2X. But to me the higher resolution is completely worth the hit in performance.
Click to expand...
Click to collapse
Stop spreading FUD. The MSM 8260/8660 is not capable of out-of-order execution. Cortex-A9 supports this feature, the A8 does not.
The MSM 8260/8660 pipeline depth is 13 stages, therefore it's clearly a Cortex-A8.
The A9 was a successor to the A8 and is a significant improvement over it.
t-mizzle said:
Stop spreading FUD. The MSM 8260/8660 is not capable of out-of-order execution. Cortex-A9 supports this feature, the A8 does not.
The MSM 8260/8660 pipeline depth is 13 stages, therefore it's clearly a Cortex-A8.
The A9 was a successor to the A8 and is a significant improvement over it.
Click to expand...
Click to collapse
qualcomm disagrees with you though. they state that it is not based on the a8 and has partial out-of-order execution. it also has a 128-bit wide neon data path for neon instructions, in comparison to the 64-bit wide path in a8 and a9 designs. while there are some similarities to the a8 as you pointed out, the scorpion is not qualcomm's implementation of an a8. it has some advantages over both the a8 and a9, and some disadvantages compared to an a9. overall the a9 will probably be a bit faster clock for clock, but the scorpion cores in the snapdragon dual cores are clocked faster.
this is very much the same as amd and intel. they both use the same instruction set (x86), but their processors are not the same. qualcomm simply licenses the instruction set (armv7) and builds its own processor, while other companies like nvidia, TI, and samsung buy the cortex a8 or a9 design from ARM and build a copy of it.
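The A8-vs-A9-vs-Scorpion question doesn't have to stay theoretical: on a rooted Android or any Linux/ARM device, /proc/cpuinfo reports the CPU implementer and part IDs, which identify the core directly. A minimal sketch (the ID table covers only the cores discussed here; on-device you'd read /proc/cpuinfo instead of the sample string):

```python
# Identify an ARM core from /proc/cpuinfo fields. Implementer 0x41 is ARM,
# 0x51 is Qualcomm; the part numbers below are the published IDs for the
# cores debated in this thread.
KNOWN_CORES = {
    (0x41, 0xC08): "ARM Cortex-A8",
    (0x41, 0xC09): "ARM Cortex-A9",
    (0x51, 0x00F): "Qualcomm Scorpion",
    (0x51, 0x02D): "Qualcomm Scorpion (dual-core)",
}

def identify_core(cpuinfo: str) -> str:
    impl = part = None
    for line in cpuinfo.splitlines():
        if line.startswith("CPU implementer"):
            impl = int(line.split(":")[1], 16)
        elif line.startswith("CPU part"):
            part = int(line.split(":")[1], 16)
    return KNOWN_CORES.get((impl, part), f"unknown (impl={impl}, part={part})")

# On a real device: print(identify_core(open("/proc/cpuinfo").read()))
sample = "CPU implementer : 0x51\nCPU part : 0x02d\n"
print(identify_core(sample))  # -> Qualcomm Scorpion (dual-core)
```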
Acei said:
Would you care to elaborate on this please?
GLBench is a decent 3D benchmark app, but it is just that - it tests only the GPU. Smartbench was designed to test both the CPU (inc. dual-core ones) and the GPU, hence reporting two numbers. IMO, you are not comparing apples to apples unless you're only referring to the GPU portion of the test.
Click to expand...
Click to collapse
You are right. But Smartbench ranking Scorpion+Adreno 205 lower than the DX with its SGX530 is definitely nonsense.
For GPU, go GLBenchmark or Nenamark or An3DBench, whatever but Smartbench.
For CPU, crunching Pi or Linpack is more reliable.
Smartbench does not reflect any real-world performance.
Sent via psychic transmission.
Thracks said:
Dunno where you got your information from, but it's very competitive with the Tegra 2. (8660 is the CDMA version of the Sensation's 8260). From these benchmarks, we also know that an overclock of at least 1.5GHz will be perfectly viable--the chip was designed for that anyhow.
Debating A8 vs A9 is a trivial matter, because it's a tiny fraction of the entire picture.
Click to expand...
Click to collapse
Based on the GLBenchmark score, the Anand tests might be suspect. It scored 6% higher than Tegra 2, not double like Anand's test. Or Qualcomm might be monkeying with things. If that is the case I am going to have a big problem with Qualcomm products.
Maybe Smartbench is right and the NAND quality is poor?
The Sense experience on it wasn't done. It would have to score higher than the myTouch and previous devices; it's dual core. Most likely a crappy engineering build on it.
Sent from my HTC Glacier using XDA Premium App
TeroZ said:
You are right. But Smartbench ranking Scorpion+Adreno 205 lower than the DX with its SGX530 is definitely nonsense.
Click to expand...
Click to collapse
There are other benchmark apps that rank your combo in the same order as Smartbench in graphical tests. Plus, please do look at the productivity tests for Smartbench 2011 more carefully: typical Scorpion-based phones score slightly higher results than the DX. Even games like Dungeon Defenders (a graphically heavy game) rank both as "mid-range", while ranking the Galaxy S series as "high-end".
For GPU, go GLBenchmark or Nenamark or An3DBench, whatever but Smartbench.
For CPU, crunching Pi or Linpack is more reliable.
Smartbench does not reflect any real-world performance.
Click to expand...
Click to collapse
Calculating Pi is a very, very simple, narrow, one-dimensional test. Linpack is heavy on floating-point calculations. If that is what you want to know, then I have no issues with that. But do your day-to-day tasks on your phone translate to pure floating-point calculations? They don't. That's why I've included several tests, and will be including more as new versions are released in the future. Plus, I believe none of them uses more than one core.
I'm open to suggestions and criticisms - but please do provide more details.
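The point about single-threaded tests is easy to demonstrate: a Pi- or Linpack-style loop pinned to one core simply cannot see a second core. A toy sketch (Python, using processes to sidestep the GIL; timings will vary by machine):

```python
# The same fixed amount of work, run serially and then split across
# four worker processes. A strictly single-threaded benchmark would
# only ever measure the serial case.
import time
from multiprocessing import Pool

def burn(n: int) -> int:
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 4

    t0 = time.perf_counter()
    for n in chunks:
        burn(n)
    serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(4) as pool:
        pool.map(burn, chunks)
    parallel = time.perf_counter() - t0

    print(f"serial: {serial:.2f}s, 4 processes: {parallel:.2f}s")
```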
The latest benchmarks from a retail GSII, which has an Orion Exynos, speak for themselves
http://forum.xda-developers.com/showpost.php?p=13096662&postcount=383
The Exynos at "only" 1.2GHz is even better than the Adreno 220 Scorpion 1.5GHz chip, as it scores 41 fps whereas the latter scores 38 fps in the GLBenchmark Egypt standard test
http://images.anandtech.com/graphs/graph4243/36161.png
http://nsa25.casimages.com/img/2011/04/21/110421112944690206.png
So the HTC Sensation, which is underclocked to 1.2GHz and has a bigger resolution, will look like shayt; the SGSII with Exynos will rule for a long, long time...
touness69 said:
The latest benchmarks from a retail GSII, which has an Orion Exynos, speak for themselves
http://forum.xda-developers.com/showpost.php?p=13096662&postcount=383
The Exynos at "only" 1.2GHz is even better than the Adreno 220 Scorpion 1.5GHz chip, as it scores 41 fps whereas the latter scores 38 fps in the GLBenchmark Egypt standard test
http://images.anandtech.com/graphs/graph4243/36161.png
http://nsa25.casimages.com/img/2011/04/21/110421112944690206.png
So the HTC Sensation, which is underclocked to 1.2GHz and has a bigger resolution, will look like shayt; the SGSII with Exynos will rule for a long, long time...
Click to expand...
Click to collapse
Thanks for this.
Looks like this is another HTC phone with a disappointing CPU & GPU

Exactly how good is this Qualcomm Processor?

Seems like every smartphone that comes to the USA gets some sort of Snapdragon processor by Qualcomm, and people do nothing but complain. So how does this Snapdragon S4 processor compare to every other dual-core processor out there, and even the Tegra 3? I looked up some benchmarks and both seem to have their advantages and disadvantages. But what I really want to know is which one is better for real-world performance, such as battery life, transitional effects, and launching apps. A couple of people said Sense 4 is very smooth and "has LITTLE to no lag"? How does this processor display web pages in Chrome?
Read the thread "Those of you who are waiting to compare the GSIII to the HTC One X" in this forum. It only has about 6 pages but has a ton of information. Short answer is that the Qualcomm chip kicks serious ass.
Sent from my Desire HD using XDA
shaboobla said:
Short answer is that the Qualcomm chip kicks serious ass.
Sent from my Desire HD using XDA
Click to expand...
Click to collapse
+1
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
MattMJB0188 said:
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
Click to expand...
Click to collapse
yes and no. the tegra 3 does have a better gpu, so in theory, better games. however, game makers cater to the masses. most androids that are active are mid-range, run android 2.2 or 2.3, have a resolution of 480x800, and use last year's (or older) processors. although most games will be made to work on the t3 and s4, that's compatibility work, not optimization. nvidia will have a couple of games "t3 only", but even those will be made to work on other phones. now that ics is cleaning up some of the splintering of apps, we'll see some better options on both fronts.
in short, yes, the t3 is the better gaming chip. but for the battery life, games available, and current bugs i would suggest the s4. i may change my mind when the refreshes come out q3-4, we'll see.
MattMJB0188 said:
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
Click to expand...
Click to collapse
Correct. However, most games are not optimized to utilize the Tegra to its fullest potential. That should change by the end of the year. The other point is that the S4 is just as good as the Tegra in terms of gaming performance. IMO, you should decide between these two processors by looking at the main area where the S4 truly has the advantage thus far, and that is battery life. So far, the battery life advantage goes to the S4. Just read the battery life threads in this forum and for the international X. It took a few updates for the Transformer Prime to start having pretty good battery life. The One X will get better in that department with a couple more updates for battery optimization. The S4 starts with great battery life and will get even better in that department.
Sent from my HTC Vivid using XDA app
I say the Snapdragon S4 is the better chip right now. The Tegra 3 GPU is great and with the TegraZone games it really looks great. But the 4-core CPU is really for heavy multitasking, so you can divide the work between all four cores. They are A9 cores vs the custom Qualcomm core, which is close to an A15. It means that for single-threaded and multi-threaded tasks the Snapdragon will whoop Tegra 3's ass. Opening an app, scrolling through that app, etc... also browser performance is slightly better on the Qualcomm chip. Basically Tegra 3 can do lots of things at the same time with decent speed, vs the S4 chip which can do one or a few things at lightning speed.
The S4 is almost 2x faster than any other dual core out there. AnandTech did a few nice articles on the S4, including benchmarks vs Tegra 3.
In real use, the S4 should be much better, because not all apps are multithreaded for 4 cores. The S4 completely kicks the Tegra 3's ass in single-threaded benchmarks. I also expect the S4 to be better at power management, because it is made on a 28nm node instead of 40nm, so it's more compact and efficient.
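The few-fast-cores vs many-slow-cores trade-off is classic Amdahl's law: how much of an app actually runs in parallel decides the winner. A quick sketch with illustrative numbers (the 1.4x per-core figure echoes the rough per-core advantage claimed for Krait above and is an assumption, not a measurement):

```python
# Amdahl's law: speedup = per_core / (serial + parallel/cores),
# relative to one baseline core.
def speedup(parallel: float, cores: int, per_core: float) -> float:
    return per_core / ((1 - parallel) + parallel / cores)

for p in (0.2, 0.5, 0.9):
    dual_fast = speedup(p, cores=2, per_core=1.4)  # S4-like: fewer, faster
    quad_slow = speedup(p, cores=4, per_core=1.0)  # Tegra 3-like: more, slower
    print(f"parallel {p:.0%}: dual-fast {dual_fast:.2f}x, quad {quad_slow:.2f}x")
```

With mostly serial work the two fast cores win; only heavily threaded workloads (around 90% parallel here) let the quad pull ahead, which matches the single- vs multi-threaded benchmark split described above.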
About 23 I'd say
Sent from my SGH-I997 using xda premium
Here is a comparison benchmark by someone from Reddit.
Benchmark         S4 Krait     Tegra 3
Quadrant          5016         4906
Linpack Single    103.11       48.54
Linpack Multi     212.96       150.54
Nenamark 2        59.7 fps     47.6 fps
Nenamark 1        59.9 fps     59.5 fps
Vellamo           2276         1617
SunSpider         1540.0 ms    1772.5 ms   (lower is better)
Sadly, can't do much for the formatting. Enjoy.
The difference in DMIPS is where the S4 really whomps on the T3. All the T3 has going for it at the moment is its GPU. If you don't care about some additional gaming prowess, the S4 is the way to go.
tehdef said:
Here is a comparison benchmark by someone from Reddit.
Benchmark         S4 Krait     Tegra 3
Quadrant          5016         4906
Linpack Single    103.11       48.54
Linpack Multi     212.96       150.54
Nenamark 2        59.7 fps     47.6 fps
Nenamark 1        59.9 fps     59.5 fps
Vellamo           2276         1617
SunSpider         1540.0 ms    1772.5 ms   (lower is better)
Sadly, can't do much for the formatting. Enjoy.
The difference in DMIPS is where the S4 really whomps on the T3. All the T3 has going for it at the moment is its GPU. If you don't care about some additional gaming prowess, the S4 is the way to go.
Click to expand...
Click to collapse
Just to add to that, and to be fair, the S4 is at around 7000 in the AnTuTu benchmark while the Tegra 3 is at around 10000. I still prefer the S4.
Eh...
It wins in one benchmark specifically built to take advantage of more than 2 cores. So if you want to play TegraZone games and have some basic lag, the T3 is for you. If you want a near-flawless phone experience, with decreased graphical performance in some wannabe console games, then the S4 is the way to go.
Actually you won't really notice the lack of graphics performance on the Snapdragon S4. It's about 10% slower in most benchmarks but outperforms the Tegra 3 in a few as well. However, I have a Sensation XL with the Adreno 205, which is only a quarter as fast as the Adreno 225, and all games, including Dead Space, Frontline and Blood & Glory, run smoothly on it. To say the Snapdragon S4 is inferior because of the slower Adreno 225 is really nitpicking to me. For me the bigger reason to choose one graphics chip over another is Flash performance, and this is where the Exynos' Mali-400 kicks the Adreno 225 in the balls. It handles 1080p YouTube videos in the browser without a hiccup, while the 225 chokes even on 720p content.
Let me answer this. How good is it? More than good enough. Almost all apps and games are catered to weaker phones, so the T3 and S4 both have plenty of headroom.
And my two cents: the S4 beats the Tegra 3.
MattMJB0188 said:
Seems like every smartphone that comes to the USA gets some sort of Snapdragon processor by Qualcomm, and people do nothing but complain. So how does this Snapdragon S4 processor compare to every other dual-core processor out there, and even the Tegra 3? I looked up some benchmarks and both seem to have their advantages and disadvantages. But what I really want to know is which one is better for real-world performance, such as battery life, transitional effects, and launching apps. A couple of people said Sense 4 is very smooth and "has LITTLE to no lag"? How does this processor display web pages in Chrome?
Click to expand...
Click to collapse
Let me start by saying I'm not a pro when it comes to electronics, but I do have an understanding of the subject.
The thing to realize about these processors, and most other processors available today, is that the S4 is based on the Cortex-A15 while the Tegra 3, along with the new Samsung, are based on the A9. The A15, at the same clock and die size, is roughly 40% faster than the A9.
S4 = dual-core Cortex A15 @ 1.5GHz - 28nm
Tegra 3 = quad-core Cortex A9 @ 1.5GHz - 40nm
Exynos 4 (Samsung) = quad-core Cortex A9 @ 1.5GHz - 32nm
So far, in theory, the S4 is 40% faster per core but has two fewer cores. Individual apps will run faster unless they utilize all four cores on the Tegra 3. Because the S4 is on a smaller process node, it will consume less energy per core.
The actual technology behind these chips that the manufacturers come up with will also affect the performance output, but the general idea is there. Hope that helps to understand a little better how the two chips will differ in performance.
Sent from my shiny One XL
The S4 compared to the Tegra 3 says it all: a dual core that beats a quad core in almost everything.
Intel released the first native dual core processor in 2006 and shortly thereafter released a quad core which was basically two dual cores fused together (this is what current ARM quads are like).
That was 6 years ago and these days pretty much all new desktop computers come with quad cores while laptops mostly stick with dual. Laptops make up the biggest share of PC sales so for your everyday PC usage, you'll be more than comfortable with a dual core.
You really can't assume mobile SoCs will follow the same path, but it's definitely something to consider. I think dual core A15-based SoCs will still rule the day this year and next at the very least.
I was really on the fence about the X or the XL. But the S4 got me. Not having 32GB is already bugging me. But the efficiency (and my grandfathered unlimited data paired with Google Music) is definitely worth the sacrifice. Very happy so far! Streaming Slacker, while connected to my A2DP stereo, running GPS was great. I'm not a huge gamer though. I miss Super Mario Bros being the hottest thing!
krepler said:
Let me start by saying I'm not a pro when it comes to electronics, but I do have an understanding of the subject.
The thing to realize about these processors, and most other processors available today, is that the S4 is based on the Cortex-A15 while the Tegra 3, along with the new Samsung, are based on the A9. The A15, at the same clock and die size, is roughly 40% faster than the A9.
S4 = dual-core Cortex A15 @ 1.5GHz - 28nm
Tegra 3 = quad-core Cortex A9 @ 1.5GHz - 40nm
Exynos 4 (Samsung) = quad-core Cortex A9 @ 1.5GHz - 32nm
So far, in theory, the S4 is 40% faster per core but has two fewer cores. Individual apps will run faster unless they utilize all four cores on the Tegra 3. Because the S4 is on a smaller process node, it will consume less energy per core.
The actual technology behind these chips that the manufacturers come up with will also affect the performance output, but the general idea is there. Hope that helps to understand a little better how the two chips will differ in performance.
Sent from my shiny One XL
Click to expand...
Click to collapse
correct me if I'm wrong, but all 3 are A9-based, including the S4. the first A15 will be the Exynos 5250, a dual core.
Tankmetal said:
correct me if I'm wrong, but all 3 are A9-based, including the S4. the first A15 will be the Exynos 5250, a dual core.
Click to expand...
Click to collapse
This is inaccurate.
The Exynos 4 and the Tegra 3 are based on the ARM A9 reference design.
The Qualcomm Snapdragon S4 is "roughly equivalent" to the A15, but not based on the A15. The same was true for Qualcomm's old S3 (which was equivalent to something between the A8 and A9 designs).
One thing that most people don't realize is that Qualcomm is one of the very few companies that designs its own processors based on the ARM instruction set, and while the S4's core is similar to the A15 in terms of architecture, it's arguably better than the ARM reference design (e.g. asynchronous clocking of each core, which is a better design than the big.LITTLE or +1 approach).

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the 2 board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. However, the results had already been deleted, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51MHz and a max of 1836MHz
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand-new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, though the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX554MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali-T604 being an almost equal second. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
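For what the leaked numbers are worth, both results come from the same 1080p offscreen test, so they compare directly without any resolution correction (a quick ratio check; the Tegra 3 baseline is an illustrative assumption, not a figure from the cached page):

```python
# Egypt HD 1080p offscreen comparison from the post above, plus an
# assumed Tegra 3 baseline purely to put the "1 to 3 times faster"
# estimate in context.
results_fps = {
    "Dalmore (Tegra 4?)": 32.6,   # cached GLBenchmark result above
    "iPad 4 (A6X)": 49.6,         # cached GLBenchmark result above
    "Tegra 3 (assumed)": 13.0,    # illustrative baseline only
}
base = results_fps["Tegra 3 (assumed)"]
for name, fps in results_fps.items():
    print(f"{name}: {fps:5.1f} fps ({fps / base:.1f}x the assumed T3)")
```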
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists the "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below. If they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
Click to expand...
Click to collapse
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7GHz to just 800MHz as the system tried to keep everything at 4W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power-hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
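The throttling observation above is essentially a power-budget governor at work. A toy sketch of the mechanism (all states and wattages are illustrative assumptions; real governors react to temperature sensors with far finer-grained states):

```python
# Pick the highest CPU state whose power, added to the current GPU draw,
# stays inside a fixed SoC budget - the behaviour AnandTech observed on
# the Nexus 10, where a loaded GPU forced the CPU from 1.7 GHz to 800 MHz.
CPU_STATES = [(1700, 3.5), (1400, 2.5), (1000, 1.6), (800, 1.0)]  # (MHz, W)
BUDGET_W = 4.0

def throttle(gpu_power_w: float) -> int:
    for mhz, watts in CPU_STATES:
        if watts + gpu_power_w <= BUDGET_W:
            return mhz
    return CPU_STATES[-1][0]  # floor at the lowest state

print(throttle(0.5))  # light GPU load  -> 1700 MHz
print(throttle(3.0))  # heavy GPU load  -> 800 MHz
```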
32fps is a no-go... let's hope it's not final
hamdir said:
32fps is a no-go... let's hope it's not final
Click to expand...
Click to collapse
It needs to, but it will be OK for a new Nexus 7
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
Click to expand...
Click to collapse
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
Click to expand...
Click to collapse
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91, the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through TegraZone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through TegraZone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
Click to expand...
Click to collapse
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire 3D; I use it, and Tegra games work with effects and without lag, and I don't have a Tegra device.
With a Tegra device I'd mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly, for the world's dominant graphics provider.
The first, Tegra 2: the GPU is a little bit better than the Galaxy S's SGX540 in benchmarks, but it lacks NEON support.
The second one, Tegra 3: the GPU is nearly the same as the old Mali-400MP4 in the Galaxy S2/original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are effectively exclusive to Apple (SGX543, and maybe SGX554 also; only Sony, who has cross-licensing with Apple, has it, in the PS Vita and the PS Vita only).
tegra optimization porting no longer works using chainfire; this is now a myth.
did u manage to try shadowgun thd, zombie driver or horn? the answer is no. games that use the t3 sdk for physx and other cpu graphics work cannot be forced to work on other devices; equally, chainfire is now outdated and no longer updated.
now, about powervr: they are only better in a real multicore configuration, which is only used by apple and sony's vita, eating large die area, i.e. actual multiple cores each with its own subcores/shaders. if tegra was used in a real multicore setup it would destroy all.
finally, this is really funny, all this doom n gloom because of an early discarded development board benchmark. i don't mean to take away from turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any type of final device benchmarks.
adreno 220 doubled in performance after the ICS update on the sensation.
t3 doubled the speed of the t2 gpu with only 50% more shaders, so how on earth do you believe only 2x the t3 scores with 600% more shaders!!
do you have any idea how miserably the ps3 performed in its early days? even new desktop GeForces perform much less than expected until the drivers are updated.
enough with the FUD! seems this board is full of it nowadays and so little reasoning...
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck with Apple; it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18fps, then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single on Tegra 2/3.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7GHz to just 800MHz as the system tried to keep everything at 4W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power-hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Click to expand...
Click to collapse
Well said, mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on CPU than GPU.
If they don't stop soon and continue to use this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU at its full potential (at least when gaming).
I have a Galaxy S2 with the Exynos 4 at 1.2GHz and the Mali GPU OC'd to 400MHz.
In my analysis, most modern games like MC4 and NFS:MW aren't running at 60FPS at all. That's because the GPU always has a 100% workload while the CPU is relaxing, outputting 50-70% of its total workload.
I know some games aren't optimized for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T604 with some tweaks instead of the mighty T658, which is still slower than the iPaddle 4.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7GHz to just 800MHz as the system tried to keep everything at 4W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team; the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power-hungry. Anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Click to expand...
Click to collapse
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
MrPhilo said:
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck with Apple; it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18fps, then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single on Tegra 2/3.
Click to expand...
Click to collapse
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks? On XDA is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations for a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), omg Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationship. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle & physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6x increase in shaders does not automatically mean that games and benchmarks will scale in linear fashion, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's Technical Marketing Manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks are proved accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
From 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
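The point above about shader counts not scaling linearly can be made concrete with a two-stage bottleneck model: frame time is set by the slowest stage, so multiplying shader power while barely touching TMU/ROP throughput yields far less than a proportional gain. A sketch with made-up unit numbers:

```python
# Frame rate limited by the slower of two stages: shading and pixel fill.
# All throughput numbers are illustrative, chosen only to show the shape
# of the effect, not to model Tegra 3 or Tegra 4.
def fps(shader_gflops, fillrate_gpix, shade_cost=0.3, pixels=0.03):
    shade_time = shade_cost / shader_gflops   # seconds per frame
    fill_time = pixels / fillrate_gpix
    return 1.0 / max(shade_time, fill_time)   # slowest stage wins

old = fps(shader_gflops=12.0, fillrate_gpix=1.0)
new = fps(shader_gflops=72.0, fillrate_gpix=1.6)  # 6x shaders, modest fill bump
print(f"{old:.0f} fps -> {new:.0f} fps ({new / old:.1f}x, nowhere near 6x)")
```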
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Click to expand...
Click to collapse
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
hamdir said:
tegra optimization porting no longer works using chainfire; this is now a myth.
did u manage to try shadowgun thd, zombie driver or horn? the answer is no. games that use the t3 sdk for physx and other cpu graphics work cannot be forced to work on other devices; equally, chainfire is now outdated and no longer updated.
Click to expand...
Click to collapse
Looks like they haven't updated Chainfire 3D for a while; as a result only T3 games don't work, but others do work: Riptide GP, Dead Trigger, etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I think I might be unfortunate too, but some Gameloft games lagged on the Tegra device I had, though root solved it to an extent.
I am not saying one thing is superior to another, just relating my personal experience; I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
Click to expand...
Click to collapse
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral between Google's and Apple's ecosystems, and not wiping any scratches off my devices.

Exynos 5 Octa and Snapdragon 800

Does anyone else think that the new-generation Exynos SoC will support 802.11ac and LTE-A? Or play back 1080p video at 60 fps and 2K at 30 fps? These are features that were never really discussed for the chipset itself.
The Snapdragon 800 was confirmed to be capable of all of the aforementioned. It sounds as if the Snapdragon 800 series will be the superior chipset, while the Exynos Octa will likely provide better power efficiency in some regard. It would be pretty disappointing if the Galaxy S IV got stuck with a Snapdragon 600 processor, given the date it's likely going to be pushed out on. It might make me consider the Note this time around.
i really hope all these rumors are fake; samsung should use Exynos in their flagship Galaxy S line! if not the Octa, maybe the Exynos 5 quad-core at 1.8-2.0GHz!
All the Snapdragon 600 happens to be is a mid-tier SoC, which improves upon the same GPU and performance of the S4 Pro. Real A15 architectures should blow this chipset out of the water. People seem to think that what they see now is good, but when the Snapdragon 800 and other A15-based chips start making their debut, this will feel dated quickly in the coming months.
megagodx said:
All the Snapdragon 600 happens to be is a mid-tier SoC, which improves upon the same GPU and performance of the S4 Pro. Real A15 architectures should blow this chipset out of the water. People seem to think that what they see now is good, but when the Snapdragon 800 and other A15-based chips start making their debut, this will feel dated quickly in the coming months.
Click to expand...
Click to collapse
clock for clock, the a15 is just ~15% faster than krait; don't think that there's so much difference between the two.
they are both really solid performers and the battle is all about the maximum clock/power ratio.
The SD800 will also feature Quick Charge 2.0, which is supposed to charge your battery 75% faster than SoCs without that function; the SD600 doesn't feature that either. I'm pretty sure if you've seen the initial Tegra 4 benchmarks (based on a real A15 architecture), they wipe the floor with the HTC One's SD600. With a claimed 75% performance increase over the Snapdragon S4 Pro (last year's best mobile SoC), the SD800 should bring comparatively the same or better results than the T4 mentioned. It's going to be kind of a disappointment if the S IV ends up with an SD600 and no Exynos 5 Quad/Octa, at least.

PCMark: Note 3 outperforms Note 4

See benchmark details here
Top scores....
Note 3: 5130
Note 4: 4942
Duh...
Quad-core 1.3 GHz Cortex-A53 & quad-core 1.9 GHz Cortex-A57 (SM-N910C)
Quad-core 1.3 GHz Cortex-A7 & quad-core 1.9 GHz Cortex-A15 (N9000)
The Exynos CPUs in the N3 and N4 have exactly the same clock speeds... and yet the N9005 only has a 1920x1080 screen, whereas the Note 4 has to render 2560x1440.
Thank you for proving why I absolutely hate Exynos.
I'd like to know the Snapdragon variants, since the Note 4 does have a significantly more powerful Snapdragon CPU, and the Snapdragon is the model for 80% of the market; the Exynos is only for lower markets.
Quad-core 2.7 GHz Krait 450 (SM-N910S)
Quad-core 2.3 GHz Krait 400 (N9005)
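Whatever one thinks of Exynos, the resolution gap alone matters here: at the same CPU clocks, the Note 4 has far more pixels to push in any on-screen test. A quick check of the arithmetic:

```python
# Pixel counts behind the Note 3 vs Note 4 comparison above.
note3 = 1920 * 1080   # 2,073,600 px
note4 = 2560 * 1440   # 3,686,400 px
print(f"Note 4 renders {note4 / note3:.2f}x the pixels of the Note 3")  # 1.78x
```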
But the Exynos in the Note 4 is pretty awesome already:
http://anandtech.com/show/8718/the-samsung-galaxy-note-4-exynos-review
Good things are to come with the one in the Galaxy S6.
If you would run the PCMark test yourselves and post the results, that would be great!!
Thanks
ShadowLea said:
the Exynos is only for lower markets.
Click to expand...
Click to collapse
This is kind of offensive! And I am a non-emotional guy who hates Exynos too :|
devilsdouble said:
This is kind of offensive! And I am a non-emotional guy who hates Exynos too :|
Click to expand...
Click to collapse
If a market sells less or requires less high-level hardware due to an older, less sophisticated network system, it's considered a lower market. The demand and proceeds are lower compared to the high-selling markets, thus the word lower.
That's not a personal attempt at insult, it's a corporate definition.
Until 4G was rolled out, the Netherlands was one of those lower markets. (Though, frankly, I still consider it as such..) In the days of the S3, every non-US country was considered a lower market.
(Besides, I'm a sociopath, I don't do emotional )
Marketing aside: Temasek's CM12 + arter97 kernel + data&cache partitions in f2fs.
The phone is superfast as hell, but the benchmark result was this:
Times are changing, for the worse and for the better. I know it makes no sense, but neither does Sammy.
They seem to be dropping Snapdragon, and with the 810 in sight (ignored too), Exynos is heading into a PR fight with the overheating accusations; and being the sucky ones in performance and the best in sales (Samsung generally), they just made their phones even less open to the people. HOWEVER... they are dropping bloat too.
As I said, they are making no sense.
sirobelec said:
Marketing aside: Temasek's CM12 + arter97 kernel + data&cache partitions in f2fs.
The phone is superfast as hell, but the benchmark result was this:
Click to expand...
Click to collapse
Stock Note N900 seems to perform better
PCMark for Android claims to......
Measure the performance and battery life of your Android smartphone and tablet using tests based on everyday tasks, not abstract algorithms.
ShadowLea said:
If a market sells less or requires less high-level hardware due to an older, less sophisticated network system, it's considered a lower market. The demand and proceeds are lower compared to the high-selling markets, thus the word lower.
That's not a personal attempt at insult, it's a corporate definition.
Until 4G was rolled out, the Netherlands was one of those lower markets. (Though, frankly, I still consider it as such..) In the days of the S3, every non-US country was considered a lower market.
(Besides, I'm a sociopath, I don't do emotional )
Click to expand...
Click to collapse
ShadowLea said:
Duh...
Quad-core 1.3 GHz Cortex-A53 & quad-core 1.9 GHz Cortex-A57 (SM-N910C)
Quad-core 1.3 GHz Cortex-A7 & quad-core 1.9 GHz Cortex-A15 (N9000)
The Exynos CPUs in the N3 and N4 have exactly the same clock speeds... and yet the N9005 only has a 1920x1080 screen, whereas the Note 4 has to render 2560x1440.
Thank you for proving why I absolutely hate Exynos.
I'd like to know the Snapdragon variants, since the Note 4 does have a significantly more powerful Snapdragon CPU, and the Snapdragon is the model for 80% of the market; the Exynos is only for lower markets.
Quad-core 2.7 GHz Krait 450 (SM-N910S)
Quad-core 2.3 GHz Krait 400 (N9005)
Click to expand...
Click to collapse
Why don't you simply run the test yourself with the superior phone/network you have and let the results speak for themselves?
PCMark for Android
4354 here, UK Note 3.
If Samsung do end up dropping Qualcomm in their next generation of phones, my N9005 Note 3 will be my last Samsung for the foreseeable future. Exynos holds no interest for me, as its closed-source nature inevitably means little to no support for non-stock AOSP/CM ROMs. And the non-stock ROMs that are available are generally unstable and bug-ridden.
^ +100
We know the S6 is not going to have the S810, so why wouldn't they follow the same path with the Notes too?
The SM-N9005 is my last Samsung device; I am not going to drag myself into pain with Exynos.
New top score... 5130
Benchmark scores between flagship phones mean precisely jack s**t these days; they're little more than **** waving. Discernible features are what should be compared.
"Wow, my Android phone scored 200 more points than your Android phone! And please, let's ignore the fact it will make precisely zero difference in real-world use!"
Beefheart said:
Benchmark scores between flagship phones mean precisely jack s**t these days; they're little more than **** waving. Discernible features are what should be compared.
"Wow, my Android phone scored 200 more points than your Android phone! And please, let's ignore the fact it will make precisely zero difference in real-world use!"
Click to expand...
Click to collapse
Ignorance is bliss!!
The whole point of these tests is to show that most of the other benchmarks don't give a true picture of real-life use.
Why else would the Note 3 appear to perform better than the Note 4?
The PCMark webpage states the following...
PCMark for Android introduces a fresh approach to benchmarking smart phones and tablets. It measures the performance and battery life of the device as a complete unit rather than a set of isolated components. And its tests are based on common, everyday tasks instead of abstract algorithms.
Click to expand...
Click to collapse
Yeah, that completely changed my opinion.*
* may contain sarcasm.
Beefheart said:
Yeah, that completely changed my opinion.*
* may contain sarcasm.
Click to expand...
Click to collapse
Have it your way... at least I am actually investigating.
It's in the interest of the benchmark app developers for users to believe their offerings aren't pointless.
Beefheart said:
It's in the interest of the benchmark app developers for users to believe their offerings aren't pointless.
Click to expand...
Click to collapse
I agree with you on this... generally.
However, I found this particular benchmark interesting for the following reasons:
1. It suggests software is the biggest bottleneck in Android phones, not hardware (Lollipop on the Note 3 >>beats>> KitKat on the Note 4).
2. It showed that my Note 3 performs better in everyday use than my Note 4 (this I have always known, but no benchmark showed it).
