For giggles, can one of you that's stock run the Electopia benchmark? There's been some interesting results and it would be cool to see how another dual-core phone with a different CPU/GPU performs. The Sensation folks are obviously not amused.
Sensation
800x480
Average FPS: 23.65
Time: 60
Number of Frames: 1419
Trianglecount: 48976
Peak Trianglecount: 68154
960x540
Average FPS: 19.90
Time: 60.01
Number of Frames: 1194
Trianglecount: 49415
Peak Trianglecount: 67076
SGS2
Average FPS: 37.58
Time: 60.01
Number of frames: 2255
Trianglecount: 48633
Peak trianglecount: 68860
DHD
Average FPS: 23.36
Time: 60.03
Number of frames: 1402
Trianglecount: 48835
Peak trianglecount: 67628
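(Side note for anyone sanity-checking the output: the app's "Average FPS" line is just the frame count divided by the elapsed time. A throwaway sketch using the numbers above; this is obviously not Electopia's code, just arithmetic.)

Code:
// Sanity-check Electopia's "Average FPS" against its own frame/time output.
public class FpsCheck {
    static double avgFps(int frames, double seconds) {
        return frames / seconds;
    }

    public static void main(String[] args) {
        // Numbers taken from the results posted above.
        System.out.printf("Sensation 800x480: %.2f fps%n", avgFps(1419, 60.0));   // ~23.65
        System.out.printf("SGS2:              %.2f fps%n", avgFps(2255, 60.01));  // ~37.58
        System.out.printf("DHD:               %.2f fps%n", avgFps(1402, 60.03));  // ~23.36
    }
}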
Even the Desire HD blew away my G2x on this benchmark but it could be the custom ROM... I'll switch back to AOSP and try it again.
16FPS
Can't be right, my Thunderbolt smoked my g2x
26 FPS Thunderbolt vs 16FPS G2x
Something is very wrong with those numbers if this is supposed to be measuring opengl 2.0
I'm on stock. It had a really hard time responding to touch input; with the sound off, here are the scores:
Average FPS - 15.56
Time - 60.04
Number of Frames - 934
Trianglecount - 48928
Peak Trianglecount - 68838
This was a super buggy program on the G2x. I think it is definitely not optimized for dual core or at least the Tegra 2 architecture.
Sent from my T-Mobile G2x using XDA App
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
BarryH_GEG said:
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
And there is no way the G2x could be lower than a single-core Adreno 205 Thunderbolt.
15.57 FPS for me running stock/not rooted. Like previously mentioned, it was very unresponsive to touch.
Badly designed benchmark programs are bad.
diablos991 said:
Badly designed benchmark programs are bad.
The sad part is that this isn't just a benchmark - it's a game first and foremost.
And yeah I can't get past 16FPS on stock speed OR at 1.5GHz so I think there's definitely coding issues as Nenamark using Trinity on Bionic scores 72FPS. I think my Inspire (Adreno 205) got about 35?
+1
Let's all buy phones with top benchmarks!!!!!!
Better yet, let's all get iPhones.....
Fu*k a benchmark
Sent from my LG-P999 using XDA Premium App
But if you really stop and think about it, if each of the different CPU/GPU's behave differently running the same software because of proprietary hardware performance tweaks we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations - even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app or release different apps for different CPU/GPUs. It's way too early to tell as dual-core and 2.3ish isn't that common now but it should be interesting watching software performance and development play out in the future.
BarryH_GEG said:
But if you really stop and think about it, if each of the different CPU/GPU's behave differently running the same software because of proprietary hardware performance tweaks we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations - even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app or release different apps for different CPU/GPUs. It's way too early to tell as dual-core and 2.3ish isn't that common now but it should be interesting watching software performance and development play out in the future.
It's piss-poor coding on the app developer's part - plain and simple. While there are Tegra 2-specific instructions that an app developer can use in their application, there are not any mobile OpenGL ES 2.0 instructions the Tegra 2 doesn't support as far as I am aware.
If you want a good challenge for the chip, download an3dbench XL from Market. I just scored 32640 and that's with a bunch of background apps.
Isn't this a windows mobile port (had it on my HD2 running WM6.5)? So, how does it provide an accurate representation of gaming on an Android device? Since it is the only bench my G2x has scored poorly on and (more importantly) real world gaming is spectacular on this thing, I'm going to say it doesn't. I wouldn't put a whole lot of stock in this one...
Yeah agreed. I just ran it on the Nexus/CM7 AOSP hybrid and it still was only 16.06 while I got almost 40,000 on an3dbenchXL which put me like 30-something out of 7000ish results.
This application was influenced by Qualcomm specifically to run poorly on Tegra 2 devices. They messed with the shaders so everything is rendered at a weird angle. If you change the code to run with a normal approach, you see the same results on Qualcomm chips but also 3-5x perf on NVIDIA chips
Why would you say this benchmark was influenced? If you have the sources, please share so we can all look... and how can you say BenchXL is a good benchmark? I have run the BenchXL benchmark and seen mismatched results on many forums... it is very unreliable, not a good benchmark. At least Electopia gives consistent, reliable results... I would go with Electopia as a GPU benchmark.
I have an Xperia Play myself - which performs superbly for gaming - awesome graphics - I love the games on it - awesome device. My wife has a G2x - which is equally good for gaming (though she just uses it for texting - LOL)...
I think for gaming both the Xperia Play and the G2x are good...
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
More so I ran it myself with the latest firmware at stock frequencies (SGS2 btw ) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite funny difference to any other device I might say.
It's not biased towards any manufacturer, it is biased against NVIDIA's ULP GeForce GPUs in Tegra 2 SOCs.
Changes to the code cause increases in performance on Tegra 2 devices, while results on other platforms do not change.
In general, there is never a single, all-encompassing GPU benchmark to accurately compare devices. It all depends on the code, and how it interacts with the specific hardware of the device.
images.anandtech.com/graphs/graph4177/35412.png
Source: Anandtech Samsung Galaxy S2 review (I can't post links )
http://images.anandtech.com/graphs/graph4177/35412.png
That AnandTech review is badly outdated, like I said; the SGS2 gets for example 16fps there in February. I myself get 58fps today.
And I don't think it's biased against Tegra. Tegra performs pretty much where it should be considering its age, and corresponds to its specs.
And just to dismiss your point that Tegra gets a different codepath, I ran Electopia Bench again via Chainfire3D using the NVIDIA GL wrapper plugin emulating said device, and I'm still getting the same amount of FPS.
If what you're saying is that it's not utilizing Tegra's full potential through proprietary Nvidia OpenGL extensions, you might as well pack your bags and leave, because then that logic would apply to pretty much every graphics core, since the benchmark isn't optimized for any of them. What we see here in these benchmarks is a plain, simple ES 2.0 codepath which all devices should support, so we can do an oranges-to-oranges comparison. It's also one of the heaviest fragment-shader-dependent benchmarks out there for the moment, and less geometry- and texture-bound, and that's why it runs so badly on pretty much every chip, since they don't get this type of workload in other benchmarks. This is also why the Mali gets such high FPS, as that's where the quad-GPU setup in the Exynos can shine.
AndreiLux said:
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
More so I ran it myself with the latest firmware at stock frequencies (SGS2 btw ) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite funny difference to any other device I might say.
Clearly the Mali-400 in the SGS2 is the most powerful GPU right now. There is a 60fps limit on the Galaxy S2, so you'll need a demanding benchmark. You can also see that in Nenamark2: SGS2 = 47fps, G2X = 28fps, SGS = 24fps.
Related
So we all know the Nexus S has a 1Ghz Cortex A8 Hummingbird CPU, which sounds unimpressive considering the Nexus One has a 1Ghz Snapdragon QSD 8250, but it's a known fact that clock speed often has little to do with actual computational power. Qualitative previews have said that the Nexus S "flies," but I'd like to see something more in the numbers. If anyone has a demo device, could you run a few benchmarks? Or perhaps comment on performance after quick opening/closing several computationally intensive applications?
QuacoreZX said:
So we all know the Nexus S has a 1Ghz Cortex A8 Hummingbird CPU, which sounds unimpressive considering the Nexus One has a 1Ghz Snapdragon QSD 8250, but it's a known fact that clock speed often has little to do with actual computational power. Qualitative previews have said that the Nexus S "flies," but I'd like to see something more in the numbers. If anyone has a demo device, could you run a few benchmarks? Or perhaps comment on performance after quick opening/closing several computationally intensive applications?
The 1GHz Hummingbird is as fast as the Galaxy S and iPhone 4, both of which are like 30% or more faster than Snapdragon.
I think the key improvement is in graphics performance. Here is a comparison.
QuacoreZX said:
So we all know the Nexus S has a 1Ghz Cortex A8 Hummingbird CPU, which sounds unimpressive considering the Nexus One has a 1Ghz Snapdragon QSD 8250, but it's a known fact that clock speed often has little to do with actual computational power. Qualitative previews have said that the Nexus S "flies," but I'd like to see something more in the numbers. If anyone has a demo device, could you run a few benchmarks? Or perhaps comment on performance after quick opening/closing several computationally intensive applications?
The S5PC11x (Hummingbird) has 2x the memory bandwidth of the MSM8250.
The MSM8250 gets about 2x the floating point performance of the S5PC11x.
I believe the SGX540 GPU in S5PC11x is on the whole a bit faster than the GPU in the 8250, but I don't have hard numbers on that in front of me. They're architecturally different GPUs and will have different strengths and weaknesses.
It's really hard to do a good apples to apples comparison of different SoCs -- memory interconnect, cache sizes, ARM architecture version, GPU, etc, etc all play into overall system performance.
Gingerbread, overall, tends to be faster than Froyo on the same hardware.
Not really too familiar with this stuff, but will the JIT compiler being optimized for snapdragon instruction set make a huge difference still? My Vibrant plays games way better than the MT4G (imo) but scores terribly on Linpack and is terribly slow at opening applications and things vs. the MT4G.
Read the post above you. Linpack is mainly a benchmark for numerical performance(floating point etc), where the Snapdragon chips are MUCH better.
But the Hummingbird(PowerVR) GPU is better than the Adreno GPU found in the Snapdragon line. That's why the gaming performance of your Vibrant is better than the MT4G.
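To make the Linpack point concrete: an FPU-heavy benchmark basically just hammers floating-point math in a tight loop. A toy sketch (not Linpack itself, and the loop body is made up) of the kind of work that decides those scores:

Code:
// Toy floating-point loop (NOT Linpack) just to illustrate the kind of work
// an FPU-heavy benchmark spends its time on: multiply-adds on doubles.
public class FlopsToy {
    public static void main(String[] args) {
        final long ops = 50_000_000L;
        double acc = 1.0;
        long start = System.nanoTime();
        for (long i = 0; i < ops; i++) {
            acc = acc * 1.0000001 + 0.0000001; // one multiply + one add per iteration
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        // Two floating-point ops per iteration.
        System.out.printf("~%.1f MFLOPS (acc=%f)%n", 2 * ops / seconds / 1e6, acc);
    }
}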
Ronaldo_9 said:
The 1GHz Hummingbird is as fast as the Galaxy S and iPhone 4, both of which are like 30% or more faster than Snapdragon.
PhoenixFx said:
I think the key improvement is in graphics performance. Here is a comparison.
Yup, just anecdotally, hummingbird is MUCH faster than snapdragon IMHO
galaxyS/NS SGX540= 90 million triangles/sec
HTC G2 Adreno 205 =44 million triangles/sec
Nexus one = Adreno 200 = 22 million triangles/sec
The Nexus S is running the fastest GPU out now. Another good thing about running a PowerVR GPU is that the iPhone runs on one too, so when lazy iPhone porting happens you will get better performance on that GPU than you would on Adreno.
I've noticed this especially on Gameloft games.
Trust me, I'm on a Vibrant and came from a Nexus One; without a doubt the Nexus S GPU smokes the Nexus One GPU and even outperforms the 2nd-gen Snapdragon.
Hummingbird > all atm.
Orion will be the same.
Don't make pre-assumptions about the dual core chips.. Orion has good competition from the TI OMAPS line.. Qualcomm looks like they'll stay behind GPU wise though.
Plus the Sound Quality of the Hummingbird chip is awesome. MUCH better than the Snapdragon chips.
Also, you have to be cautious of manufacturer specs for GPU pixels/sec and triangles/sec -- the "box numbers" are always under optimal conditions and often not representative of real workloads.
For modern non-fixed-pipe GPUs (gl ES 2.x, etc) compute capabilities (how many shader ops / pixel/ etc you can get away with) factor in as well.
Depending on what your workload is like (geometry heavy? fill heavy? texture heavy? shader heavy?) you will see different strengths and weaknesses when comparing GPUs.
All that said, the SGX540 is indeed quite snappy.
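To put a rough number on the gap between box specs and a real scene: take the DHD's Electopia run posted earlier and the 44 million triangles/sec quoted for the Adreno 205 a few posts up. This assumes Electopia's "Trianglecount" is triangles per frame, which is my reading of the output, not something the app documents:

Code:
// Rough illustration of how far a real scene sits below a "box number"
// triangle rate. Assumes Electopia's "Trianglecount" is triangles per frame.
public class TriangleThroughput {
    public static void main(String[] args) {
        double trianglesPerFrame = 48_835;   // DHD run posted earlier in the thread
        double avgFps = 23.36;
        double effective = trianglesPerFrame * avgFps;   // ~1.1 million tris/sec
        double boxNumber = 44_000_000;                   // quoted Adreno 205 spec
        System.out.printf("Effective: %.1f M tris/s vs spec %.0f M tris/s (%.1f%%)%n",
                effective / 1e6, boxNumber / 1e6, 100 * effective / boxNumber);
    }
}

Even allowing for generous rounding, a real game scene sits well over an order of magnitude below the spec-sheet figure, which is exactly the point about optimal conditions.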
chip
I agree the sound chip is good in the NS, as is the GPU
http://nena.se/nenamark/view?version=2
http://www.gsmarena.com/htc_sensation-review-605p4.php
nraudigy2 said:
http://nena.se/nenamark/view?version=2
Who cares? The SGS II is still faster... and the G2X is just 5 FPS under...
tomeu0000 said:
Who cares? The SGS II is still faster... and the G2X is just 5 FPS under...
Talk about troll
tomeu0000 said:
Who cares? The SGS II is still faster... and the G2X is just 5 FPS under...
Who cares? All of our phones will be obsolete by the end of the year anyways
Sent from my HTC Sensation 4G using XDA App
tomeu0000 said:
Who cares? The SGS II is still faster... and the G2X is just 5 FPS under...
SGS II is faster due to the lower resolution. learn the facts before commenting.
xamadeix said:
SGS II is faster due to the lower resolution. learn the facts before commenting.
Nope, the resolution isn't the deciding factor. (Tegra 2 is more powerful than the Adreno 220 in benchmarks, yet the Atrix at qHD resolution scores about the same as the Sensation, so the Adreno 220 isn't more powerful.) Just look at CF-Bench, Vellamo and other benchmarks; the SGS II is still superior in both CPU and GPU.
And that extra resolution will only cost about 10 FPS, 15 FPS max, so even at 800x480 the Adreno 220 still wouldn't be more powerful.
I have a Sensation, but for now the SGS II is more powerful.
With optimization maybe, but at default definitely NOT.
Excuse my bad English.
tomeu0000 said:
Nope, the resolution isn't the deciding factor. (Tegra 2 is more powerful than the Adreno 220 in benchmarks, yet the Atrix at qHD resolution scores about the same as the Sensation, so the Adreno 220 isn't more powerful.) Just look at CF-Bench, Vellamo and other benchmarks; the SGS II is still superior in both CPU and GPU.
And that extra resolution will only cost about 10 FPS, 15 FPS max, so even at 800x480 the Adreno 220 still wouldn't be more powerful.
I have a Sensation, but for now the SGS II is more powerful.
With optimization maybe, but at default definitely NOT.
Excuse my bad English.
The Adreno 220 is much better than the Adreno 205. But sometimes even my DHD is MUCH faster than the Sensation; I believe it's down to optimization. With proper optimization we can get the Adreno 220's best performance, and I believe at that point it will be better than an optimized SGS2.
missing2 said:
Who cares? All of our phones will be obsolete by the end of the year anyways
Sent from my HTC Sensation 4G using XDA App
True, 6 months from now it will be quad core phones, and really, do you care if it takes you 1.275ms longer to type in a phone number on one phone over another?
Seriously guys, get a frikin life, you buy the phone you prefer, everyone's preference is different.... and that's that.
Think of it like this.. A girl will go out with the guy she prefers. Highly unlikely that she will get you to flop it out and make a decision on the millimeter difference here and there.
Moreover, she won't be arguing with other girls on a forum about it either.
.... GET. OVER. IT.
GET. A. LIFE.
Sent from my HTC Sensation Z710a (S-ON GRRRR!) using XDA Premium App
This pretty much sums it up...
http://www.gsmarena.com/htc_sensation-review-605p4.php
artymarty said:
True, 6 months from now it will be quad core phones, and really, do you care if it takes you 1.275ms longer to type in a phone number on one phone over another?
Seriously guys, get a frikin life, you buy the phone you prefer, everyone's preference is different.... and that's that.
Think of it like this.. A girl will go out with the guy she prefers. Highly unlikely that she will get you to flop it out and make a decision on the millimeter difference here and there.
Moreover, she won't be arguing with other girls on a forum about it either.
.... GET. OVER. IT.
GET. A. LIFE.
Sent from my HTC Sensation Z710a (S-ON GRRRR!) using XDA Premium App
Who would want a quad-core phone? I mean, no application on a mobile requires that kind of processor; even an 800MHz processor can handle most apps now. And besides, who would think of developing an app that would require quad core?
I'm excited for our phones to be cracked open. I think that is when we will really start to see what they can do. Numbers will dramatically increase.
Can't wait!
Sent from my HTC Sensation 4G using XDA App
vitusdoom said:
Who would want a quad-core phone? I mean, no application on a mobile requires that kind of processor; even an 800MHz processor can handle most apps now. And besides, who would think of developing an app that would require quad core?
Some people would still buy it even if it is overkill. I can't imagine why quad core would be needed in a phone but I think it doesn't stop there.
brusko1972 said:
Some people would still buy it even if it is overkill. I can't imagine why quad core would be needed in a phone but I think it doesn't stop there.
For gaming purposes I suppose. 30% of all gaming takes place via smartphones so it's a ripe market for developers. Quadcore devices would pretty much put devices on par with console systems.
People would buy a quad-core phone (such as I would) for the same reason some people get sports cars. Are sports cars absolutely needed for everyday driving? Most of the time I'd highly doubt it, but it sure is nice as hell to have, no?
twomix9900 said:
People would buy a quad-core phone (such as I would) for the same reason some people get sports cars. Are sports cars absolutely needed for everyday driving? Most of the time I'd highly doubt it, but it sure is nice as hell to have, no?
That's not the correct question, lol, you didn't get it.
The question is: why would people buy a sports car when they only know how to drive a bike?
Well, surely, quad cores are great, and as mentioned above, games need them. But looking at games today, most of them are not that resource-consuming at all; they mostly need a decent graphics renderer, not a faster processor. You have to remember what the processor actually does here: it handles the loading of a given app. Sure, it does work during the game too, but you see its speed most clearly during app loading. What good is a game that loads up real fast but then stutters like hell while you play? Mind you, most games are only 10-25 megabytes (most that I've seen); any single-core processor can handle that fast. Even if a game were 200MB, a single core could process it. But when you talk about gaming, you should think about graphics first.
From what I have been reading... it will not only be quad core... but also we'll have speeds up to 2.5GHz. That's faster than my laptop
Too bad it loses in pretty much every other benchmark.
GS2 is teh suck, gets crushed in smartbench gaming...
But it's the fastest phone out there....
KingKuba13 said:
Too bad it loses in pretty much every other benchmark.
GS2 is teh suck, gets crushed in smartbench gaming...
But it's the fastest phone out there....
I don't think most benchmarks are utilizing these dual-core CPUs properly. That goes for all of them, not just the Sensation's. I wouldn't trust any of these benches with dual-core CPUs.
Sent from my HTC Sensation 4G using XDA Premium App
KingKuba13 said:
Too bad it loses in pretty much every other benchmark.
GS2 is teh suck, gets crushed in smartbench gaming...
But it's the fastest phone out there....
Smartbench is weak stuff. Any 3D scene that is too light will understate the GS2's score. For example it could do 300fps in the Neocore benchmark, but there is a 60fps cap so the app only reports 59fps for the GS2, while another phone scores 80fps and the GS2 ends up with the lower score. You just have to bench the GS2 on demanding benchmark apps like Nenamark2 and GLBenchmark 2.
Understanding the current generation SoC and benchmark:
SoC stands for System on a Chip. But most of us care only about the CPU and GPU on it.
Snapdragon (all 3 iterations) used the same Scorpion CPU core, at different clock speeds. The one in the Sensation has two cores, both of which can run up to 1.2GHz, so if a benchmark is single-threaded and very CPU-heavy, the latest Snapdragon can only be 20% faster than the first-generation 1GHz Snapdragon.
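To picture that single-threaded point, here's a hypothetical toy (plain Java, nothing phone-specific): the same busy loop run on one thread versus split across two. A single-threaded benchmark is the first case, so a second core buys it nothing; only a clock bump helps.

Code:
import java.util.concurrent.*;

// Toy demo: a single-threaded workload only sees clock-speed gains, while a
// second core only helps once the work is actually split across threads.
public class CoreScaling {
    static void burn(long iters) {             // CPU-bound busy work
        double x = 0;
        for (long i = 0; i < iters; i++) x += Math.sqrt(i);
        if (x < 0) System.out.println(x);      // keep the JIT from dropping the loop
    }

    static double time(Runnable r) {
        long t = System.nanoTime();
        r.run();
        return (System.nanoTime() - t) / 1e9;
    }

    public static void main(String[] args) throws Exception {
        final long work = 100_000_000L;
        System.out.printf("1 thread : %.2fs%n", time(() -> burn(work)));

        ExecutorService pool = Executors.newFixedThreadPool(2);
        System.out.printf("2 threads: %.2fs%n", time(() -> {
            Future<?> a = pool.submit(() -> burn(work / 2));
            Future<?> b = pool.submit(() -> burn(work / 2));
            try { a.get(); b.get(); } catch (Exception e) { throw new RuntimeException(e); }
        }));
        pool.shutdown();
    }
}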
Qualcomm uses a custom design for the Scorpion. Roughly speaking, the performance of the Scorpion lies somewhere between Cortex-A8 and A9. In general, SoCs with dual Cortex-A9 cores like Exynos, Tegra 2 and OMAP4 will be faster in CPU-heavy apps and benchmarks. Yet the Scorpion is exceptionally good at FPU-heavy tasks, so if FPU performance matters for that app or benchmark, the Scorpion can pull ahead.
GPU-wise, this depends on resolution. Higher resolution means more pixels to generate and a lower benchmark score, OTHER THINGS EQUAL. The GPU in the dual-core Snapdragon is about as powerful as those in Exynos and OMAP4, with one winning some benchmarks and another winning others. Due to the different resolutions on different handsets it's hard to tell, but they are in the same class. The Tegra 2, however, has a weaker GPU than the bunch mentioned above. This may come as a surprise considering Nvidia is a graphics card company and the chip is often promoted as the "most powerful". The truth is, the Tegra 2 was supposed to be released in mid 2010, but the market wasn't ready for dual-core phones back then. So the Tegra 2 got delayed for a year, and its design was set early. But that's also why Nvidia is almost ready to launch Kal-El/Tegra 3, or whatever the next thing is, because the design of Tegra 2 was done a long time ago.
So if a benchmark is graphically intensive and doesn't depend too much on the CPU, Snapdragon will be faster than Tegra 2, while Exynos will be the fastest (especially since there is no qHD Exynos device out there yet). In FPU-heavy CPU benchmarks like Linpack, Snapdragon performs exceptionally well due to its CPU design. But in benchmarks that test a wider variety of CPU functions, Cortex-A9-equipped SoCs will beat Snapdragon. And while Tegra 2 has a weaker GPU, it may perform better in some games because of Nvidia's "The Way It's Meant to Be Played" program. Basically it's Nvidia's way of funding developers to optimize their code for Nvidia's chips and market their games. It is not uncommon to see games funded by Nvidia's TWIMTBP program run faster on Nvidia's cards than on AMD's.
But what does all the above mean? IT DOESN'T F***ING MATTER AT ALL. All the current dual-core SoCs are fast enough for everything you want to do on your phone. They are equally (not) future-proof, and when the future comes that your current phone is too slow, the other current-gen phones will be slow too. And honestly, these ARM-based SoCs are evolving so fast that none of them is really future-proof. Just pick the phone that feels right for you. IGNORE those stupid benchmark numbers; pick the phone that physically appeals to you, the one that is less buggy, or the one with the best screen (for you). And if you really care about benchmark numbers, get the GSII. It has the fastest ARM-based CPU right now, one of the fastest mobile GPUs, and a relatively lower-resolution screen, so it dominates all benchmarks. It also has enough plastic to be a true successor to the GS I as the most plasticky Android phone, if that matters.
For all those interested in developing for the Exynos 5250, to be used in the Nexus 10, Samsung have kindly launched, for a modest sum, the Arndale development board.
http://www.arndaleboard.org/wiki/index.php/Main_Page
It has already been benchmarked on the GL Benchmark site, Mali T-604 is powerful, but it doesn't look like it will give the A6X any headaches.
http://www.glbenchmark.com/phonedet...o25&D=Samsung+Arndale+Board&testgroup=overall
We need a proper benchmark done on a final device. I definitely don't think the dev board's drivers are properly optimized. It's running on 4.0.4. We should get more details once we benchmark a final version of the N10.
hot_spare said:
We need a proper benchmark done on a final device. I definitely don't think the dev board's drivers are properly optimized. It's running on 4.0.4. We should get more details once we benchmark a final version of the N10.
Jelly Bean didn't do much for graphics benchmarks, IIRC. The low-level tests won't change much; if you do a comparison with the iPad 3, you can see that the PowerVR SGX MP4 is a beast in terms of pixel/texture fill rate, which the A6X will improve further. The consensus is that shader power is the most important thing, as long as there is sufficient fill-rate performance, and the Mali T-604 combined with its good bandwidth should be as capable as the A6X in real-world games. The only question is whether developers will optimise a game for just one tablet.
Turbotab said:
Jelly Bean didn't do much for graphics benchmarks, IIRC. The low-level tests won't change much; if you do a comparison with the iPad 3, you can see that the PowerVR SGX MP4 is a beast in terms of pixel/texture fill rate, which the A6X will improve further. The consensus is that shader power is the most important thing, as long as there is sufficient fill-rate performance, and the Mali T-604 combined with its good bandwidth should be as capable as the A6X in real-world games. The only question is whether developers will optimise a game for just one tablet.
I am not saying that JB will suddenly improve GPU benchmarks, but a lot of improvement can happen due to driver/firmware optimization.
Let me give you a real example: do you recall the GLBenchmark Egypt offscreen scores of the GS2 when it came out? It was getting around 40-42fps initially.
[Source: http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17 ]
The same GS2 after a few months was getting 60-65fps under same test.
Source 1: http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
Source 2: http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
It's a clear 50% improvement in performance done primarily through driver optimization.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Also check this slide : http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Samsung says 2.1 GPixels/s @ GPU clocked at 533MHz. Obviously the results don't match with quoted numbers. Difference is a lot actually.
I believe the final Nexus 10 numbers will be quite different from what we see now. Let's wait for final production models.
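For what it's worth, the 2 GPixels/s figure falls out of simple arithmetic if you assume one pixel per shader core per clock across the T604's four cores; that per-core assumption is mine, and ARM's blog linked above is all about how such pixels get counted:

Code:
// Back-of-envelope theoretical fill rate, assuming 4 shader cores each writing
// one pixel per clock -- an assumption on my part, not an ARM-published formula.
public class FillRate {
    public static void main(String[] args) {
        int cores = 4;                 // Mali-T604 is a four-core (MP4) part
        double clockHz = 500e6;        // 500 MHz clock mentioned above
        double theoretical = cores * 1 * clockHz;   // pixels per second
        double observedShare = 0.60;   // "about 60% of what it should be delivering"
        System.out.printf("Theoretical: %.1f GPix/s, observed roughly %.1f GPix/s%n",
                theoretical / 1e9, theoretical * observedShare / 1e9);
    }
}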
Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM A15 MP, so it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the two board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' posted on 3rd January 2013. The poster had already deleted the results, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the System spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU architecture they would have shouted about it at CES. The results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, though the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554 MP4 in the A6X is still the faster GPU, with Tegra 4 and Mali T604 in an almost equal second place. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. Buildbrain, according to an Nvidia job posting, is "a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from the cache, I've saved a copy of the webpages, which I can upload. Enjoy.
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has already been announced, so the only thing left is the Mali-T658 coming with the Exynos 5450 (doubtful when that will release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has already been announced, so the only thing left is the Mali-T658 coming with the Exynos 5450 (doubtful when that will release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad:crying:
32fps is no go...lets hope it's not final
hamdir said:
32fps is no go...lets hope it's not final
It needs to, but it will be OK for a new Nexus 7
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true, check GLBenchmark: in the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Unoptimised games lag, which is a big letdown, and Tegra effects can also be used on other phones with Chainfire3D. I use it and Tegra games work without lag, with effects, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly, for the world's dominant graphics-card provider.
The first, Tegra 2: its GPU is a little bit better in benchmarks than the SGX540 of the Galaxy S, but it lacks NEON support.
The second, Tegra 3: its GPU is nearly the same as the old Mali-400 MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 also; only Sony, who has cross-licensing with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimization porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no; games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices. Equally, Chainfire is now outdated and no longer updated.
Now, about PowerVR: they are only better in a true multi-core configuration, which is only used by Apple and Sony's Vita, and eats a large die area, i.e. actual multiple cores each with its own sub-cores/shaders. If Tegra were used in a true multi-core setup it would destroy them all.
Finally, it's really funny, all this doom and gloom over an early, discarded development-board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any kind of final-device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe only 2x the T3 scores with 600% more shaders!!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform well below expectations until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR; it isn't just stuck with Apple, it's simply that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone against a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18fps, then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad:crying:
Well said mate!
I can understand how you feel; nowadays Android players like Samsung and Nvidia are focusing more on CPU than GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU at its full potential (at least when gaming).
I have a Galaxy S2: Exynos 4 at 1.2GHz with the Mali GPU overclocked to 400MHz.
In my experience most modern games like MC4 and NFS:MW aren't running at 60 FPS at all; that's because the GPU is always at 100% load while the CPU is relaxing, sitting at 50-70% of its total workload.
I know some games aren't optimized for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have slower GPUs (than the iPad 4 at least).
AFAIK the Galaxy S IV is likely to pack the T-604 with some tweaks instead of the mighty T-658, which would still be slower than the iPad 4.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5 is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad:crying:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
MrPhilo said:
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR; it isn't just stuck with Apple, it's simply that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone against a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18fps, then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Firstly, please keep it civil; don't go around saying that people know nothing, as people's posts always speak volumes. Also, calling people geeks, on XDA? Is that even an insult? Next you'll be asking what I deadlift:laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations for a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), omg Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to drive similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle and physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games and benchmarks will scale in linear fashion, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's Technical Marketing Manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times Tegra 3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying one. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
hamdir said:
Tegra optimization porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no; games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices. Equally, Chainfire is now outdated and no longer updated.
Looks like they haven't updated Chainfire3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I might just have been unlucky, but some Gameloft games lagged on the Tegra device I had, though root solved it to an extent.
I am not saying one thing is superior to another, just going by my personal experience; I might be wrong, I might not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or where you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral between Google's and Apple's ecosystems, and I'm not wiping any scratches off my devices.
Interesting results here. Everybody has been saying the G2 is quicker and better than the Note 3, and I must say I am quite shocked with these findings so far.
http://thedroidguy.com/2013/09/sams...-sony-xperia-z1-vs-lg-g2-benchmark-comparison
I don't care. The N3 is the better phone.
Oh, I don't disagree, I agree 100%; that is why I have a Note 3 coming and I'm not stopping at Verizon today to see the overrated G2!
Hah, the G2 is like an on-screen-buttoned Galaxy S4. LG is copying Samsung on many things these days -_-
Blackwolf10 said:
Hah, the G2 is like an on-screen-buttoned Galaxy S4. LG is copying Samsung on many things these days -_-
I know right! Everything looks almost the same. It's like a dev just made a rooted S4 with some new UI looks!
Here's a potential difference. There are two versions of S-800; MSM8974 and MSM8974AB. Here's AnandTech's take...
Xiaomi makes the first (to my knowledge) public disclosure of MSM8974AB, which is analogous to the changes we saw between APQ8064 and APQ8064AB. From 8974 to 8974AB, Adreno 330 GPU clocks climb from 450 MHz to 550 MHz, LPDDR3 memory interface maximum data rates go from 800 MHz to 933 MHz, and the ISP clock domain (I think Xiaomi might mean the Hexagon DSP here) goes from 320 MHz to 465 MHz. 8974 comes in both a bin with the 4 Krait 400 CPUs clocked at 2.2 GHz (really 2.15 GHz) and 2.3 GHz (2.26 GHz) with slightly different pricing, while 8974AB comes with a Krait 400 clock available only at 2.3 GHz. Process is still TSMC 28nm HPM, but I suspect that the AB variant might have the high k dielectric and/or transistor mix tuned slightly differently based on a few rumblings I've heard recently.
The S-600 in the SGS4 was "AB" so the S-800 in the N3 might be also. We'll find out when more detailed reviews start to come out.
From AnandTech discussing the SGS4's S-600 chip...
That brings us to the Galaxy S 4. It's immediately apparent that something is different here because Samsung is shipping the Snapdragon 600 at a higher frequency than any other OEM. The Krait 300 cores in SGS4 can run at up to 1.9GHz vs. 1.7GHz for everyone else. Curiously enough, 1.9GHz is the max frequency that Qualcomm mentioned when it first announced Snapdragon 600.
Samsung is obviously a very large customer, so at first glance we assumed it could simply demand a better bin of Snapdragon 600 than its lower volume competitors. Looking a bit deeper however, we see that the Galaxy S 4 uses something different entirely.
Digging through the Galaxy S 4 kernel source we see references to an APQ8064AB part. As a recap, APQ8064 was the first quad-core Krait 200 SoC with no integrated modem, more commonly referred to as Snapdragon S4 Pro. APQ8064T was supposed to be its higher clocked/Krait 300 based successor that ended up with the marketing name Snapdragon 600. APQ8064AB however is, at this point, unique to the Galaxy S 4 but still carries the Snapdragon 600 marketing name.
If we had to guess, we might be looking at an actual respin of the APQ8064 silicon in APQ8064AB. Assuming Qualcomm isn't playing any funny games here, APQ8064AB may simply be a respin capable of hitting higher frequencies. We'll have to keep a close eye on this going forward, but it's clear to me that the Galaxy S 4 is shipping with something different than everyone else who has a Snapdragon 600 at this point.
BarryH_GEG said:
Here's a potential difference. There are two versions of S-800; MSM8974 and MSM8974AB. Here's AnandTech's take...
Xiaomi makes the first (to my knowledge) public disclosure of MSM8974AB, which is analogous to the changes we saw between APQ8064 and APQ8064AB. From 8974 to 8974AB, Adreno 330 GPU clocks climb from 450 MHz to 550 MHz, LPDDR3 memory interface maximum data rates go from 800 MHz to 933 MHz, and the ISP clock domain (I think Xiaomi might mean the Hexagon DSP here) goes from 320 MHz to 465 MHz. 8974 comes in both a bin with the 4 Krait 400 CPUs clocked at 2.2 GHz (really 2.15 GHz) and 2.3 GHz (2.26 GHz) with slightly different pricing, while 8974AB comes with a Krait 400 clock available only at 2.3 GHz. Process is still TSMC 28nm HPM, but I suspect that the AB variant might have the high k dielectric and/or transistor mix tuned slightly differently based on a few rumblings I've heard recently.
The S-600 in the SGS4 was "AB" so the S-800 in the N3 might be also. We'll find out when more detailed reviews start to come out.
From AnandTech discussing the SGS4's S-600 chip...
That brings us to the Galaxy S 4. It's immediately apparent that something is different here because Samsung is shipping the Snapdragon 600 at a higher frequency than any other OEM. The Krait 300 cores in SGS4 can run at up to 1.9GHz vs. 1.7GHz for everyone else. Curiously enough, 1.9GHz is the max frequency that Qualcomm mentioned when it first announced Snapdragon 600.
Samsung is obviously a very large customer, so at first glance we assumed it could simply demand a better bin of Snapdragon 600 than its lower volume competitors. Looking a bit deeper however, we see that the Galaxy S 4 uses something different entirely.
Digging through the Galaxy S 4 kernel source we see references to an APQ8064AB part. As a recap, APQ8064 was the first quad-core Krait 200 SoC with no integrated modem, more commonly referred to as Snapdragon S4 Pro. APQ8064T was supposed to be its higher clocked/Krait 300 based successor that ended up with the marketing name Snapdragon 600. APQ8064AB however is, at this point, unique to the Galaxy S 4 but still carries the Snapdragon 600 marketing name.
If we had to guess, we might be looking at an actual respin of the APQ8064 silicon in APQ8064AB. Assuming Qualcomm isn't playing any funny games here, APQ8064AB may simply be a respin capable of hitting higher frequencies. We'll have to keep a close eye on this going forward, but it's clear to me that the Galaxy S 4 is shipping with something different than everyone else who has a Snapdragon 600 at this point.
So that could be why we are seeing higher scores in the Note 3 tests?
Why are people knocking the G2? It's the second-fastest device on the market. It has an amazing screen-to-body ratio and a very nice battery. Its camera is also one of the best. I would never consider it because I can never go back below 5.5 inches and I can't stand on-screen buttons. But that phone should make a lot of people very happy.
Techweed said:
Why are people knocking the G2? It's the second-fastest device on the market. It has an amazing screen-to-body ratio and a very nice battery. Its camera is also one of the best. I would never consider it because I can never go back below 5.5 inches and I can't stand on-screen buttons. But that phone should make a lot of people very happy.
Click to expand...
Click to collapse
I'm not saying it's not a nice phone, but nothing about it "wows" me. It looks worse than TouchWiz, which I'm not a huge fan of but is OK (Sense is my favorite). The phone doesn't have an SD card or a removable battery, also a no-no (why I didn't buy the HTC One). The Note 3 has better specs, an S Pen, and loads of new features. The G2 looks like a rooted S4 running a launcher, and I wasn't impressed by the S4. So with that being said, this is just a tad faster S4 with almost the same look. Now you may say the Note 3 looks the same as the S4, and while it does, it at least carries an SD card and a removable battery, and the dev support should be behind Sammy. Also, I do remember LG making an Intuition, Revolution, Lucid? Whatever happened to those? Oh that's right, they fell through the cracks. LG just can't compete with Samsung, HTC, or even Motorola right now.
oneandroidnut said:
Interesting results here. Everybody has been saying the G2 is quicker and better than the Note 3, and I must say I am quite shocked by these findings so far.
http://thedroidguy.com/2013/09/sams...-sony-xperia-z1-vs-lg-g2-benchmark-comparison
Click to expand...
Click to collapse
Everybody? Who's saying that?
BTW, that article is useless. They're combining results from various places - PhoneArena, GSMArena, etc.
They took GN3 numbers from here: http://blog.gsmarena.com/the-first-benchmarks-scores-of-samsung-galaxy-note-3-are-in/
They also added some from PhoneArena: http://www.youtube.com/watch?v=NBwq0iAoVzQ
One major thing everyone forgets is that running benchmarks on display models at launch events is plain wrong.
A] Most phones at such events (IFA, CES, MWC) are always charging. You should never benchmark while the phone is charging.
B] Have you ever seen any 'reviewer' at those shows reboot the phone before running benchmarks? These display phones are abused by tech journos. Tons of things would be running in the background; nobody bothers to clear the memory by rebooting even once. What's the point of such a benchmark? Not to mention the thermal envelope after these phones have been used continuously.
C] The G2 was running release firmware; the other two phones were running pre-release versions.
(IMO) AnTuTu shouldn't be considered a good benchmark. A benchmark tool must provide consistent, repeatable results. If you run AnTuTu 5 times, I guarantee you will get varying results most of the time. No wonder AT doesn't like using AnTuTu.
Benchmarks never killed a phone :angel::angel:
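On the repeatability point, here's a minimal sketch of how you could quantify the run-to-run spread yourself; the five scores are hypothetical placeholders, not real AnTuTu results:

Code:
import java.util.Arrays;

// Minimal sketch: quantify run-to-run spread of a benchmark using the
// coefficient of variation. The scores below are hypothetical placeholders.
public class BenchmarkSpread {
    public static void main(String[] args) {
        double[] runs = {25800, 24100, 26500, 23900, 25200};

        double mean = Arrays.stream(runs).average().orElse(0);
        double variance = Arrays.stream(runs)
                .map(x -> (x - mean) * (x - mean))
                .average().orElse(0);
        double stdDev = Math.sqrt(variance);

        System.out.printf("mean=%.0f  stddev=%.0f  CV=%.1f%%%n",
                mean, stdDev, 100.0 * stdDev / mean);
    }
}

If the coefficient of variation on a single phone is already several percent, a 5-10% gap between two phones in one run doesn't tell you much.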
CLARiiON said:
Everybody? Who's saying that?
BTW, that article is useless. They're combining results from various places - PhoneArena, GSMArena, etc.
They took GN3 numbers from here: http://blog.gsmarena.com/the-first-benchmarks-scores-of-samsung-galaxy-note-3-are-in/
They also added some from PhoneArena: http://www.youtube.com/watch?v=NBwq0iAoVzQ
One major thing everyone forgets is that running benchmarks on display models at launch events is plain wrong.
A] Most phones at such events (IFA, CES, MWC) are always charging. You should never benchmark while the phone is charging.
B] Have you ever seen any 'reviewer' at those shows reboot the phone before running benchmarks? These display phones are abused by tech journos. Tons of things would be running in the background; nobody bothers to clear the memory by rebooting even once. What's the point of such a benchmark? Not to mention the thermal envelope after these phones have been used continuously.
C] The G2 was running release firmware; the other two phones were running pre-release versions.
(IMO) AnTuTu shouldn't be considered a good benchmark. A benchmark tool must provide consistent, repeatable results. If you run AnTuTu 5 times, I guarantee you will get varying results most of the time. No wonder AT doesn't like using AnTuTu.
Benchmarks never killed a phone :angel::angel:
Click to expand...
Click to collapse
I hate benchmarks at events; real-life situations are where it's at. We just need to wait till some more Note 3s make it into the wild.
Sent from my Nexus 7 using Tapatalk 2
oneandroidnut said:
Everybody has been saying the G2 is quicker and better than the Note 3
Click to expand...
Click to collapse
Why would anyone say that? No one even has the Note 3, so we have to default to expectations. Why would anyone expect the similar but faster-clocked phone to be slower?
dscline said:
Why would anyone say that?
Click to expand...
Click to collapse
Show "anyone" this. All the tests were conducted by the same source; GSMArena.
[GSMArena benchmark charts: Benchmark Pi, AnTuTu, Linpack, Egypt (Offscreen), T-Rex (Offscreen), SunSpider]
BarryH_GEG said:
Show "anyone" this. All the tests were conducted by the same source; GSMArena.
[GSMArena benchmark charts: Benchmark Pi, AnTuTu, Linpack, Egypt (Offscreen), T-Rex (Offscreen), SunSpider]
Click to expand...
Click to collapse
No G2 on that list, though.
oneandroidnut said:
No G2 on that list, though.
Click to expand...
Click to collapse
Enjoy -- http://www.gsmarena.com/lg_g2-review-982p5.php
oneandroidnut said:
No G2 on that list, though.
Click to expand...
Click to collapse
Oops, I thought "anyone" was saying the N2 was faster than the N3. My bad.
Here are the G2 numbers, again all from a single source: GSMArena.
[GSMArena benchmark charts: Benchmark Pi, Linpack, AnTuTu, Egypt (Offscreen), T-Rex (Offscreen), SunSpider]
In case anyone's bummed about the lower AnTuTu score, here's a score taken from a production unit reviewed by a Russian site. GSMArena conducted their tests on demo units at the Berlin launch event. Based on these scores I'd bet anyone here that the N3 is using an "AB" chip where the XZ Ultra and LG G2 aren't. So, at least for the time being, the N3's the fastest Android device on the planet.
But not to be a buzzkill: the SGS4 got fantastic benchmarks but had some lag in early s/w releases due to the ton-'o-crap Samsung had loaded on it. It improved over time, and the N3 has more RAM, so I'm hoping benchmarks translate into "feel."
http://translate.googleusercontent....v.html&usg=ALkJrhha6VTm0y89eM70OxVC5rPRLSw6nw
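If anyone wants to sanity-check the "AB" theory on a unit in hand, the bins differ mainly in max CPU clock (nominally 2.2 vs 2.3 GHz), which the kernel exposes through the standard Linux cpufreq sysfs node. A rough sketch; whether the file is world-readable depends on the ROM:

Code:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

// Rough sketch: read the kernel-reported max CPU frequency (in kHz) from the
// stock Linux cpufreq sysfs node for cpu0.
public class MaxCpuClock {
    public static void main(String[] args) {
        String path = "/sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq";
        BufferedReader reader = null;
        try {
            reader = new BufferedReader(new FileReader(path));
            long khz = Long.parseLong(reader.readLine().trim());
            System.out.printf("cpu0 max clock: %.2f GHz%n", khz / 1e6);
        } catch (IOException | RuntimeException e) {
            System.err.println("Could not read " + path + ": " + e);
        } finally {
            if (reader != null) {
                try { reader.close(); } catch (IOException ignored) {}
            }
        }
    }
}

Going by the AnandTech numbers quoted above, roughly 2.26 GHz reported would point at the 2.3 GHz bin, ~2.15 GHz at the 2.2 GHz one.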
BarryH_GEG said:
Oops, I thought "anyone" was saying the N2 was faster than the N3. My bad.
Here are the G2 numbers, again all from a single source: GSMArena.
[GSMArena benchmark charts: Benchmark Pi, Linpack, AnTuTu, Egypt (Offscreen), T-Rex (Offscreen), SunSpider]
In case anyone's bummed about the lower AnTuTu score, here's a score taken from a production unit reviewed by a Russian site. GSMArena conducted their tests on demo units at the Berlin launch event. Based on these scores I'd bet anyone here that the N3 is using an "AB" chip where the XZ Ultra and LG G2 aren't. So, at least for the time being, the N3's the fastest Android device on the planet.
But not to be a buzzkill: the SGS4 got fantastic benchmarks but had some lag in early s/w releases due to the ton-'o-crap Samsung had loaded on it. It improved over time, and the N3 has more RAM, so I'm hoping benchmarks translate into "feel."
http://translate.googleusercontent....v.html&usg=ALkJrhha6VTm0y89eM70OxVC5rPRLSw6nw
Click to expand...
Click to collapse
Thanks man! I can't wait to get my hands on one! And I don't know who would keep an N2 over the N3 lol
All I know is that my S4 always benches higher than my HTC One, with the S4 using the "higher"-binned S600.
In real-world use, the HTC One felt twice as fast as the S4. Even rooted and running a custom debloated ROM and a kernel overclocked to 2.1GHz, the S4 was still laggy and much, MUCH slower than a stock HTC One. The S4 would lag and stutter all over the place despite showing superior numbers, so I now take benchmarks with a grain of salt.
I'm really hoping Samsung gets it together and, instead of just showing higher benchmark numbers, actually performs in real-world use like the numbers indicate.
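Since the synthetic numbers and real-world "feel" keep diverging, one way to put a number on the lag you actually see is to log frame times instead of benchmark points. A minimal sketch using Android's Choreographer (API 16+); the tag and the 1.5x-budget threshold are my own choices:

Code:
import android.util.Log;
import android.view.Choreographer;

// Minimal sketch: watch vsync-aligned frame callbacks and log any frame that
// blows well past the ~16.7 ms budget of a 60 Hz panel.
public class JankMonitor implements Choreographer.FrameCallback {
    private static final String TAG = "JankMonitor";
    private static final double FRAME_BUDGET_MS = 1000.0 / 60.0;

    private long lastFrameTimeNanos = 0;

    // Must be called from a thread with a Looper (normally the UI thread).
    public void start() {
        Choreographer.getInstance().postFrameCallback(this);
    }

    @Override
    public void doFrame(long frameTimeNanos) {
        if (lastFrameTimeNanos != 0) {
            double frameMs = (frameTimeNanos - lastFrameTimeNanos) / 1e6;
            if (frameMs > FRAME_BUDGET_MS * 1.5) {
                Log.w(TAG, String.format("Janky frame: %.1f ms", frameMs));
            }
        }
        lastFrameTimeNanos = frameTimeNanos;
        Choreographer.getInstance().postFrameCallback(this); // keep watching
    }
}

Kick it off with new JankMonitor().start() on the UI thread while scrolling around TouchWiz or Sense; janky frames per minute tracks perceived smoothness much more closely than an AnTuTu score does.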
I'm using an LG G2 right now while waiting for my GNote3, and so far I am IN LOVE with the G2. It's hands down the fastest device I've ever used; nothing slows this thing down, and I have yet to encounter a hint of lag or micro-stuttering. Battery life matches or exceeds my Note 2, which I thought was incredible, so I'm not too worried about the non-removable battery anymore. The screen is by far the best display I have seen, and the camera is amazingly good with OIS. In my opinion the S4 is not even in the same league as the G2, hardware- or software-wise. I really loved my Note 2 and have my fingers crossed the Note 3 doesn't have the incredibly frustrating laggy experience that plagued both my S4s. I would really love to keep the Note 3 as my main device because I actually use the S Pen a lot.
Dan37tz said:
I'm using an LG G2 right now while waiting for my GNote3, and so far I am IN LOVE with the G2. It's hands down the fastest device I've ever used; nothing slows this thing down, and I have yet to encounter a hint of lag or micro-stuttering. Battery life matches or exceeds my Note 2, which I thought was incredible, so I'm not too worried about the non-removable battery anymore. The screen is by far the best display I have seen, and the camera is amazingly good with OIS. In my opinion the S4 is not even in the same league as the G2, hardware- or software-wise. I really loved my Note 2 and have my fingers crossed the Note 3 doesn't have the incredibly frustrating laggy experience that plagued both my S4s. I would really love to keep the Note 3 as my main device because I actually use the S Pen a lot.
Click to expand...
Click to collapse
The G2 could be considered a "next gen" phone because of S-800 and the additional features LG's provided. The One and SGS4 with S-600 are previous generation phones. Sadly for SGS_ owners, their device is released before the N_ is and Samsung learns from issues with the SGS_ what not to do in the N_. The SGS3 Exynos with 1GB of RAM vs 2GB in the N2 is a good example.
I share your fears though. The launch s/w on the SGS4 was pretty bad. But I'm hoping that 3GB of RAM, S-800 "AB," and "lessons learned" will make the N3 as big an improvement over the SGS4 as the N2 was over the SGS3. I had no issues with the stock unrooted performance of the N2.
As for "fastest" that's subjective. I don't personally get off on millisecond faster screen transitions as much as I do on 30% faster browser performance which Sunspider indicates the N3 achieves over the G2. Where Samsung phones are "fast" for me is in how, through their features, they allow me to get stuff done faster and in ways I can't with other manufacturer’s devices.
I also don't consider the G2 in any way a competitor to the N3. One's clearly a "phone" and the other's clearly a "phablet," with S Pen/S Note making the difference even greater. And the G2's lack of expandable storage is a step back, not forward. That and the non-removable battery take it off my shopping list even if I were considering a "phone."
BarryH_GEG said:
I share your fears though. The launch s/w on the SGS4 was pretty bad. But I'm hoping that 3GB of RAM, S-800 "AB," and "lessons learned" will make the N3 as big an improvement over the SGS4 as the N2 was over the SGS3. I had no issues with the stock unrooted performance of the N2.
Click to expand...
Click to collapse
For the "AB" thing, I think, then, Note 3 is supposed to have Adreno 330 clocked at 550 MHz. Have you find any info regarding that?
BarryH_GEG said:
I also don't consider the G2 in any way a competitor to the N3. One's clearly a "phone" and the other's clearly a "phablet," with S Pen/S Note making the difference even greater. And the G2's lack of expandable storage is a step back, not forward. That and the non-removable battery take it off my shopping list even if I were considering a "phone."
Click to expand...
Click to collapse
Apart from your buying preferences, when it comes to image stabilization, how do you see the Note 3 vs the G2 in terms of "smart stabilization" vs OIS?