Part 2/5 for The Performance Series - EVO 4G Themes and Apps

Doing a 5-part series at The Droid Demos. Check it out!
Welcome to the second part in a 5-part series of Android performance benchmarking apps. Neocore is an app that measures OpenGL ES 1.1 performance, or, in plain terms, 3D graphics performance. You get to watch a cool video in the process! It then reports the FPS your phone can display. Download today for a great app to compare Android phones and different ROMs!
Stay tuned for the continuation of the Android Performance Benchmark Series.
Part 2/5 - Neocore for 3D Graphics Performance
For Part 1, check out Benchmark Pi for CPU Performance Benchmarking

Linpack – Most Popular Android CPU Performance Test [Performance Series]
Doing a 5-part series at The Droid Demos. Check it out!
Welcome to the third part in a 5-part series of Android performance benchmarking apps. Linpack, like Benchmark Pi, is an app to measure CPU performance, in this case by having your Android phone solve a dense system of linear equations. It reports your performance in millions of floating point operations per second (MFLOPS). Download today for a great app to compare Android phones, compare ROMs, or just try to beat the high scores of other users!
Stay tuned for the continuation of the Android Performance Benchmark Series.
For Part 3, check out Linpack, the Most Popular CPU Performance Test
For Part 2, check out Neocore for 3D Graphics Performance
For Part 1, check out Benchmark Pi for CPU Performance Benchmarking
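Purely for illustration, here is a toy sketch of what a MFLOPS figure means. The real Linpack benchmark times the solution of a dense system of linear equations; this sketch just times a fixed number of floating-point multiply-adds and divides by the elapsed time, so treat it as a hypothetical illustration rather than the app's actual method.

```python
# Illustrative only: a toy Linpack-style MFLOPS estimate. The real Linpack
# benchmark times the solution of a dense linear system; this sketch just
# times a fixed number of floating-point multiply-adds.
import time

def estimate_mflops(iterations=1_000_000):
    x, y = 1.000001, 0.0
    start = time.perf_counter()
    for _ in range(iterations):
        y = y * x + x  # one multiply and one add per iteration
    elapsed = time.perf_counter() - start
    return (2 * iterations) / elapsed / 1e6  # 2 FLOPs per iteration

print(f"~{estimate_mflops():.1f} MFLOPS (interpreter overhead dominates here)")
```

A native benchmark like Linpack avoids the interpreter overhead shown here, which is why its MFLOPS numbers are a meaningful comparison across phones and ROMs.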


Fps2D – Test Your Android’s Frames Per Second Performance [Performance Series]

Doing a 5-part series at The Droid Demos. Check it out!
Welcome to the fourth part in a 5 part series of Android performance benchmarking apps. Fps2D, like Neocore, is an app to measure Android’s frames per second performance. However, Fps2D, as the name implies, tests 2D performance rather than the 3D performance that Neocore tests. Because of this, we are able to see the true performance of the EVO after applying the FPS fix. Download today for a great app to compare Android phones or compare ROMs.
Stay tuned for the continuation of the Android Performance Benchmark Series.
For Part 4, check out Fps2D for 2D FPS Benchmarking
For Part 3, check out Linpack, the Most Popular CPU Performance Test
For Part 2, check out Neocore for 3D Graphics Performance
For Part 1, check out Benchmark Pi for CPU Performance Benchmarking

Android benchmarks vs iPhone 4

Is there a way to take benchmark tests from Android and iOS devices that register on the same scale? Not JavaScript benchmarks, but native benchmarks like Quadrant Standard or Neocore.

Electopia Benchmark

For giggles, can one of you that's stock run the Electopia benchmark? There's been some interesting results and it would be cool to see how another dual-core phone with a different CPU/GPU performs. The Sensation folks are obviously not amused.
Sensation
800x480
Average FPS: 23.65
Time: 60
Number of Frames: 1419
Trianglecount: 48976
Peak Trianglecount: 68154
960x540
Average FPS: 19.90
Time: 60.01
Number of Frames: 1194
Trianglecount: 49415
Peak Trianglecount: 67076
SGS2
Average FPS: 37.58
Time: 60.01
Number of frames: 2255
Trianglecount: 48633
Peak trianglecount: 68860
DHD
Average FPS: 23.36
Time: 60.03
Number of frames: 1402
Trianglecount: 48835
Peak trianglecount: 67628
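As a quick sanity check on the figures above, the reported average FPS is just the frame count divided by the elapsed time; a minimal sketch using the Sensation numbers:

```python
# Verify that Average FPS = Number of Frames / Time for the runs above.

def avg_fps(frames, seconds):
    """Average frames per second over a benchmark run."""
    return frames / seconds

print(round(avg_fps(1419, 60.0), 2))   # 23.65 (Sensation, 800x480)
print(round(avg_fps(1194, 60.01), 2))  # 19.9  (Sensation, 960x540)
```

The SGS2 and DHD figures check out the same way, so the reported averages are internally consistent even if the cross-device ranking looks odd.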
Even the Desire HD blew away my G2x on this benchmark but it could be the custom ROM... I'll switch back to AOSP and try it again.
16FPS
Can't be right, my Thunderbolt smoked my g2x
26 FPS Thunderbolt vs 16FPS G2x
Something is very wrong with those numbers if this is supposed to be measuring opengl 2.0
I'm on stock, and I had a really hard time getting it to respond to touch input. With the sound off, here are the scores:
Average FPS - 15.56
Time - 60.04
Number of Frames - 934
Trianglecount - 48928
Peak Trianglecount - 68838
This was a super buggy program on the G2x. I think it is definitely not optimized for dual core or at least the Tegra 2 architecture.
Sent from my T-Mobile G2x using XDA App
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
BarryH_GEG said:
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
And there is no way the g2x could be lower than a single core adreno 205 Thunderbolt.
15.57 FPS for me running stock/not rooted. Like previously mentioned, it was very unresponsive to touch.
Badly designed benchmark programs are bad.
diablos991 said:
Badly designed benchmark programs are bad.
The sad part is that this isn't just a benchmark - it's a game first and foremost.
And yeah, I can't get past 16FPS at stock speed OR at 1.5GHz, so I think there are definitely coding issues, as Nenamark using Trinity on the Bionic scores 72FPS. I think my Inspire (Adreno 205) got about 35?
+1
Let's all buy phones with top benchmarks!!!!!!
Better yet let's all get iPhones.....
Fu*k a benchmark
Sent from my LG-P999 using XDA Premium App
But if you really stop and think about it, if each of the different CPU/GPUs behaves differently running the same software because of proprietary hardware performance tweaks, we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations, even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app, or release different apps for different CPU/GPUs. It's way too early to tell, as dual-core and 2.3ish isn't that common now, but it should be interesting watching software performance and development play out in the future.
BarryH_GEG said:
But if you really stop and think about it, if each of the different CPU/GPUs behaves differently running the same software because of proprietary hardware performance tweaks, we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations, even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app, or release different apps for different CPU/GPUs. It's way too early to tell, as dual-core and 2.3ish isn't that common now, but it should be interesting watching software performance and development play out in the future.
It's piss-poor coding on the app developer's part, plain and simple. While there are Tegra 2-specific instructions that an app developer can use in their application, there are no mobile OpenGL ES 2.0 instructions the Tegra 2 doesn't support, as far as I am aware.
If you want a good challenge for the chip, download an3dbench XL from Market. I just scored 32640 and that's with a bunch of background apps.
Isn't this a Windows Mobile port (I had it on my HD2 running WM6.5)? So how does it provide an accurate representation of gaming on an Android device? Since it is the only bench my G2x has scored poorly on and, more importantly, real-world gaming is spectacular on this thing, I'm going to say it doesn't. I wouldn't put a whole lot of stock in this one...
Yeah agreed. I just ran it on the Nexus/CM7 AOSP hybrid and it still was only 16.06 while I got almost 40,000 on an3dbenchXL which put me like 30-something out of 7000ish results.
This application was influenced by Qualcomm specifically to run poorly on Tegra 2 devices. They messed with the shaders so everything is rendered at a weird angle. If you change the code to run with a normal approach, you see the same results on Qualcomm chips but also 3-5x perf on NVIDIA chips
Why would you say this benchmark was influenced? If you have the sources, please share so we can all look. And how can you say BenchXL is a good benchmark? I have run the BenchXL benchmark and seen inconsistent results on many forums... it is very unreliable, not a good benchmark. At least Electopia gives consistent, reliable results. I would go with Electopia as a GPU benchmark.
i have a xperia play for myself - which performs superb for gaming - awesome graphics - i love the games on it - awesome device. my wife has g2x - which is equally good for gaming (though she just uses it for texting - LOL )....
i think for gaming both xperia play and g2x are good...
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
What's more, I ran it myself with the latest firmware at stock frequencies (SGS2, btw) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite a funny difference compared to any other device, I might say.
It's not biased towards any manufacturer, it is biased against NVIDIA's ULP GeForce GPUs in Tegra 2 SOCs.
Changes to the code cause increases in performance on Tegra 2 devices, while results on other platforms do not change.
In general, there is never a single, all-encompassing GPU benchmark to accurately compare devices. It all depends on the code, and how it interacts with the specific hardware of the device.
Source: Anandtech Samsung Galaxy S2 review (I can't post links )
http://images.anandtech.com/graphs/graph4177/35412.png
That AnandTech review is badly outdated, like I said: the SGS2, for example, gets 16fps there as of February, while I myself get 58fps today.
And I don't think it's biased against Tegra. Tegra performs pretty much where it should be considering its age, and corresponds to its specs.
And just to dismiss your point that Tegra gets a different codepath, I ran the Electopia bench again via Chainfire3D using the NVIDIA GL wrapper plugin emulating said device, and I'm still getting the same amount of FPS.
If what you're saying is that it's not utilizing Tegra's full potential through proprietary NVIDIA OpenGL extensions, we might as well pack the bag and leave, because that logic would apply to pretty much every graphics core, since the benchmark isn't optimized for any of them. What we see here in these benchmarks is a plain, simple ES 2.0 codepath which all devices should support, so we can do an apples-to-apples comparison. It's also one of the most fragment-shader-dependent benchmarks out there at the moment, and less geometry- and texture-bound, and that's why it runs so badly on pretty much every chip, since they don't get this type of workload in other benchmarks. This is also why the Mali gets such high FPS, as that's where the quad-GPU setup in the Exynos can shine.
AndreiLux said:
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
What's more, I ran it myself with the latest firmware at stock frequencies (SGS2, btw) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite a funny difference compared to any other device, I might say.
Clearly the Mali-400 in the SGS2 is the most powerful GPU right now. There is a 60fps limit on the Galaxy S2, so you'll need a demanding benchmark to see it. You can also see that in Nenamark2: SGS2 = 47fps, G2X = 28fps, SGS = 24fps.

Samsung Exynos 5250 - Arndale development board.

For all those interested in developing for the Exynos 5250, to be used in the Nexus 10, Samsung has kindly launched the Arndale development board for a modest sum.
http://www.arndaleboard.org/wiki/index.php/Main_Page
It has already been benchmarked on the GL Benchmark site. The Mali-T604 is powerful, but it doesn't look like it will give the A6X any headaches.
http://www.glbenchmark.com/phonedet...o25&D=Samsung+Arndale+Board&testgroup=overall
Need a proper benchmark done on a final device. I definitely can't believe the dev board drivers are optimized properly, and it's running 4.0.4. We should get more details once we benchmark a final version of the N10.
hot_spare said:
Need a proper benchmark done on a final device. I definitely can't believe the dev board drivers are optimized properly, and it's running 4.0.4. We should get more details once we benchmark a final version of the N10.
Jelly Bean didn't do much for graphics benchmarks, IIRC. The low-level tests won't change much. If you do a comparison with the iPad 3, you can see that the PowerVR SGX543MP4 is a beast in terms of pixel/texture fill rate, which the A6X will improve on further. The consensus is that shader power is the most important, as long as there is sufficient fill-rate performance, and the Mali-T604 combined with its good bandwidth should be as capable as the A6X in real-world games. The only question is whether developers will optimise a game for just one tablet.
Turbotab said:
Jelly Bean didn't do much for graphics benchmarks, IIRC. The low-level tests won't change much. If you do a comparison with the iPad 3, you can see that the PowerVR SGX543MP4 is a beast in terms of pixel/texture fill rate, which the A6X will improve on further. The consensus is that shader power is the most important, as long as there is sufficient fill-rate performance, and the Mali-T604 combined with its good bandwidth should be as capable as the A6X in real-world games. The only question is whether developers will optimise a game for just one tablet.
I am not saying that JB will suddenly improve GPU benchmarks, but a lot of improvement can happen due to driver/firmware optimization.
Let me give you a real example: do you recall the GS2's GLBenchmark Egypt offscreen scores when it came out? It was getting around 40-42fps initially.
[Source: http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17 ]
The same GS2 was getting 60-65fps under the same test a few months later.
Source 1: http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
Source 2: http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
It's a clear 50% improvement in performance, achieved primarily through driver optimization.
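The claimed improvement checks out against the quoted fps ranges; a quick sketch using the midpoints of the figures above:

```python
# Midpoints of the fps ranges quoted from the AnandTech results above.
initial = (40 + 42) / 2   # fps when the GS2 first shipped
later = (60 + 65) / 2     # fps a few months later, same test
gain = (later - initial) / initial
print(f"{gain:.0%}")  # 52%, consistent with "a clear 50% improvement"
```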
Also, take a proper look at the fill rate in the Arndale board test. It's much less than expected: ARM says that a Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s, but it's actually showing only about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Also check this slide : http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Samsung says 2.1 GPixels/s with the GPU clocked at 533MHz. Obviously the results don't match the quoted numbers; the difference is substantial.
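The fill-rate arithmetic above can be sketched as follows. Note the 4 pixels-per-clock figure is my own inference from ARM's quoted 2 GPixels/s at 500MHz, not an official spec:

```python
# Theoretical fill rate = pixels per clock x clock frequency.
# The 4 px/clock figure is inferred from ARM's 2 GPixels/s @ 500MHz quote.

def theoretical_fill_rate(pixels_per_clock, clock_hz):
    return pixels_per_clock * clock_hz  # pixels per second

arm_quote = theoretical_fill_rate(4, 500e6)      # 2.0e9, ARM's 2 GPixels/s
samsung_quote = theoretical_fill_rate(4, 533e6)  # ~2.13e9, Samsung's 2.1 GPixels/s

# The Arndale result is reportedly only ~60% of the theoretical number:
measured = 0.60 * arm_quote
print(measured / 1e9)  # ~1.2 GPixels/s observed
```

Whether the gap comes from immature drivers, thermal throttling on the dev board, or marketing numbers counting pixels differently (see the ARM blog post linked above) is exactly what a final Nexus 10 benchmark should settle.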
I believe the final Nexus 10 numbers will be quite different from what we see now. Let's wait for final production models.
