Looking back, when I switch phones it is usually when there is a better device out with a significant improvement over my current one. My first smartphone was the T-Mobile MDA (HTC Wizard), which I bought roughly 5 years ago. The next phone was the T-Mobile Wing (HTC Atlas); with a much smaller form factor and a faster CPU, the device was a great improvement.
My next device was my first real HTC-branded phone, the Touch Diamond. The Diamond was a complete overhaul from the other two HTC phones I had used, and I loved every little part of it. But going from the Diamond to the glamorous HD2 was even more amazing; the screen, the size, everything was perfect.
Now the question: the HD2 has been out for almost a year and I am ready to get a new phone, but I am wondering what I should consider.
I don't think that the Droid X or the Galaxy S smartphones are really all that much better than the HD2, so I am more interested in the Cortex-A9 phones that are slowly trickling into the market.
The CPUs that will have Cortex-A9 dual-core tech are as follows:
Nvidia Tegra 2: 1GHz, custom high-profile graphics (Motorola Olympus, LG Star)
Qualcomm Snapdragon 3rd gen: 1.2GHz/1.5GHz, Adreno 220 (Verizon HTC phone)
Samsung Orion: 1GHz, Mali 400 (Nexus S)
Texas Instruments OMAP 4: 1GHz+, PowerVR SGX 540 (PandaBoard)
Marvell Armada 628: 1.5GHz + custom 624MHz DSP, custom high-profile graphics
ST-Ericsson U8500: 1.2GHz, Mali 400
So basically, what should I do? Wait for all of them to come out and then decide, or get whichever comes out first?
I want the best processing power with the greatest graphics, and was leaning toward Tegra 2, but I found that OpenGL ES benchmarks show values for the Tegra 2 platform that are lower than the SGX 540's.
Galaxy Tab Results:
http://www.glbenchmark.com/phonedetails.jsp?D=Samsung GT-P1000 Galaxy Tab&benchmark=glpro11
Folio 100:
http://www.glbenchmark.com/phonedetails.jsp?D=Toshiba Folio 100&benchmark=glpro11
Are these results due to poor drivers, or is Tegra really weaker than the SGX 540 (and thus weaker than the Mali 400)?
Is the Nexus S a better choice than the Motorola Olympus, or should I wait for HTC's addition to the game with a 3rd-gen Snapdragon? Will the Adreno 220 GPU outpower the Tegra 2 and Mali 400? What do you guys think, and what do you plan on doing?
Well, firstly, better hardware means nothing if the software is the bottleneck. Secondly, we've often seen that on Android the grunt of the CPU contributes more to program performance than the GPU. Thirdly, you're going to have to wait, see, buy and test these platforms to know which ones are superior... but here is what I've discovered during the course of 2010.
SoCs for 2011:
(listed in what I believe is the order from best to worst)
+ ARM Sparrow: Dual-core Cortex A9 @2.00GHz (on 32nm die), unspecified GPU
+ TI OMAP 4440: Dual-core Cortex A9 @1.5GHz, SGX 540 (90M t/s)
+ Apple A5 (iPad2): Dual-core Cortex A9 @0.9GHz, SGX 543MP2 (130M-150M t/s)
+ Qualcomm MSM8660 (Gen IV Snapdragon): Dual-core Cortex A9 @1.5GHz, Adreno 220 (88M t/s)
+ TI OMAP 4430: Dual-core Cortex A9 @1GHz, SGX 540 (90M t/s)
+ ST-Ericsson U8500: Dual-core Cortex A9 @1.2GHz, ARM Mali 400 (50-80M t/s)
+ Samsung Orion: Dual-core Cortex A9 @1GHz, ARM Mali 400 (50-80M t/s)
+ Nvidia Tegra 2: Dual-core Cortex A9 @1GHz, nVidia ULP-GeForce (71M t/s)
+ Qualcomm Scorpion (Gen III Snapdragon): Dual-core Cortex A8 @1.2GHz, Adreno 220 (88M t/s)
Notes: The SGX530 is roughly half the speed of the SGX535. The SGX540 is twice as fast as the SGX535. The Adreno 205 (41M tri/sec) is supposedly faster than the SGX535 but slower than the SGX540 (thus likely in the middle). The Adreno 220 is twice the speed of the Adreno 205, but it is slightly slower than the SGX540 (88M vs 90M tri/sec). Samsung claims the ARM Mali 400 is 5 times faster than its previous GPU (S3C6410 - 4M tri/sec), about on par (80M tri/sec) with the Adreno 220, but a few leaks benchmarked it as only slightly faster than the SGX535 (40M tri/sec). Details of the GPU used in the Nvidia Tegra 2 have been kept quite contained (little is known). I estimated the Tegra 2 at 71M t/sec (Tegra 2 Neocore = 27fps vs 55fps = Galaxy S Neocore, adjusted for the 62% screen-resolution disadvantage, x 90M t/s of the SGX540 = 71M t/s). And recently some inside rumors via Fudzilla actually confirmed this exact figure, so the GPU inside the Tegra 2 is roughly equivalent to the Mali 400.
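To make the arithmetic behind that estimate easier to follow, here is a minimal sketch that just replays the figures quoted above (the Neocore frame rates and the 62% resolution factor are this post's numbers, not official specs):
Code:
# Rough reproduction of the Tegra 2 triangle-throughput estimate described above.
# All inputs are the figures quoted in the post, not measured values.
tegra2_neocore_fps = 27.0      # Tegra 2 device, Neocore benchmark
galaxys_neocore_fps = 55.0     # Galaxy S (SGX540), Neocore benchmark
resolution_factor = 0.62       # assumed resolution disadvantage of the Tegra 2 device
sgx540_tri_per_sec = 90e6      # SGX540 rated throughput (triangles/sec)

relative_perf = (tegra2_neocore_fps / galaxys_neocore_fps) / resolution_factor
tegra2_estimate = relative_perf * sgx540_tri_per_sec
print(round(tegra2_estimate / 1e6))  # ~71 (million triangles/sec)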
All of these details are based on official announcements, rumors from trustworthy sources and logical estimation, so discrepancies may exist.
Last thoughts: As you can see there is some diversity in the next-gen chips (soon-to-be current-gen), where the top tier (OMAP 4440) is roughly 1.5 times more powerful than the low tier (Tegra 2). However, drivers and software will play a lead role in determining which device can squeeze out the most performance. This factor alone may favour the iPad 2, PlayBook or even MeeGo tablets over the Honeycomb tablets, which are somewhat bottlenecked by the lack of hardware acceleration and the extra translation through the Dalvik VM. I think we've hit the point where we could have some really impressive high-definition entertainment, and even emulate the Dreamcast at decent/full speed.
edit2: Well, Apple has been boasting of 9x the graphical performance of the original iPad. There are two articles on AnandTech, one on Geekbench and processor-specific details from ImgTec (which I dug up from 12 months ago). It has been found that it's a modified Cortex A9, 512MB RAM and the SGX543MP2. Everything points to the SGX543MP2 being significantly faster than the SGX540, and the given number was 133 million polygons per second (theoretical) for the SGX543MP4, which is double the SGX543MP2's performance. The practical figure is always less. ImgTec said the SGX540 is double the grunt of the SGX535; benchmarks show the SGX543MP2 is (on average) five times the grunt of the iPad's SGX535. So going by ImgTec (the designer of the SGX chips), the theoretical value I list above should be 70M t/s... going by Apple's claim it should be 200M t/s... going by benchmarks it should be roughly 130M t/s. ImgTec's value is definitely wrong, since they claimed it's faster than the SGX540, valued at 90M t/s. Apple's claim also seems biased; they take only the best possible conditions and exaggerate them even more. It seems to be somewhere in between, and wouldn't you know it, the average of the two "false" claims is equivalent to the benchmarked value.
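A quick sanity check of that "average of the two claims" reasoning, using only the three rough figures mentioned in this edit:
Code:
# The three SGX543MP2 throughput figures discussed above (triangles/sec).
imgtec_claim = 70e6     # implied by ImgTec's "double the SGX535" statement
apple_claim = 200e6     # implied by Apple's "9x the original iPad" marketing
benchmarked = 130e6     # rough figure suggested by the GLBenchmark results

midpoint = (imgtec_claim + apple_claim) / 2
print(midpoint / 1e6)   # 135.0 - close to the ~130M t/s the benchmarks suggest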
edit3: The benchmarks are out for the 4th-gen QSD, which confirms everything prior. It's competing for top place against the 4440 and the A5. I've changed the post (only updated the chip's name).
If one were to choose between the processor of the A5 and the OMAP4440, they'd be hard pressed to choose between more CPU grunt and more GPU grunt.
Just re-edited the post.
Apple's A5 details are added in; it looks to be one of the best chips for the year.
If I had to choose between the OMAP4440 and the A5, I would probably be reduced to a heads-or-tails coin flip!
Update:
The benchmark results of the Snapdragon MSM8660 are in... and they go further to support the list.
MSM8660 = dual-core A9 + Adreno 220 + Qualcomm modification (for better/worse).
Samsung have the luxury of making their own chips and they are quick to put out new and better versions of them. The Exynos chipset, which debuted with the Samsung Galaxy S II at a 'mere' 1.2GHz is getting a 1.5GHz version, called the Exynos 4212.
Samsung also has a pair of high-end mobile cameras headed for the production line. One is a 16MP main shooter with a back illuminated sensor for better low-light performance (expected to ship in November) and the other is a 1.2MP module with [email protected] capture capabilities for front-facing cameras.
We can't quite make out the Google-translated press release but it seems the front facing camera will have 1/8.2 sensor (that sounds pretty small, but we'll see) and the ISO of the main shooter goes up to 1,600.
Going back to Exynos, it's built using the 32nm process and was designed with 3D performance in mind. Gameloft is apparently showing interest and will offer several titles to put the new SoC to good work.
The Korea-bound Galaxy S II LTE and Galaxy S II HD LTE will sport Exynos chipsets with the CPU clocked at 1.5GHz, which makes them the most likely candidates for being the first phones with the new chipset.
Samsung already has a 1.4GHz version of Exynos that's powering the Galaxy Note and the Galaxy Tab 7.7, but there's no info what kind of change in performance we can expect in the 3D department (beyond the obvious gain from the faster clock speed).
http://www.gsmarena.com/samsung_announces_15ghz_exynos_chipset_16mp_camera-news-3200.php
FWAP FWAP FWAP at the highlighted bits. One thing Samsung is perfectly good at making is chips.
I think this might mean I'll give the Nexus Prime a miss and wait for the Galaxy S III. Probably jizz in my pants when I hold it.
Should be epic; loving the Samsung processors. The iPhone 5 should be feeling scared. Hopefully ICS will be a hit as well.
Let's see what Apple can do and whether it tempts away current Galaxy users.
Sent from my GT-I9100 using XDA App
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
Killer Bee said:
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
The SGX 543MP2 is a lot more than 50% faster than the Mali 400. Really hoping for Kal-El in the SGS3.
Killer Bee said:
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
Something to crash the A5, for sure. The Galaxy S II's Exynos 4210 rivals the A5.
Toss3 said:
The SGX 543mp2 is a lot faster than 50% compared to the Mali400. Really hoping for Kal-El in the SGS3.
W-wa-wait. What? ****in' ****. Where did you see this? Go check the AnandTech review. It shows the high-end mobile GPUs' performance in cold numbers and shows technical specs clearly. The 543MP2 might be 50% faster than the Adreno 220, not the Mali MP-400.
Killer Bee said:
http://pocketnow.com/android/samsung-unveils-new-dual-core-exynos-4212-processor
50% increase in 3D performance. Something to rival the A5 perhaps?
I hope they put this in the Galaxy Note.
I believe this is a simple die shrink from 45 to 32nm. So it still has the Mali-400 GPU, but clocked at 400MHz instead of 267MHz (i.e. a 50% clock increase). The very subtle name change from Exynos 4210 to Exynos 4212 almost confirms this.
Until the Mali-400 can compete with the SGX543MP2 in GLBenchmark at the same resolution, I'm going to wait for a Kal-El phone.
tolis626 said:
Something to crash the A5, for sure. The Galaxy S II's Exynos 4210 rivals the A5.
W-wa-wait. What? ****in' ****. Where did you see this? Go check the AnandTech review. It shows the high-end mobile GPUs' performance in cold numbers and shows technical specs clearly. The 543MP2 might be 50% faster than the Adreno 220, not the Mali MP-400.
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Check the GLBenchmark numbers again; the SGX 543mp2 is over 100% faster than the Mali mp-400.
Toss3 said:
Check the GLBenchmark numbers again; the SGX 543mp2 is over 100% faster than the Mali mp-400.
And it's driving over twice the resolution. The GPU in the iPhone 5 is going to blow away the competition, as it's at a lower resolution compared to the iPad 2. I'm actually kind of jealous. I refuse to use a device with a lower performing GPU than an Apple product! Hope the GPU in the Kal-El will actually be competitive.
YOUCANNOTDENY said:
And it's driving over twice the resolution. The GPU in the iPhone 5 is going to blow away the competition, as it's at a lower resolution compared to the iPad 2. I'm actually kind of jealous. I refuse to use a device with a lower performing GPU than an Apple product! Hope the GPU in the Kal-El will actually be competitive.
No it is not; both were running the benchmarks at 1280x720. Kal-El is a lot faster than both the Mali MP-400 and the SGX 543MP2 while having a lot more features, not to mention five cores.
"GLBenchmark 2.1 now includes the ability to render the test offscreen at a resolution of 1280 x 720. This is not as desirable as being able to set custom resolutions since it's a bit too high for smartphones but it's better than nothing." Anandtech
EDIT: Apple might decide to cut back the A5 running inside the iPhone 5 to just one 543, or a lower-clocked version, as having two doesn't really make any sense on a mobile phone, at least when battery life is taken into consideration.
tjtj4444 said:
I believe this is a simple die shrink from 45 to 32nm. So it still has the Mali-400 GPU but clocked at 400MHz instead of 267MHz (i e 50% clock increase). The very subtle name change from Exynos 4210 to Exynos 4212 almost confirms this.
This sounds plausible.
YOUCANNOTDENY said:
And it's driving over twice the resolution. The GPU in the iPhone 5 is going to blow away the competition, as it's at a lower resolution compared to the iPad 2. I'm actually kind of jealous. I refuse to use a device with a lower performing GPU than an Apple product! Hope the GPU in the Kal-El will actually be competitive.
There is a developer tablet called ODROID-A that uses the Exynos 4210 SoC and it benches pretty well with an even higher resolution than the iPad 2 (1366x768 vs 1024x768).
Comparison for reference.
With a 50% increase, the Mali-400 (assuming they keep this GPU) will be comparable to the SGX-543MP2.
Toss3 said:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Check the GLBenchmark numbers again; the SGX 543mp2 is over 100% faster than the Mali mp-400.
Well, for all I know, the CPU in the iPad 2 is larger (the chip itself is larger), which instantly translates into more transistors being placed in the same space. I can only suspect the same goes for the GPU. But even if it does not, it's most likely that the version in the iPhone 5/4S/whatever won't be the same. Will it be underclocked? Smaller? I don't know.
By the way, I personally believe manufacturers **** all over their tablets when they put phone SoCs in them. There should be a different, more powerful (albeit more power-hungry) variant of each SoC for tablets. But that's just me.
As for the iPhone part, don't mistake me for a hater (I hate it btw, but I won't flame it or anything; if it's better than what I have, I'll just admit it; it's not the hardware I hate). In fact I wish it were THAT powerful, so that the competition drives performance up for everyone. And that, for us, is a win.
It seems that every smartphone that comes to the USA gets some sort of Snapdragon processor by Qualcomm, and people do nothing but complain. So how does this Snapdragon S4 processor compare to every other dual-core processor out there, and even the Tegra 3? I looked up some benchmarks and both seem to have their advantages and disadvantages. But what I really want to know is which one is better for real-world performance, such as battery life, transitional effects, and launching apps. A couple of people said Sense 4 is very smooth and "has LITTLE to no lag"? How does this processor display web pages in Chrome?
Read the thread "Those of you who are waiting to compare the GSIII to the HTC One X" in this forum. It only has about 6 pages but has a ton of information. Short answer is that the Qualcomm chip kicks serious ass.
Sent from my Desire HD using XDA
shaboobla said:
Short answer is that the Qualcomm chip kicks serious ass.
Sent from my Desire HD using XDA
+1
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
MattMJB0188 said:
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
Yes and no. The Tegra 3 does have a better GPU, so in theory, better games. However, game makers cater to the masses. Most Android devices in active use are mid-range, run Android 2.2 or 2.3, have a resolution of 480x800, and use last year's (or older) processors. Although most games will be made to work on the T3 and S4, it will be a matter of compatibility issues, not optimization. Nvidia will have a couple of "T3 only" games, but even those will be made to work on other phones. Now that ICS is cleaning up some of the splintering of apps, we'll see some better options on both fronts.
In short, yes, the T3 is the better gaming chip, but for battery life, available games, and current bugs I would suggest the S4. I may change my mind when the refreshes come out in Q3-Q4; we'll see.
MattMJB0188 said:
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
Correct. However, most games are not optimized to utilize the Tegra to its fullest potential. That should change by the end of the year. The other point is that the S4 is just as good as the Tegra in terms of gaming performance. IMO, you should decide between these two processors by looking at the main area where the S4 truly has the advantage thus far, and that is battery life. So far, the battery life advantage goes to the S4. Just read the battery life threads in this forum and for the international X. It took a few updates for the Transformer Prime to start having pretty good battery life. The One X will get better in that department with a couple more updates for battery optimization. The S4 starts with great battery life and will get even better in that department.
Sent from my HTC Vivid using XDA app
I say the Snapdragon S4 is the better chip right now. The Tegra 3 GPU is great, and with the Tegra Zone games it really looks great. But the 4-core CPU is really for heavy multitasking, so you can divide the work between all four cores. They are A9 cores versus the custom Qualcomm core, which is close to an A15. It means that for single-threaded and multi-threaded tasks the Snapdragon will whoop Tegra 3's ass: opening an app, scrolling through that app, etc. Also, browser performance is slightly better on the Qualcomm chip. Basically the Tegra 3 can do lots of things at the same time with decent speed, versus the S4 chip, which can do one or a few things at lightning speed.
The S4 is almost 2x faster than any other dual core out there. AnandTech did a few nice articles on the S4, including benchmarks vs the Tegra 3.
In real use, the S4 should be much better, because not all apps are multithreaded for 4 cores. The S4 completely kicks the Tegra 3's ass in single-threaded benchmarks. I also expect the S4 to be better at power management, because it is made on a 28nm node instead of 40nm, so it's more compact and efficient.
About 23 I'd say
Sent from my SGH-I997 using xda premium
Here is a comparison benchmark by someone from Reddit.
Benchmark S4 Krait Tegra 3
Quadrant 5016 4906
Linpack Single 103.11 48.54
Linpack Multi 212.96 150.54
Nenamark 2 59.7fps 47.6fps
Nenamark 1 59.9fps 59.5fps
Vellamo 2276 1617
SunSpider 1540.0ms 1772.5ms
Sadly, can't do much for the formatting. Enjoy.
The difference in DMIPS is where the S4 really whomps on the T3. All the T3 has going for it at the moment is its GPU. If you don't care about some additional gaming prowess, the S4 is the way to go.
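If you'd rather see the gaps as ratios than raw scores, here is a small sketch that just re-expresses the table above (remember SunSpider is lower-is-better, the rest are higher-is-better):
Code:
# Ratios computed from the benchmark numbers posted above.
higher_is_better = {
    "Quadrant": (5016, 4906),
    "Linpack Single": (103.11, 48.54),
    "Linpack Multi": (212.96, 150.54),
    "Nenamark 2": (59.7, 47.6),
    "Nenamark 1": (59.9, 59.5),
    "Vellamo": (2276, 1617),
}
lower_is_better = {"SunSpider": (1540.0, 1772.5)}

for name, (s4, t3) in higher_is_better.items():
    print(f"{name}: S4 is {s4 / t3:.2f}x the Tegra 3 score")
for name, (s4, t3) in lower_is_better.items():
    print(f"{name}: S4 finishes in {s4 / t3:.2f}x the Tegra 3 time (lower is better)")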
tehdef said:
Here is a comparison benchmark by someone from Reddit.
Benchmark S4 Krait Tegra 3
Quadrant 5016 4906
Linpack Single 103.11 48.54
Linpack Multi 212.96 150.54
Nenamark 2 59.7fps 47.6fps
Nenamark 1 59.9fps 59.5fps
Vellamo 2276 1617
SunSpider 1540.0ms 1772.5ms
Sadly, can't do much for the formatting. Enjoy.
The difference in DMIP's is where the S4 really whomps on the T3. All the T3 has going for it at the moment is it's GPU. If you don't care about some additional gaming prowess, the S4 is the way to go.
Just to add to that, and to be fair, the S4 is at around 7000 in the AnTuTu benchmark while the Tegra 3 is at around 10000. I still prefer the S4.
Eh...
It wins in one benchmark specifically designed to take advantage of more than 2 cores. So if you want to play TegraZone games and have some basic lag, the T3 is for you. If you want a near-flawless phone experience, with decreased graphical performance in some wannabe console games, then the S4 is the way to go.
Actually, you won't really notice the lack of graphics performance on the Snapdragon S4. It's about 10% slower in most benchmarks but outperforms the Tegra 3 in a few as well. However, I have a Sensation XL with the Adreno 205, which is only a quarter as fast as the Adreno 225, and all games including Dead Space, Frontline and Blood Glory run smoothly on it. To say the Snapdragon S4 is inferior because of the slower Adreno 225 is really nit-picking to me. For me, the bigger reason to choose one graphics chip over another is Flash performance, and this is where the Exynos' Mali 400 kicks the Adreno 225 in the balls. It handles 1080p YouTube videos in the browser without a hiccup while the 225 chokes even on 720p content.
Let me answer this. How good is it? More than good enough. Almost all apps and games are tailored to weaker phones, so the T3 and S4 are both more than good enough.
And my two cents, the S4 beats tegra 3
MattMJB0188 said:
Seems with every smartphone that comes to the USA it gets some sort of Snapdragon Processor by Qualcomm and people do nothing but complain. So how does this Snapdragon S4 processor compare to every other dual-core processor out there and even the Tegra 3? Looked up some benchmarks and both seem to have their advantages and disadvantages. But what I really want to know is which one is better for real world performance, such as battery life, transitional effects, and launching apps. Couple people said Sense 4 is very smooth and "has LITTLE to no lag"? How does this processor display web pages in Chrome?
Let me start by saying I'm not a pro when it comes to electronics, but I do have an understanding of the subject.
The thing to realize about these processors, and most other processors available today, is that the S4 is based on the Cortex A15, while the Tegra 3, along with the new Samsung, is based on the A9. The A15, at the same clock and die size, is 40% faster than the A9.
S4 = dual core Cortex A15 @ 1.5GHz - 28nm
Tegra 3 = quad core Cortex A9 @ 1.5GHz - 40nm
Exynos 4 (Samsung) = quad core Cortex A9 @ 1.5GHz - 32nm
So far, in theory, the S4 is 40% faster per core, but has two fewer cores. Individual apps will run faster unless they utilize all four cores on the Tegra 3. Because the S4 has a smaller die size, it will consume less energy per core.
The actual technology behind these chips that the manufacturers come up with will also affect the performance output, but the general idea is there. Hope that helps to understand a little better how the two chips will differ in performance.
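To put rough numbers on that per-core trade-off, a toy calculation using the post's own 40%-per-core assumption (purely illustrative; real apps rarely scale this cleanly across cores):
Code:
# Illustrative throughput in "A9-core equivalents", per the 40%-per-core claim above.
a9_core = 1.0
a15_class_core = 1.4                  # assumed 40% faster per core at the same clock

s4_single = a15_class_core            # one busy core
tegra3_single = a9_core
s4_total = 2 * a15_class_core         # all cores busy
tegra3_total = 4 * a9_core

print(s4_single / tegra3_single)      # 1.4 -> S4 ahead for single-threaded apps
print(s4_total / tegra3_total)        # 0.7 -> Tegra 3 ahead only if all 4 cores are used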
Sent from my shiny One XL
The S4 compared to the Tegra 3 says it all: a dual-core that beats a quad-core in almost everything.
Intel released the first native dual core processor in 2006 and shortly thereafter released a quad core which was basically two dual cores fused together (this is what current ARM quads are like).
That was 6 years ago and these days pretty much all new desktop computers come with quad cores while laptops mostly stick with dual. Laptops make up the biggest share of PC sales so for your everyday PC usage, you'll be more than comfortable with a dual core.
You really can't assume mobile SoCs will follow the same path, but it's definitely something to consider. I think dual core A15-based SoCs will still rule the day this year and next at the very least.
I was really on the fence about the X or the XL. But the S4 got me. Not having 32GB is already bugging me. But the efficiency (and my grandfathered unlimited data paired with Google Music) is definitely worth the sacrifice. Very happy so far! Streaming Slacker, while connected to my A2DP stereo, running GPS was great. I'm not a huge gamer though. I miss Super Mario Bros being the hottest thing!
krepler said:
Let me start by saying I'm not a pro when it comes to electronics but I do have an understanding on the subject.
The thing to realize about these processors, and most other processors available today, is that the s4 is based on the cortex a15 while the tegra 3 along with the new Samsung are based on the a9. The a15, at the same Hz and die size is 40% faster than the a9.
S4 = dual core Cortex A15 @ 1.5GHz - 28NM
Tegra3 = quad core Cortex A9 @ 1.5GHz - 40NM
Exynos 4(Samsung) = quad core Cortex A9 @ 1.5GHz - 32NM
S4 so far, in theory, is 40% faster per core, but having two less. Individual apps will run faster unless they utilize all four cores on the tegra3. Because the s4 has a smaller die size, it will consume less energy per core.
The actual technology behind these chips that the manufacturers come up with will also affect the performance output, but the general idea is there. Hope that helps to understand a little better how the two chips will differ in performance.
Sent from my shiny One XL
Correct me if I'm wrong, but all 3 are A9-based, including the S4. The first A15 will be the Exynos 5250, a dual core.
Tankmetal said:
correct me if im wrong but all 3 are A9 based including the S4. the first A15 will be the Exynos 5250, a dual core.
This is inaccurate.
The Exynos 4 and the Tegra 3 are based on the ARM A9 reference design.
The Qualcomm Snapdragon S4 is "roughly equivalent" to the A15, but not based on the A15. The same was true for Qualcomm's old S3 (which was equivalent to something between the A8 and A9 designs).
One thing that most people don't realize is that Qualcomm is one of the very few companies that designs its own processors based on the ARM instruction set, and while the S4 is similar to the A15 in terms of architecture, it's actually arguably better than the ARM reference design (e.g. asynchronous clocking of each core, which is a better design than the big.LITTLE or +1 approach).
Hi guys, I'm currently torn between a few phones,
can you help me?
My budget is up to 300 euros (MAX). I can purchase the phone from either the Netherlands or Germany; does anyone know any good shops there, btw?
Anyway,
My question is: which is better, the LG Optimus 2X (299 euro), Xperia Sola (279 euro) or Xperia Arc S (279 euro)?
I'm pretty much looking for the best gaming performance. The LG has the Tegra 2, the Sola has a single-core Mali-400 MP1, and the Arc S has an Adreno 205.
There is no comparison between these GPUs; I've searched for hours, and there's also no comparison of the single-core Mali-400 to the quad-core variant anywhere!
Also, there's no definitive comparison between a dual-core 1GHz Cortex A9 and a single-core 1.4GHz Snapdragon Scorpion CPU!
This is what I've got so far, covering only the stuff I care about in a phone (for example, I don't care about the camera or whether it's TFT or AMOLED).
1. LG Optimus 2X
-Dual core 1.0GHz and Tegra 2, seems to be the best. (Good)
-4.0" Screen (Good)
-Battery performance seems to be in the gutter ~1500mAh. (Bad)
-More Expensive
2. Xperia Sola
-Dual core and Mali-400MP1 (Good or bad?)
-Battery built into phone, can't replace it or get a replacement battery, 1320mAh. (Bad)
-3.7" Screen (Average)
-Coolness factor - Hover touch and cool Tags, NFC Capable
3. Arc S
-Single core 1.4GHz (Good or bad?)
-Adreno 205 (Good or bad?)
-4.2" Screen (Good)
-Average Battery life
Thanks in advance!!!
I would go with the LG Optimus 2X. But, I'm no expert. I would be interested in what others think.
I would completely eliminate the Xperia Arc S because it is single-core and has an outdated Adreno 205 GPU. That means you will get lag and it won't be very optimal for gaming.
That said, the top GPU among your choices is most definitely the Mali 400 in the Xperia Sola. It is better than the Tegra 2 in almost all respects. Tegra 2 also has one major drawback: it does not have NEON support. NEON is basically a set of media-optimization instructions in the CPU, but Tegra 2's cores do not have it, so some video formats cannot play properly or at all.
As for comparisons, here is one: http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/16
The Tegra 2 is labeled as the ULP GeForce, and you will see that the Mali 400 blows it out of the water.
Also, on a side note, there are many hardware problems with the LG Optimus 2X, like random reboots or bad signal. So overall, in terms of build quality, CPU and graphics, go for the Xperia Sola.
vx117 said:
I would completely eliminate the xperia arc S b/c it is a single core and that it has an outdated adreno 205 gpu. That means you will get lag and not be very optimal for gaming.
That said, the top gpu in your choices is most definitely the Mali 400 in the Xperia Sola. It is better than the Tegra 2 in almost all aspects. Tegra 2 also has one major drawback and that it does not have NEON support. NEON is basically a video optimization software built into the gpu, but Tegra does not have it, so some video formats cannot play properly or not at all.
As for comparisons, here is one: http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/16
The Tegra 2 is labeled as the ULP GeForce and you will see that the Mali 400 blows it out of the water.
Also on a side note, there are many hardware problems with the LG Optimus 2X, like random reboots or bad signal, so overall, in terms of build quality, cpu and graphics, go for a xperia sola.
I was basing my recommendation on the Mali GPU in the Galaxy S2: http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked.
Also, this article suggests the Tegra 2 is better: http://www.differencebetween.com/difference-between-mali-400mp-gpu-and-vs-tegra-2/.
But, as I said, I know very little about these GPUs and phones.
Python. said:
I was basing my recommendation on the Mali GPU in the Galaxy S2: http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked.
Also, this article suggests the Tegra 2 is better: http://www.differencebetween.com/difference-between-mali-400mp-gpu-and-vs-tegra-2/.
But, as I said, I know very little about these GPUs and phones.
Well, most articles favor the Mali 400 MP. Both are strong GPUs, but the lack of NEON video optimization is a deal breaker.
vx117 said:
Well, most articles favor Mali 400 MP. Both are strong gpu's, but the lack of NEON video optimization is a deal breaker.
The Mali is a good GPU, but you should know that the Sola has a Mali-400 MP1.
That means it's single-core, but the GPU of the S2 is the Mali-400 MP4, the quad-core version :banghead:
Sent from my GT-I9300 using xda premium
Oh yeah, you're right, that is a Mali-400 MP1. I wasn't aware of that. That really sucks then.
If that's the case, then go with the LG Optimus 2X. The Tegra 2 has an 8-core GPU. That's definitely better than a single-core GPU any day.
Sorry for the confusion.
Galaxy s3
Sent from my LT15i using xda premium
emilfadillah said:
Galaxy s3
Sent from my LT15i using xda premium
The Galaxy S3, although a great phone, is 500 euros which is far beyond the OP's budget.
Since you'll be gaming, the most powerful GPU and the longest battery life matter most. The LG wins on both counts. Get the LG, flash a barebones ROM that frees up RAM and increases battery life, and game away.
Sent from my U8150 using XDA
LG released a fix for the restarting bug, which I found on the net.
It plays H.264 MKV files, no? Are the missing codecs that much of a problem?
Thanks for the help guys!!
But just one last thing: 1.4GHz single core vs 1.0GHz dual core = dual core wins?
Mali-400 MP1 vs Adreno 205 = Mali wins?
opala said:
LG released a fix for the restarting bug I found on the net.
It plays H264 MKV files no? are the missing codecs that much of a problem?
Thanks for the help guys!!
But just last thing, 1.4GHZ single core VS 1.0GHZ dual core = dual core winner?
Mali-400MP 1 Core VS Adreno 205 = Mali winner?
Arc S > Sola in all areas.
Mali-400 MP1 < Adreno 205; it's just that the Adreno is aging.
Sent from my U8150 using XDA
Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I had never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the two board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. However, the original poster had already deleted the results, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
AnandTech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU they would have shouted about it at CES. The results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system; however, the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali-T604 in an almost equal second place. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
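For what it's worth, here is the same comparison expressed as ratios, reusing only the cached numbers and the ~10% driver-headroom figure mentioned above (none of this is final-hardware data):
Code:
# GLBenchmark 2.5 Egypt HD offscreen (1080p) figures quoted above.
dalmore_fps = 32.6        # presumed Tegra 4 dev board, early drivers
ipad4_fps = 49.6          # A6X (SGX 554MP4)
driver_headroom = 1.10    # assumed ~10% gain from mature drivers (Nexus 10 vs Arndale)

print(dalmore_fps / ipad4_fps)                     # ~0.66 of the A6X today
print(dalmore_fps * driver_headroom / ipad4_fps)   # ~0.72 even with better drivers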
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. Buildbrain, according to an Nvidia job posting, is "a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy.
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been announced too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and I'm not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any Gpu that could outperform iPad4 before iPad5 comes out? adreno 320, t Mali 604 now tegra 4 aren't near it. Qualcomm won't release anything till q4 I guess, and tegra 4 has released too only thing that is left is I guess is t Mali 658 coming with exynos 5450 (doubtfully when it would release, not sure it will be better )
Looks like apple will hold the crown in future too .
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
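As a rough illustration of that throttling behaviour, here is a toy model; the ~4W budget comes from the article as described above, but the GPU draw and the frequency/power table are invented for the example and are not Samsung's real governor values:
Code:
# Hypothetical illustration of a shared power budget forcing the CPU down.
SOC_BUDGET_W = 4.0                      # per the post: keep everything at ~4W
gpu_load_w = 3.0                        # assumed GPU draw while gaming (illustrative)

# Made-up CPU frequency/power table, roughly increasing with frequency.
cpu_power_w = {1700: 4.0, 1400: 3.0, 1000: 1.7, 800: 1.0, 500: 0.6}  # MHz -> W

available = SOC_BUDGET_W - gpu_load_w
best = max((f for f, w in cpu_power_w.items() if w <= available), default=min(cpu_power_w))
print(best)  # 800 -> the CPU drops from 1700 MHz to 800 MHz under GPU load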
Back to the 5450, which should beat the A6X. Trouble is, it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's SoC is 96mm2 and the A6X is 123mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' its way to GPU domination. As an Android fan it makes me sad :crying:
32 fps is a no-go... let's hope it's not final.
hamdir said:
32fps is no go...lets hope it's not final
It needs to be, but it will be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm taking about phones here ... But the iPhone 5 GPU and adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
Its interesting how even though nvidia chips arent the best we still get the best game graphics because of superior optimization through tegra zone. Not even the a6x is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown, and Tegra effects can also be used on other phones with Chainfire 3D. I use it, and Tegra games work with effects and without lag, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant GPU provider.
In the first Tegra 2, the GPU was a little bit better in benchmarks than the Galaxy S's SGX540, but it lacked NEON support.
In the second one, Tegra 3, the GPU is nearly the same as the old Mali-400 MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 also; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimization porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to work on other devices; equally, Chainfire is now outdated and no longer updated.
Now, about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multi-core with each core having its own sub-cores/shaders. If Tegra were used in a real multi-core configuration it would destroy all.
Finally, this is really funny, all this doom and gloom because of an early, discarded development-board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any kind of final-device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
The T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe the T4 will score only 2x the T3 with 600% more shaders!!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, and so little reasoning...
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing: what he stated isn't true. Samsung has licensed PowerVR; it isn't just stuck with Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone against a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18fps, then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage?, well basically money, for a start iSheep will pay more for their devices, so they afford to design a big SoC and big batteries that may not be profitable to other companies. Tegra 4 is listed as a 80mm2 chip, iPhone 5 is 96mm2 and A6X is 123mm2, Apple can pack more transistors and reap the GPU performance lead, also they chosen graphics supplier Imagination Technologies have excellent products, Power VR Rogue will only increase Apple's GPU lead. They now have their own chip design team, the benefit for them has been their Swift core is almost as powerful as ARM A15, but seems less power hungry, anyway Apple seems to be happy running slower CPUs compared to Android. Until an Android or WP8 or somebody can achieve Apple's margins they will be able to 'buy' their way to GPU domination, as an Android fan it makes me sad:crying:
Well said mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and continue with this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2: Exynos 4 at 1.2GHz and a Mali GPU overclocked to 400MHz.
In my analysis, most modern games like MC4 and NFS:MW aren't running at 60 FPS at all. That's because the GPU always has a 100% workload while the CPU is relaxing, outputting 50-70% of its total workload.
I know some games aren't optimized for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4's at least).
AFAIK, the Galaxy S IV is likely to pack a T604 with some tweaks instead of the mighty T658, which is still slower than the iPad 4's.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage?, well basically money, for a start iSheep will pay more for their devices, so they afford to design a big SoC and big batteries that may not be profitable to other companies. Tegra 4 is listed as a 80mm2 chip, iPhone 5 is 96mm2 and A6X is 123mm2, Apple can pack more transistors and reap the GPU performance lead, also they chosen graphics supplier Imagination Technologies have excellent products, Power VR Rogue will only increase Apple's GPU lead. They now have their own chip design team, the benefit for them has been their Swift core is almost as powerful as ARM A15, but seems less power hungry, anyway Apple seems to be happy running slower CPUs compared to Android. Until an Android or WP8 or somebody can achieve Apple's margins they will be able to 'buy' their way to GPU domination, as an Android fan it makes me sad:crying:
Typical "isheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU, in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250+ ppi, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware, anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple, just that Samsung prefers using ARMs GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone compared a Tegra 4 which will arrive in a phone. If you check OP link the benchmark was posted on the 3rd of January with different results (18fps then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra as Nvidia is pushing developers to make more better games for Android compared to the 'geeks' *cough* who prefers benchmark results, whats the point of having a powerful GPU if the OEM isn't pushing developers to create enhance effect games for there chip.
Hamdir is correct about the GPUs, if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine it only being 2x faster than Tegra 3. Plus its a 28nm (at around 80mm2 just a bit bigger than Tegra 3, smaller than A6 90mm2) along with the dual memory than single on Tegra 2/3.
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA, is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations for a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), OMG Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationship. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle and physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games and benchmarks will scale linearly, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's technical marketing manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times Tegra 3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
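To illustrate the bottleneck point above with a toy example (the stage scalings below are made up for illustration and are not measured Tegra 4 figures):
Code:
# Hypothetical pipeline-stage scalings from one GPU generation to the next.
stage_scaling = {
    "shader ALUs": 6.0,     # e.g. a 6x jump in shader cores
    "TMU throughput": 2.0,  # assumed, for illustration
    "ROP/bandwidth": 3.0,   # assumed, for illustration
}

# In this simplified model a frame is only as fast as its slowest stage,
# so the effective gain is bounded by the least-scaled stage.
effective_speedup = min(stage_scaling.values())
print(effective_speedup)  # 2.0 - far less than the 6x the shader count alone suggests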
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
hamdir said:
tegra optimization porting no longer works using chainfire, this is now a myth
did u manage to try shadowgun thd, zombie driver or horn? the answer is no, games that use t3 sdk for physx and other cpu graphics works can not be forced to work on other devices, equally chainfire is now outdated and no longer updated
Looks like they haven't updated Chainfire 3D for a while; as a result only T3 games don't work, but others do work: Riptide GP, Dead Trigger, etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I think I might have been unfortunate too, but some Gameloft games lagged on the Tegra device that I had, though root solved it to an extent.
I am not saying one thing is superior to another, just sharing my personal experience; I might be wrong, I may not be.
Tbh, I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX 544.
Turbotab said:
Sorry Steve, this is an Android forum, or where you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral regarding Google's and Apple's ecosystems, and I'm not wiping any scratches off my devices.