[Q] Tegra 2 GPU Is Weak?? - Eee Pad Transformer Q&A, Help & Troubleshooting

I've just emailed the folks at Rubicon Development complaining that their Great Little War Game (which is a great game, btw) is laggy on my ASUS Eee Pad Transformer, powered by the Tegra 2 SoC, when you set the detail to high or ultra. Setting it to medium plays the game smoothly. Someone posted a review of the game on the Android Market praising how smoothly it runs even on his Incredible S, which uses the Adreno 205 GPU. Running the game on high/ultra adds an excellent water effect, which causes the game to slow down on the Transformer.
I highlighted this to the Rubicon developers and one of them said that while the Tegra 2 CPU is powerful, the GPU isn't as powerful as the Adreno 205 GPU on the Incredible S. I seriously find that hard to believe, because the benchmark figures here show that Tegra 2 even beats the PowerVR GPU in the Nexus S, which is itself more powerful than the Adreno 205:
http://androidandme.com/2011/03/news/tegra-2-benchmarks-motorola-atrix-4g-vs-lg-optimus-2x/
So I emailed them to see if they can actually optimize it for Tegra 2, because as I understand it, you have to optimize your game for Tegra 2 chipsets to make full use of the GPU, much like games such as RipTide GP, which has excellent graphics; I don't believe the Adreno 205 could render that water effect so brilliantly.
Could someone shed some light on this?
Update:
Here's what the developers of Great Little War Game have to say about Tegra 2 tablets:
We've done just that with v1.0.4 - have a play in the new settings screen.
We've kinda done all we can do now, tbh; of the 4 different GPUs in Android phones, Tegra 2 comes bottom, and it's in most of the devices with the biggest screens.
---
Can you believe that? Sounds to me like they're getting lazy rather than trying to optimize it for Tegra 2, because right now, with the new update, the game has two modes - a fast graphics mode and a best graphics mode. Fast graphics mode appears to set the game resolution to a horrible 640x480 to speed things up, but that makes things very ugly with heavy pixelation on tablets, while best graphics uses the native resolution: everything is sharp, but it slows down on certain larger levels of the game.
I suggested to them to change this to 800x600 when toggling to fast graphics, which I believe would reduce the pixelation while still maintaining decent graphics for the player. What do you guys think?

The Nexus S should be about the same hardware as the Galaxy Tab, and I run RipTide GP at full quality equally smoothly on that thing. Tegra 2 is a kinda all-around lame chipset, but you really notice the single core when a mail comes in: the TF keeps you splashing around while the Galaxy stutters badly for a moment... Also, there's no choice on the chipset at the moment if you want a tablet.
Sent from my Transformer TF101 using Tapatalk

AlexTheStampede said:
Nexus S should be right about the same hardware as the Galaxy Tab, and I do run RipTide GP at full quality equally smooth on that thing. Tegra 2 is a kinda all around lame chipset, but you really notice the single core when a mail comes in, and the TF keeps you splashing around while the Galaxy stutters badly for a moment... Also no choice on the chipset at the moment if you want a tablet.
Sent from my Transformer TF101 using Tapatalk
Click to expand...
Click to collapse
The Nexus S is powered by a PowerVR GPU - it's not the same as the Galaxy Tab at all (I'm referring to the Galaxy Tab 10.1).

And I'm talking about the one they sold one year ago, the 7" Froyo little monster using the same Hummingbird coupled with an SGX 540 that the Nexus S has
Sent from my Transformer TF101 using Tapatalk

AlexTheStampede said:
And I'm talking about the one they sold one year ago, the 7" Froyo little monster using the same Hummingbird coupled with an SGX 540 that the Nexus S has
Sent from my Transformer TF101 using Tapatalk
Click to expand...
Click to collapse
How did you manage to run RipTide GP, which is a Tegra-only game? I suppose you used something like Chainfire 3D? (I haven't tried it, though.)

Exactly, Chainfire 3D and the Tegra plugin. No settings changed; it just works, exactly like the pinball game. I didn't try any other game.

It requires a lot more power to render a game at 1280x720 than 800x480. As for which SoC is more powerful, I think benchmarks prove that Tegra 2 > Snapdragon S2.
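To put rough numbers on that (a back-of-the-envelope sketch; the resolutions are just the panel sizes being discussed in this thread, and real games may render below native):

```python
# Rough pixel-per-frame comparison for the screens discussed in this thread.
resolutions = {
    "Incredible S (800x480)": (800, 480),
    "Tablet at 1280x720": (1280, 720),
    "Transformer native (1280x800)": (1280, 800),
}

baseline = 800 * 480  # phone-class resolution used as the reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame, {pixels / baseline:.2f}x the 800x480 load")
```

At the same frame rate, the tablet GPU has to fill roughly 2.4-2.7x as many pixels per second, so smooth play at the lower resolution says little about raw GPU strength.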

AlexTheStampede said:
Exactly, Chainfire 3D and the Tegra plugin. No settings changed it just works, exactly like the pinball game. I didn't try any other game.
Click to expand...
Click to collapse
I see, cool. But it still doesn't make sense why Rubicon claimed that the Adreno 205 is more powerful than the Tegra 2.

Killer Bee said:
It requires a lot more power to render a game at 1280x720 than 800x480.
Click to expand...
Click to collapse
I totally agree with this part in terms of resolution, but to say the Adreno 205 GPU is more powerful than the Tegra 2 GPU is wrong in every sense. Just because a game plays well at 800x480 doesn't make that GPU more powerful than the Tegra 2 GPU, which runs games at 1280x800 on tablets. If the Incredible S were running the same resolution, I'm pretty sure the game would be just as sluggish, or worse.

mlbl said:
Could someone shed some light on this?
Click to expand...
Click to collapse
Assuming the game runs at the Transformer's native resolution, the ULP GeForce would be required to push nearly three times as many pixels as the Adreno 205 in the Incredible S. Consider then that the ULP GeForce is maybe 50% faster in best-case scenarios.
As for Tegra optimizations, all that really is is a proprietary texture format that only Tegra chips can use. It provides a few benefits... that can already be implemented in OpenGL ES 2.x anyway.
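For a rough sense of what a compressed texture format buys (a sketch only; the 4 bits per pixel figure is the standard DXT1/S3TC rate, a commonly cited example of the kind of format Tegra exposes, and the texture size is made up for illustration):

```python
# Approximate footprint of one 1024x1024 texture, uncompressed vs DXT1-compressed.
width, height = 1024, 1024          # example texture size (illustrative)
rgba8888_bits_per_pixel = 32        # uncompressed 32-bit colour
dxt1_bits_per_pixel = 4             # DXT1/S3TC block-compression rate

rgba_mib = width * height * rgba8888_bits_per_pixel / 8 / 2**20
dxt1_mib = width * height * dxt1_bits_per_pixel / 8 / 2**20
print(f"RGBA8888: {rgba_mib:.1f} MiB, DXT1: {dxt1_mib:.1f} MiB "
      f"({rgba_mib / dxt1_mib:.0f}x smaller)")
```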
Edit:
Man it obviously took me a long time to type that.. got ninja'd three times over..
mlbl said:
I see. cool. but it still doesnt make sense why Rubicon mentioned that the Adreno 205 is more powerful than the Tegra 2.
Click to expand...
Click to collapse
Just because they are developers doesn't mean they aren't ignorant of certain hardware.

Sjael said:
Assuming the game runs at the Transformer's native resolution, the ULP GeForce would be required to push nearly three times as many pixels as the Adreno 205 in the Incredible S. Consider then that the ULP GeForce is maybe 50% faster in best-case scenarios.
As for Tegra optimizations, all that really is is a proprietary texture format that only Tegra chips can use. It provides a few benefits.. that can already be implemented in OpenGL ES 2.X anyway.
Edit:
Man it obviously took me a long time to type that.. got ninja'd three times over..
Just because they are developers, doesn't mean they aren't ignorant of certain hardware.
Click to expand...
Click to collapse
Your explanation makes a lot of sense, and yep, just because they're developers doesn't mean they're always right. It's possible they've made a mistake there and just made some assumptions of their own instead of basing it on facts.

To begin with, "Tegra Zone" means quality about as much as, say, the Amazon Appstore does. Example? Galaxy on Fire 2 runs smooth-ish (I'd say 20 fps or so) on an iPhone 3G. Not a 3GS. The mighty 600 MHz (is it still underclocked to 400?) money printer Apple sold in 2008 with 128 MB of RAM. But hey, I'm sure the resolution of the screen is low enough to counterbalance the amazing graphics possible only thanks to Nvidia!
Anyway, the Galaxy Tab runs at 1024x600, which is closer to the 1280x720 of the TF (the lower bar is 80 pixels, and obviously no game uses those).


Sjael said:
Just because they are developers, doesn't mean they aren't ignorant of certain hardware.
Click to expand...
Click to collapse
I gotta bump that one up. I'm in a development shop and the majority of the developers wouldn't know what CPU they had if there wasn't a sticker on their laptop.
Understanding software and hardware is a completely different skill set.

Well, it could just be GPU optimization. Adreno uses a VLIW5 design, similar to AMD GPUs, with an extra scalar unit, which apparently is hard to develop for but can yield great results on shader-heavy programs that are designed for it (or just happen to favor it). The Adreno 205 does outpace the Tegra 2's ULP GeForce in vertex-shader-heavy benchmarks... and most implementations of Snapdragon only use one of its two memory channels! (Because the first is PoP, but the second must be off-package.)
Sent from my Transformer TF101 using Tapatalk
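To put the memory-channel point in perspective, here is a rough sketch; the 32-bit channel width and LPDDR2-600 data rate are assumptions for illustration, not the exact clocks of any particular Snapdragon:

```python
# Theoretical peak bandwidth for one vs two 32-bit LPDDR2 channels.
bus_width_bits = 32      # per channel (assumption)
data_rate_mtps = 600     # mega-transfers per second (assumed LPDDR2-600)

per_channel_gbs = bus_width_bits / 8 * data_rate_mtps / 1000  # GB/s
for channels in (1, 2):
    print(f"{channels} channel(s): {channels * per_channel_gbs:.1f} GB/s peak")
```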

while the Tegra 2 CPU is powerful, the GPU isn't as powerful as the Adreno 205 GPU on the Incredible S.
Click to expand...
Click to collapse
That's the most retarded thing I've ever heard. Rubicon mainly develops for the iOS platform; I'd reckon they are just lazy at optimizing for several different SOCs.

grainysand said:
That's the most retarded thing I've ever heard. Rubicon mainly develops for the iOS platform; I'd reckon they are just lazy at optimizing for several different SOCs.
Click to expand...
Click to collapse
There is another point I didn't address. The Adreno 205 has unified shaders. So does the SGX540. Tegra 2, however, has 4 pixel and 4 vertex shaders. So there are definitely situations, even entire genres, that could be slower on Tegra 2 than a benchmark would suggest.
Sent from my Transformer TF101 using Tapatalk
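A toy model of why a fixed pixel/vertex split can lose to unified shaders on a lopsided frame (the ALU counts and the 75/25 workload mix are made up purely to show the effect):

```python
# Same total shader ALUs: 8 unified vs a fixed 4-pixel + 4-vertex split,
# on a frame whose shader work is 75% pixel and 25% vertex (assumed mix).
pixel_work, vertex_work = 0.75, 0.25   # fractions of one frame's shader work

# Unified pool of 8 ALUs: all units chew through whatever work is pending.
unified_time = (pixel_work + vertex_work) / 8

# Fixed split: each 4-ALU pool only handles its own kind of work, so the frame
# finishes when the busier pool does, while the other pool sits idle.
fixed_time = max(pixel_work / 4, vertex_work / 4)

print(f"unified frame time    : {unified_time:.4f} (arbitrary units)")
print(f"fixed-split frame time: {fixed_time:.4f}")
print(f"unified advantage     : {fixed_time / unified_time:.2f}x on this mix")
```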

I bought Great Little War Game 2 weeks ago, but after a few minutes of playing I canceled the purchase and got a refund. Why? Because of the poor optimization for the ASUS.
I also have this game on the iPad 2. It runs smoothly and much better there.

Orion66 said:
I had bought Great Little War Game 2 weeks ago. But after few minutes of playing I canceled the purchase and refunded. Why? Because of poor optimization for Asus.
I have this game also on Ipad2. It runs smoothly and much better.
Click to expand...
Click to collapse
Generally it's not just the ASUS; all the Tegra 2 tablets have poor performance in this game.

So is the issue that the Tegra 2 is in some respects slower, or that developers need to write with the way the Tegra 2 GPU works in mind?
Sent from my tf101 using xda premium 1.59Ghz

Related

Electopia Benchmark

For giggles, can one of you who's on stock run the Electopia benchmark? There have been some interesting results, and it would be cool to see how another dual-core phone with a different CPU/GPU performs. The Sensation folks are obviously not amused.
Sensation
800x480
Average FPS: 23.65
Time: 60
Number of Frames: 1419
Trianglecount: 48976
Peak Trianglecount: 68154
960x540
Average FPS: 19.90
Time: 60.01
Number of Frames: 1194
Trianglecount: 49415
Peak Trianglecount: 67076
SGS2
Average FPS: 37.58
Time: 60.01
Number of frames: 2255
Trianglecount: 48633
Peak trianglecount: 68860
DHD
Average FPS: 23.36
Time: 60.03
Number of frames: 1402
Trianglecount: 48835
Peak trianglecount: 67628
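As a quick sanity check, those averages line up with frames divided by elapsed time (simple arithmetic on the figures quoted above):

```python
# Average FPS should simply be total frames / elapsed seconds.
results = {
    "Sensation 800x480": (1419, 60.00),
    "Sensation 960x540": (1194, 60.01),
    "SGS2":              (2255, 60.01),
    "DHD":               (1402, 60.03),
}
for device, (frames, seconds) in results.items():
    print(f"{device}: {frames / seconds:.2f} fps")
```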
Even the Desire HD blew away my G2x on this benchmark but it could be the custom ROM... I'll switch back to AOSP and try it again.
16 FPS.
Can't be right; my Thunderbolt smoked my G2x.
26 FPS Thunderbolt vs 16 FPS G2x.
Something is very wrong with those numbers if this is supposed to be measuring OpenGL 2.0.
I'm on stock; it had a really hard time responding to touch input, and with the sound off here are the scores:
Average FPS - 15.56
Time - 60.04
Number of Frames - 934
Trianglecount - 48928
Peak Trianglecount - 68838
This was a super buggy program on the G2x. I think it is definitely not optimized for dual core or at least the Tegra 2 architecture.
Sent from my T-Mobile G2x using XDA App
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
BarryH_GEG said:
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
Click to expand...
Click to collapse
And there is no way the G2x could be lower than a single-core Adreno 205 Thunderbolt.
15.57 FPS for me running stock/not rooted. Like previously mentioned, it was very unresponsive to touch.
Badly designed benchmark programs are bad.
diablos991 said:
Badly designed benchmark programs are bad.
Click to expand...
Click to collapse
The sad part is that this isn't just a benchmark - it's a game first and foremost.
And yeah, I can't get past 16 FPS at stock speed OR at 1.5 GHz, so I think there are definitely coding issues, as Nenamark using Trinity on the Bionic scores 72 FPS. I think my Inspire (Adreno 205) got about 35?
+1
Let's all buy phones with top benchmarks!!!!!!
Better yet, let's all get iPhones.....
Fu*k a benchmark
Sent from my LG-P999 using XDA Premium App
But if you really stop and think about it, if each of the different CPU/GPUs behaves differently running the same software because of proprietary hardware performance tweaks, we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations - even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app, or release different apps for different CPU/GPUs. It's way too early to tell, as dual-core and 2.3ish aren't that common now, but it should be interesting watching software performance and development play out in the future.
BarryH_GEG said:
But if you really stop and think about it, if each of the different CPU/GPU's behave differently running the same software because of proprietary hardware performance tweaks we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations - even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app or release different apps for different CPU/GPUs. It's way too early to tell as dual-core and 2.3ish isn't that common now but it should be interesting watching software performance and development play out in the future.
Click to expand...
Click to collapse
It's piss-poor coding on the app developer's part - plain and simple. While there are Tegra 2-specific instructions that an app developer can use in their application, there are not any mobile OpenGL 2.0 instructions the Tegra 2 doesn't support, as far as I am aware.
If you want a good challenge for the chip, download an3dbench XL from the Market. I just scored 32640, and that's with a bunch of background apps.
Isn't this a Windows Mobile port (I had it on my HD2 running WM6.5)? So how does it provide an accurate representation of gaming on an Android device? Since it is the only bench my G2x has scored poorly on and (more importantly) real-world gaming is spectacular on this thing, I'm going to say it doesn't. I wouldn't put a whole lot of stock in this one...
Yeah, agreed. I just ran it on the Nexus/CM7 AOSP hybrid and it still was only 16.06, while I got almost 40,000 on an3dbenchXL, which put me like 30-something out of 7000ish results.
This application was influenced by Qualcomm specifically to run poorly on Tegra 2 devices. They messed with the shaders so everything is rendered at a weird angle. If you change the code to run with a normal approach, you see the same results on Qualcomm chips but also 3-5x the performance on NVIDIA chips.
Why would you say this benchmark was influenced? If you have the sources, please share so we can all look. And how can you say BenchXL is a good benchmark? I have run the BenchXL benchmark and seen mismatched results on many forums... it is very unreliable, not a good benchmark. At least Electopia gives consistent, reliable results... I would go with Electopia as a GPU benchmark.
I have an Xperia Play myself, which performs superbly for gaming - awesome graphics, I love the games on it, awesome device. My wife has a G2x, which is equally good for gaming (though she just uses it for texting - LOL)...
I think for gaming both the Xperia Play and the G2x are good...
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
What's more, I ran it myself with the latest firmware at stock frequencies (SGS2, btw) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite a funny difference compared to any other device, I might say.
It's not biased towards any manufacturer, it is biased against NVIDIA's ULP GeForce GPUs in Tegra 2 SOCs.
Changes to the code cause increases in performance on Tegra 2 devices, while results on other platforms do not change.
In general, there is never a single, all-encompassing GPU benchmark to accurately compare devices. It all depends on the code, and how it interacts with the specific hardware of the device.
Source: Anandtech Samsung Galaxy S2 review (I can't post links )
http://images.anandtech.com/graphs/graph4177/35412.png
That AnandTech review is badly outdated, like I said; the SGS2 gets, for example, 16 fps there back in February. I myself get 58 fps today.
And I don't think it's biased against Tegra. Tegra performs pretty much where it should considering its age, and corresponds to its specs.
And just to dismiss your point that Tegra gets a different codepath, I ran the Electopia bench again via Chainfire3D using the NVIDIA GL wrapper plugin emulating said device, and I'm still getting the same amount of FPS.
If what you're saying is that it's not utilizing Tegra's full potential through proprietary Nvidia OpenGL extensions, we might as well pack our bags and leave, because that logic would apply to pretty much every graphics core, since the benchmark isn't optimized for any of them. What we see here in these benchmarks is a plain, simple ES 2.0 codepath which all devices should support, so we can do an oranges-to-oranges comparison. It's also one of the heaviest fragment-shader-dependent benchmarks out there at the moment, and less geometry and texture bound, and that's why it runs so badly on pretty much every chip, since they don't get this type of workload in other benchmarks. This is also why the Mali gets such a high FPS, as that's where the quad-GPU setup in the Exynos can shine.
AndreiLux said:
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
More so I ran it myself with the latest firmware at stock frequencies (SGS2 btw ) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite funny difference to any other device I might say.
Click to expand...
Click to collapse
It's clear the Mali-400 in the SGS2 is the most powerful GPU right now. There is a 60 fps limit on the Galaxy S2, so you'll need a demanding benchmark. You can see that in Nenamark2 too: SGS2 = 47 fps, G2X = 28 fps, SGS = 24 fps.

Can Droid pad keep up with IPAD 3?

The new iPad 3 will hit Verizon and AT&T on March 16, equipped with 4G+WiFi or WiFi only. The fact is that the Apple A5X chip (quad core) is 4 times faster than the Droid Tegra 3, with a higher resolution (2048x1536 on the iPad vs 1920x1200 on the ASUS Transformer 700 series, which is supposed to release around June 2012). Together with the high-end hardware, Apple also boosts their software with an amazing video editor, Autodesk (freehand graphic design), camera-editing software and Namco games designed just for the iPad... I'm a Droid fan, but I just have to admit Apple is one warrior that we may NOT be able to beat... I'm thinking of picking one up, the one without 4G (an extra $30 a month), and using my Charge for 4G tethering whenever I'm on the road.
You fell for Apple's use of fancy words. The A5X chip is a dual-core CPU with a quad-core GPU. Yes, you get no more CPU performance, just more graphics performance, which almost nothing takes full advantage of now with the A5. Besides the updated screen and new software, there is really nothing spectacular about the new iPad. Just like the iPhone 4S update, it's a few minor spec bumps that Apple marketing (they are a marketing company that just happens to also sell stuff) made sound like the latest and greatest technology. Wait until other manufacturers get a chance to actually start using the new and upcoming quad-core SoCs.
It's also not a "Droid Tegra 3" or a "Droid pad." It is a Tegra 3 processor, which can be used by anyone on any platform, and these are Android tablets. The Tegra 3 is actually a quad-core CPU with a 12-core GPU (technically a 5-core CPU), and who knows how it will benchmark against Apple's chip, since Apple didn't release any firm details about how the new A5X benchmarked higher than the Tegra 3, or how much higher, or whether it was a production Tegra 3 chip.
Personally, I would wait to see what Samsung comes up with this summer/fall to counter it, and even if they don't have anything, I would still take a Galaxy Tab 7.7 over an iPad any day of the week.
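For context on how much extra work that retina-class panel actually is, here is the simple pixel arithmetic on the two resolutions quoted in this thread:

```python
# Pixels per frame for the two screens being compared.
ipad3_pixels = 2048 * 1536   # new iPad
tf700_pixels = 1920 * 1200   # ASUS Transformer 700 series
print(f"iPad 3: {ipad3_pixels:,} pixels")
print(f"TF700 : {tf700_pixels:,} pixels")
print(f"ratio : {ipad3_pixels / tf700_pixels:.2f}x more pixels on the iPad 3")
```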
imnuts said:
You fell for Apple's use of fancy words. The A5X chip is a dual core cpu with a quad core gpu. Yes, you get no more CPU performance, just more graphics performance, which almost nothing takes advantage of fully now with the A5. Besides the updated screen and new software, there is really nothing spectacular about the new iPad. Just like the iPhone 4S update, it's a few minor spec bumps that Apple marketing (they are a marketing company that just happens to also sell stuff) made sound like the latest and greatest technology. Wait until other manufacturers get a chance to actually start using the new and upcoming quad-core SoCs
It's also not a Droid Tegra 3 or Droid pad. It is a Tegra 3 processor, which can be used by anyone on any platform, and Android Tablets. The Tegra 3 is actually a quad core cpu with 12 core gpu (technically 5 core cpu), and who knows how it will benchmark against Apple's chip, since Apple didn't release any firm details about how the new A5X benchmarked higher than the Tegra 3, or how much higher, or if it was a production Tegra 3 chip.
Personally, I would wait to see what Samsung comes up with this summer/fall to counter it, and even if they don't have anything, I would still take a GalaxyTab 7.7 over an iPad any day of the week.
Click to expand...
Click to collapse
Yea ***** what he said!....no offense buhohitr
Sent from my TS...Eclipse...TS...Eclipsed Blazing Charge!
sasquatch080 said:
Yea ***** what he said!....no offense buhohitr
Sent from my TS...Eclipse...TS...Eclipsed Blazing Charge!
Click to expand...
Click to collapse
None taken; I'm here for a nice hobby and for fun... that's all.
Question: are you a Droid fan or an Android fan?
Anyway, not much more to say than what imnuts said. The thing is, the iPad 2's A5 chip proved to beat the Tegra 3 in a series of benchmarks, though the T3 was running on Honeycomb (and they were offscreen benchmarks). I wonder if the results would have been different if the T3 had been running ICS.
What I'm very curious to see is Samsung's offering (higher resolution screen and quad-core Exynos). Maybe even with bone-stock ICS. That would be my dream tablet.
Sent from my SCH-I510 using xda premium
I have friends who are diehard Apple fanboys and even they say they're not getting it because it's "pretty much the same as the 2"... A lot of people are starting to catch on to what imnuts said about marketing gimmicks and releasing old tech as "revolutionary".
That said, I also agree that I'll be waiting for the next quad-core beast from ASUS ($200 Google Nexus tablet, anyone?) or maybe Samsung, but I don't want their ugly-ass bloated TouchWiz over ICS.
P.S. The only good thing I could see being brought about by this 3rd iPad is that hopefully display manufacturers will see that it's possible to make higher-than-1080p displays for a reasonable price... Also, LOL at them for including what is basically a laptop-capacity battery in it, and I doubt it would really get 10 hours on 4G.
blazing through on my Nexus Prime via XDA app
It still isn't worth getting one if the OS is just a magnified iPhone OS (like it can't do anything new compared to the iPhone 4s)
imnuts said:
You fell for Apple's use of fancy words. The A5X chip is a dual core cpu with a quad core gpu. Yes, you get no more CPU performance, just more graphics performance, which almost nothing takes advantage of fully now with the A5. Besides the updated screen and new software, there is really nothing spectacular about the new iPad. Just like the iPhone 4S update, it's a few minor spec bumps that Apple marketing (they are a marketing company that just happens to also sell stuff) made sound like the latest and greatest technology. Wait until other manufacturers get a chance to actually start using the new and upcoming quad-core SoCs
It's also not a Droid Tegra 3 or Droid pad. It is a Tegra 3 processor, which can be used by anyone on any platform, and Android Tablets. The Tegra 3 is actually a quad core cpu with 12 core gpu (technically 5 core cpu), and who knows how it will benchmark against Apple's chip, since Apple didn't release any firm details about how the new A5X benchmarked higher than the Tegra 3, or how much higher, or if it was a production Tegra 3 chip.
Personally, I would wait to see what Samsung comes up with this summer/fall to counter it, and even if they don't have anything, I would still take a GalaxyTab 7.7 over an iPad any day of the week.
Click to expand...
Click to collapse
Uh... no. iOS is GPU accelerated. Its interface and UI rely heavily on the GPU, much more so than Android's. The stronger the GPU, for example, the less checkerboarding you experience when scrolling webpages. Of course this is more than negated by 4x the pixels, so day-to-day performance may not be as good as on the iPad 2.
In Anand's testing they found the iPad 2's GPU to be about 30% faster on average than the Tegra 3. Apple's 2x claim is probably based on a specific test that targets one of Tegra's numerous weaknesses. That said, the iPad 3 uses the same GPU as the PS VITA (albeit without the dedicated VRAM and to-the-metal programming). It's a beast. The same goes for that hi-res screen, which won't be surpassed by a shipping product for a year at least.
Apple claimed that the A5X graphics chip is the key to supporting such a super-high-resolution display. But whatever, Nvidia is going to benchmark the A5X against their Tegra 3... the truth will come out. I may add that if Tegra 3 is indeed faster, I'm a very happy guy, because I have my eyes on the coming ASUS Transformer 700, due to release sometime in June 2012. Besides the hardware, Apple also released 5 killer apps: a video editor (amazing photo editor and movie maker), Autodesk (freehand graphic design), 2 new games from Namco and Epic, and iCloud GarageBand, where 4 people (drummer, guitar, keyboard, bass) can play the same song over the internet.
http://events.apple.com.edgesuite.net/123pibhargjknawdconwecown/event/index.html
buhohitr said:
Apple claimed that in order to support such a super high resolution display, the A5X graphic chip is the key. But..what ever, Nvidia is going to bench mark the A5X and their Tegra 3...the true will reveal. I may add if indeed Tegra3 is faster, I'm a very happy guy, cause I have my eyes on the coming Asus Transformer 700 due to release sometime in June 2012. Beside the hardware, Apple also released 5 killer apps; video editor(amazing photo editor and movie maker), autodesk(free hand graphic design), 2 new games from Namco and Epic, icloud garage band where 4 people (drumer,guitar,keyboard,bass) can play the same song over the internet.
http://events.apple.com.edgesuite.net/123pibhargjknawdconwecown/event/index.html
Click to expand...
Click to collapse
iCloud seems very similar to Google's cloud services.
As for Autodesk, doesn't that company have an app (or apps) for Android as well? Both OSes have Photoshop Touch, which seems like a great photo-editing tool in its own right.
Sent from my SCH-I510 using xda premium
DirgeExtinction said:
iCloud seems very similar to Google's cloud services.
As for Autodesk, doesn't that company have an app(or apps) for android as well? Both OS systems have Photoshop Touch, which seems like a great photo editing tools in its own right.
Sent from my SCH-I510 using xda premium
Click to expand...
Click to collapse
Obviously, you haven't read the link for the Apple event. We're not talking about Photoshop, and Autodesk for the iPad 3 is not the same as on Android. Sure, Google has its cloud, but here we're talking about using the cloud to play music as a band, using the iPad 3 as a musical instrument. Click on the link and find out more...
ambrar12 said:
Uh... no. iOS is GPU accelerated. Its interface and UI relies heavily on the GPU, much more so than Android. The stronger the GPU, for example, the less checkerboarding you experience when scrolling webpages. Of course this is more than negated by 4x the pixels, so day to day performance may not be as good as on the iPad 2.
Click to expand...
Click to collapse
The UI is no more or less accelerated now with ICS for Android from what I've read. The only stuff left in Android that isn't GPU driven isn't going to tax the CPU to render it. The only reason that the added GPU power will be needed is the high resolution, which is pointless IMO. Why get a higher resolution other than to brag about it? You think Samsung/LG/etc. couldn't make a higher resolution display if they wanted to? They're sticking to 1080p/720p because it's a standard resolution and you won't be powering unnecessary pixels if you don't have to, which you will be doing on that display when watching HD videos. Either way, I give it 3-4 months before someone displays a better product, if not sooner. No one has any idea how the OMAP5 really performs, or the quad-core Exynos chips, and if anyone will best the A5X, those are probably the best candidates.
buhohitr said:
Obviously, you haven't read the link for apple event. We're not talking about photoshop and Autodesk for ipad3 is not same as android. Sure google has gcloud, but here we're talking about using the cloud to play music as a band using Ipad3 as music instrument. Click on the link and find out more....
Click to expand...
Click to collapse
Playing music as a band on the iPad 3 using "the cloud" (which pretty much means using the iPad's LTE modem) sounds like what multiplayer games on iOS and Android already do.
Sent from my SCH-I510 using xda premium
Ugh, the iPad is only quicker because it's a less complex OS.
Android will always be light years ahead of Apple in technological advances; it has had USB plug-ins since the Atrix and Honeycomb.
Apple is far more interested in marketing than in creating the best device; as long as they can keep you buying everything through their e-stores, they will be happy.
I have an iPad 2, and Tegra devices will ALWAYS have more features and a faster platform, 'nuff said.
Sent from my LG-P925 using xda premium
imnuts said:
I would still take a GalaxyTab 7.7 over an iPad any day of the week.
Click to expand...
Click to collapse
The display on the Galaxy Tab 7.7 is gorgeous. Also, I have some Apple fanboy coworkers who admit that having a 7" iPad would be awesome, and that typing/holding an iPad can be cumbersome.
Oh iPad, you are so revolutionary.
Side rant: I love how Apple upgraded the cameras on the iPad3 and acted like it's a big deal. Cameras... on a tablet.. Who in the **** uses their tablet's cameras that often to warrant a hardware upgrade? The cameras were fine! How about updating the damn OS.
Don't get me started on Siri. Aimbot, anyone?
OK, for a better comparison, let's compare ONLY 4G devices; the iPad 3 is clearly the winner here over any Android tablet currently on the market. Let's not even talk about the future, because that's an endless and moot discussion.
People get so defensive whenever we talk about the "competitor". Like, "the iPad 3 has the highest screen resolution" - "Oh, that's not needed, it doesn't make that much of a difference." Come on, admit it: they do have a higher resolution than any other device out there. Sometimes we should hold back our egos and watch and listen; we may learn something new. I'm an Android fan, but sometimes the competitor is better and I will admit they're better, and sometimes Android is better and we should be proud. Only kids bash other kids.
buhohitr said:
OK, for better comparison, let just comparing ONLY 4G device; Ipad3 clearly the winner here over any android tablet currently on the Market. Let's not even talk about the future, because it's endless and mute discussion.
People are so offensive when ever we talk about the "competitor". Like 'Ipad3 has the highest screen resolution", Oh that's not needed, it's not make that much of different. Come on admitted, they do have higher resolution than any other device out there. Sometime, we should hold back our egos and watch and listen, we may learn something new. I'm an Android fan, but sometime the competitor is better and I will admit they're better and sometime when Android is better, we should be proud. Only kids bashing other kids.
Click to expand...
Click to collapse
I agree. Having competition is good for Google. I have recently changed my view of iOS devices, and I'd have no problem getting an iPad 3... (if it were free and jailbroken).
letsgophillyingeneral said:
Side rant: I love how Apple upgraded the cameras on the iPad3 and acted like it's a big deal. Cameras... on a tablet.. Who in the **** uses their tablet's cameras that often to warrant a hardware upgrade? The cameras were fine! How about updating the damn OS.
Click to expand...
Click to collapse
The FFC is still the same VGA camera that was on the iPad 2; they only upgraded the rear camera.
buhohitr said:
OK, for better comparison, let just comparing ONLY 4G device; Ipad3 clearly the winner here over any android tablet currently on the Market. Let's not even talk about the future, because it's endless and mute discussion.
People are so offensive when ever we talk about the "competitor". Like 'Ipad3 has the highest screen resolution", Oh that's not needed, it's not make that much of different. Come on admitted, they do have higher resolution than any other device out there. Sometime, we should hold back our egos and watch and listen, we may learn something new. I'm an Android fan, but sometime the competitor is better and I will admit they're better and sometime when Android is better, we should be proud. Only kids bashing other kids.
Click to expand...
Click to collapse
I would still take the Galaxy Tab 7.7. And yes, it has a higher screen resolution, but how is it useful? The only good thing about that high resolution is that you could see pictures without zooming in as much. As for the extra screen real estate you're getting, it's going to take developers a while to take advantage of it because it is such a huge increase. It's pointless for movies and videos - when is the last time you saw a movie with a resolution greater than 1080p? For video viewing, you have wasted screen space. I'm sure the display looks great, but until we actually have videos that are >1080i/p, I don't see a reason to have a higher-resolution display.
imnuts said:
The FFC is still the same VGA camera that was on the iPad2, they only upgraded the rear camera.
I would still take the GalaxyTab 7.7. And yes, it has a higher screen resolution, but how is it useful? The only good thing about that high resolution is that you could see pictures without zooming in as much. The more screen real estate you're getting, it's going to take developers a while to take advantage of it because it is such a huge increase. It's pointless for movies and videos, when is the last time you saw a movie with a resolution greater than 1080p? For video viewing, you have wasted screen space. I'm sure the display looks great, but until we actually have videos that are >1080i/p, I don't see a reason to have a higher resolution display.
Click to expand...
Click to collapse
Higher resolution is great for gaming!
The iPad 3 is amazing, but I still prefer Android OS - Samsung and ASUS tablets are my favorites!

Potential Alternative/Competitor to TF700

This is apparently being released in close proximity to the release date of the TF700 - probably not by accident.
Considering all the reservations people have had about the TF700's Tegra 3 and I/O issues, it might pay to wait two more weeks and see how this performs:
http://www.jr.com/samsung/pe/SAM_N8013EAVXAR/
http://www.engadget.com/2012/06/13/galaxy-note-10-1-pre-order-amazon/
Amazon also had a pre-order up and pulled it, so the release date is in question. It would be interesting if it happens to go on sale on July 16. It certainly seems like its release is imminent.
I was certain that I would buy an Infinity as soon as it became available in the US. Now, for the first time, I think I might wait. A 10-inch tablet with a quad-core Exynos sounds impressive, and might really give all the Tegra 3 tabs some serious competition.
The trade-off is a 720p screen and no dock in exchange for a potentially much more capable processor. And of course you would have to consider Samsung's lousy track record with software updates. Of course, there would likely be big potential for custom firmware, and then there is the S Pen stylus thingy, which is sort of interesting...
TouchWiz is a bit of a turn-off for me. The Exynos, although better in some synthetic benchmarks, will probably be no better for day-to-day use, and the lack of a 1080p display is a real bummer. I would still go with an Infinity if it were available.
I agree about TouchWiz. I hate it with a passion. On my Galaxy Note phone, the home screen doesn't even rotate to landscape - at least I haven't found a way to make it do so.
I disagree about the Exynos, however. You can't judge a processor that hasn't been released and benchmarked yet (however synthetically) or tested in real-world use.
Everyone touted the Tegra 2 as being the best thing since sliced bread well before it was released, and the same with the Tegra 3.
Both have been less than perfect in actual practice. I think I will wait and see. Also remember Samsung is responsible for some very impressive displays - the Retina display of the iPad 3, and the AMOLED screens of the Tab 7.7 and Excite 7.7, for example. The display of this tab might be quite good. We won't know until it's available.
Also, I'll bet a quad-core Exynos pushing a 720p screen will be blazing fast compared to a Tegra 3 trying to push a 1080p screen.
Honestly, if just the web browsing were smooth it would be a major improvement!
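Rough numbers behind that hunch (a sketch; it assumes the Note 10.1 panel is 1280x800, the Infinity's is 1920x1200, and a 60 fps target):

```python
# Pixels each GPU must fill per second at 60 fps for the two panels.
panels = {
    "Galaxy Note 10.1 (1280x800)": (1280, 800),
    "Transformer Infinity (1920x1200)": (1920, 1200),
}
fps = 60
for name, (w, h) in panels.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixels/s at {fps} fps")
```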
Sounds like a great unit. Shame indeed it doesn't have the battery-powered dock. I want the best of both worlds!
I was considering the Note, but it hasn't got an HD screen, and to be honest the keyboard dock clinches it for me...
Will need to see the screen resolution before assessing if this really is a competitor to the TF700.
If you don't know that the TF700 holds the benchmark record for the best CPU, you shouldn't be talking. It uses the Tegra 3 T33, which is the best as far as CPUs and GPUs go, and if you guys knew a single thing, it's not the CPU that pushes the pixels but the GPU, in which case the Exynos has a very slight advantage because of the lower-resolution screen. But Samsung will use the cheap plastic back, a worse rear camera, and ****ty updates. Sure, the Infinity has some I/O problems, but mine hardly lags; sure, it's inconsistent, but it can be fixed... sorry, no go for Samsung.
Sent from my Jelly Beaned GNexus
mdemons12 said:
Touch Wizz is a bit of a turn off for me. The exynos, although better in some synthetic benchmarks will probably be no better for day to day use and the lack of 1080p display is a real bummer. I would still go with an Infinity if it were available.
Click to expand...
Click to collapse
Wrong, wrong, wrong... the Exynos is slightly behind the T33.
Sent from my Jelly Beaned GNexus
ray3andrei said:
If you don't know that tf700 holds the benchmark for the best CPU, you shouldn't be talking , it uses Tegra 3 t33 which is the best as far as cpu's and gpu's go and if you guys knew a single thing not the CPU pushes the pixels but the GPU in which case the exynos has a very slight advantage because of the lower resolution screen but Samsung will use the cheap plastic back, and worse back camera, and ****ty updates, sure the infinity has some I/o problems but mine hardly lags, sure it's inconsistent but, but it can be fixed.. sorry no go for samsung
Sent from my Jelly Beaned GNexus
Click to expand...
Click to collapse
All the benchmarks indicate that the Exynos found in the GS3 is more powerful than the Tegra 3 T30, which is almost the same as the Tegra 3 T33 found in the new Infinity. I don't think the Tegra 3 will be powerful enough to handle such a big screen resolution.
Here is the chart for GPU performance.
As you can see, the new Exynos is much more powerful than the Tegra 3.
Don't go with a Tegra 3 full-HD-screen tablet. Wait for a better GPU.
Sent from my GT-I9300 using xda premium
ray3andrei said:
Wrong wrong wrong... the exynos is slightly behind t33
Sent from my Jelly Beaned GNexus
Click to expand...
Click to collapse
Is it? That's great to hear! I was going by the T30 vs the Exynos quad core, not the T33.
Best advice, though: feel them both if you can, and find out which feels best and has the best experience. I doubt the T33 is too weak for the 1080p display, as it is still a very up-to-date CPU. You have Samsung's crappy build quality vs aluminium and a little plastic on the TFI; for me that's a winner.
If this had a better screen and a USB port, I'd consider it. If the Toshiba Excite 10 took a microSD card (I already purchased a couple of 64 GB micro cards) and had a USB port, I'd consider that too. Damn you, ASUS, for making something that has everything I need, but then making these mistakes.
I'd strongly consider it if it had a high-DPI screen. As is, it's not an option for me. Still picking up a TF700 in a week when it launches.
josuetenista said:
All the benchmarks point that the Exynos found on the GS3 its more powerful than the Tegra 3 T30 wich is almost the same as the Tegra 3 T33 found on the new infinity. I don't think the Tegra 3 will be powerful enough to handle such a big screen resolution.
Here is the chart for GPU performance.
as you can see the new exynos is much powerful than the Tegra 3.
Don't go with a Tegra 3 full hd screen tablet. Wait for a better GPU
Sent from my GT-I9300 using xda premium
Click to expand...
Click to collapse
The GeForce ULP is powerful enough despite being a little slower than the Mali-400, but you can't deny that the CPU is superior to the Exynos quad.
Sent from my Asus Transformer Pad Infinity
josuetenista said:
All the benchmarks point that the Exynos found on the GS3 its more powerful than the Tegra 3 T30 wich is almost the same as the Tegra 3 T33 found on the new infinity. I don't think the Tegra 3 will be powerful enough to handle such a big screen resolution.
Here is the chart for GPU performance.
as you can see the new exynos is much powerful than the Tegra 3.
Don't go with a Tegra 3 full hd screen tablet. Wait for a better GPU
Sent from my GT-I9300 using xda premium
Click to expand...
Click to collapse
Besides, Jelly Bean is around the corner for the Infinity, so performance won't be an issue, and btw I can run Dead Trigger without an issue, and it is optimized for 1920x1200.
Sent from my Asus Transformer Pad Infinity
ray3andrei said:
geforceulp is powerfull enough despite it being a little slower that mali 400, but you cant deny that cpu is superior to exynos quad
View attachment 1189381
Sent from my Asus Transformer Pad Infinity
Click to expand...
Click to collapse
Thanks for the benchmark results. Actually, could you tell me if you always get a similar result? And what governor and power mode were these done on? (performance?)
My results tend to vary, in various benchmarks too.
Does anyone know how much better the S Pen is compared to a regular capacitive stylus? I know it has pressure sensitivity and all, but can it compare to a Wacom Bamboo in the slightest?
reluttr said:
Does anyone know how much better the SPen is compared to a regular capacitive stylus. I know it has the pressure sensitivity and all, but can it compare to a Wacom Bamboo in the slightest?
Click to expand...
Click to collapse
AFAIK its main strength is when it's used with Samsung's "Note" devices, because of the induction technology; otherwise it won't give better results.
It seems to bounce around 12000 on AnTuTu, 4800 on Quadrant, and 1500; they're not always constant. I did them on balanced, and I seem to get better results than on performance.
Could you post some of your benchmark results?
Sent from my Jelly Beaned GNexus

			
				
ray3andrei said:
It seems to bounce around 12000 on antutu, 4800 on quadrant and 1500 they're not always constant and I did it on balanced and I seem to get better results than performance
Click to expand...
Click to collapse
Interesting. I get all sorts of results. Maybe less varied with sio. Probably something faulty about my device, but it runs ok apart from that. I'd still prefer the Krait.
Could you post some of your benchmark results ?
Click to expand...
Click to collapse
Sure, first thing tomorrow [I'm on low battery now ;>]
So are you running the "interactive" CPU governor? (+ sio or noop?)
What are your LinPack scores? Are they consistent?
My last result in AnTuTu was 12219, but these were done in performance mode + with performance governor and I remember getting as low as 4xxx in balanced mode with noop.
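One way to make "my results vary" concrete is to repeat each benchmark several times per power mode and compare the mean and the spread. A minimal sketch, with invented placeholder scores rather than real measurements:

```python
import statistics

# Hypothetical repeated AnTuTu-style scores per power mode (placeholders only).
runs = {
    "performance": [12219, 11980, 12105, 11890],
    "balanced":    [12010, 4800, 11750, 9900],
}
for mode, scores in runs.items():
    mean = statistics.mean(scores)
    spread = statistics.stdev(scores)
    print(f"{mode:>11}: mean {mean:.0f}, stdev {spread:.0f} ({spread / mean:.0%} of mean)")
```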

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP - it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the 2 board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. The original poster had already deleted the results, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
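Side by side, those two offscreen numbers work out as follows (simple arithmetic on the figures above):

```python
# GLBenchmark 2.5 Egypt HD 1080p offscreen results quoted above.
dalmore_fps = 32.6   # presumed Tegra 4 development board
ipad4_fps = 49.6     # A6X (SGX554MP4)

print(f"Dalmore reaches {dalmore_fps / ipad4_fps:.0%} of the iPad 4's score")
print(f"iPad 4 is {ipad4_fps / dalmore_fps:.2f}x faster in this test")
```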
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand-new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system; the evidence is strong, though, that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers - the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554 MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali-T604 being an almost equal 2nd. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy.
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4, I guess, and Tegra 4 has been announced too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and I'm not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any Gpu that could outperform iPad4 before iPad5 comes out? adreno 320, t Mali 604 now tegra 4 aren't near it. Qualcomm won't release anything till q4 I guess, and tegra 4 has released too only thing that is left is I guess is t Mali 658 coming with exynos 5450 (doubtfully when it would release, not sure it will be better )
Looks like apple will hold the crown in future too .
Click to expand...
Click to collapse
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X: the trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm² chip, the iPhone 5's SoC is 96mm² and the A6X is 123mm²; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad:crying:
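The throttling behaviour described in that Anandtech test can be sketched as a simple shared power budget (a toy model; the 4 W budget and the assumption that CPU power scales roughly linearly with frequency are simplifications, not Samsung's actual governor logic):

```python
# Toy model of a 4 W budget shared by CPU and GPU on an Exynos 5250-like SoC.
BUDGET_W = 4.0
CPU_MAX_HZ = 1_700_000_000
CPU_MIN_HZ = 200_000_000
CPU_MAX_POWER_W = 4.0   # CPU power at full clock (assumption)

def throttled_cpu_freq(gpu_power_w: float) -> float:
    """Scale the CPU clock so cpu_power + gpu_power stays within the budget,
    assuming CPU power is roughly proportional to frequency."""
    headroom = max(BUDGET_W - gpu_power_w, 0.0)
    freq = CPU_MAX_HZ * min(headroom / CPU_MAX_POWER_W, 1.0)
    return max(freq, CPU_MIN_HZ)

for gpu_w in (0.5, 2.0, 3.5):
    print(f"GPU drawing {gpu_w:.1f} W -> CPU capped near "
          f"{throttled_cpu_freq(gpu_w) / 1e6:.0f} MHz")
```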
32 fps is a no-go... let's hope it's not final.
hamdir said:
32fps is no go...lets hope it's not final
Click to expand...
Click to collapse
It needs to be, but it will be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm taking about phones here ... But the iPhone 5 GPU and adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
Click to expand...
Click to collapse
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
Click to expand...
Click to collapse
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
Its interesting how even though nvidia chips arent the best we still get the best game graphics because of superior optimization through tegra zone. Not even the a6x is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
Click to expand...
Click to collapse
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire 3D; I use it and Tegra games work without lag, with the effects, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant GPU provider.
In the first Tegra 2, the GPU was a little bit better in benchmarks than the SGX540 of the Galaxy S, but Tegra 2 lacked NEON support.
In the second one, Tegra 3, the GPU is nearly the same as the old Mali-400 MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will be outperformed soon (by the Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 as well; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimisation porting no longer works using Chainfire; that's a myth now.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the Tegra 3 SDK for PhysX and other CPU-side graphics work can't be forced to run on other devices, and Chainfire is now outdated and no longer updated.
Now about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita, and it eats a large die area, i.e. actual multiple cores each with its own sub-cores/shaders. If Tegra were used in a real multi-core configuration it would destroy everything.
Finally, it's really funny, all this doom and gloom over an early, discarded development-board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous how much negativity it is collecting before any final-device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
Tegra 3 doubled the speed of the Tegra 2 GPU with only 50% more shaders, so how on earth do you believe Tegra 4 will score only 2x Tegra 3 with six times the shader count? (A quick back-of-envelope sketch follows at the end of this post.)
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform well below expectations until the drivers are updated.
Enough with the FUD! This board seems full of it nowadays, and with so little reasoning...
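To put rough numbers on that argument, here is a minimal back-of-envelope sketch in Python. The shader counts (8 for Tegra 2, 12 for Tegra 3, 72 for Tegra 4) are the commonly quoted core counts, and the 2x Tegra 2 to Tegra 3 speedup is the figure claimed above, so treat the output as an illustration of the reasoning, not a benchmark.
Code:
# Back-of-envelope check of the shader-count argument above (illustrative only).
t2_shaders, t3_shaders, t4_shaders = 8, 12, 72

# Tegra 2 -> Tegra 3 GPU speedup as claimed in this thread.
t2_to_t3_speedup = 2.0

# Gain achieved per unit of shader-count ratio in that generation jump.
gain_per_shader_ratio = t2_to_t3_speedup / (t3_shaders / t2_shaders)  # 2.0 / 1.5 ~= 1.33

# Naive linear extrapolation for Tegra 4 (ignores clocks, memory, TMU/ROP limits).
naive_t4_speedup = (t4_shaders / t3_shaders) * gain_per_shader_ratio  # 6 * 1.33 ~= 8x

print("Gain per shader ratio: %.2f" % gain_per_shader_ratio)
print("Naive Tegra 3 -> Tegra 4 speedup: %.1fx" % naive_t4_speedup)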
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra because Nvidia is pushing developers to make better games for Android, unlike the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's a 28nm chip (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), with a dual-channel memory interface instead of the single-channel one on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, while the GPU was being stressed by running a game, they ran a CPU benchmark in the background: the SoC quickly went up to 8W, but the CPU was soon throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. That explains why the Nexus 10 didn't benchmark as well as we had hoped.
Back to the 5450, which should beat the A6X: the trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that the system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that might not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' its way to GPU domination, and as an Android fan that makes me sad:crying:
Click to expand...
Click to collapse
Well said mate!
I can understand how you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't change this strategy soon, they will fail.
The GPU will become the bottleneck and you won't be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S II with the Exynos 4 at 1.2 GHz and the Mali GPU overclocked to 400 MHz.
In my testing, most modern games like MC4 and NFS: Most Wanted don't run at 60 FPS at all; that's because the GPU always sits at 100% load while the CPU relaxes at 50-70% of its total capacity (see the sketch at the end of this post).
I know some games aren't optimised for all Android devices the way they are for Apple devices, but even high-end Android devices have a slower GPU (than the iPad 4, at least).
AFAIK, the Galaxy S IV is likely to pack a tweaked Mali-T604 instead of the mighty T658, which would still be slower than the iPad 4.
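To make that bottleneck point concrete, here is a minimal sketch with made-up numbers (not measurements from my S II): a frame is only ready once both the CPU and GPU work for it is done, so frame time is roughly the longer of the two, and a faster CPU changes nothing once the GPU is the limiter.
Code:
# Toy model: frame time is roughly the slower of CPU and GPU work per frame
# (ignoring pipelining). Numbers below are hypothetical, for illustration only.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=10.0, gpu_ms=22.0))  # ~45 FPS, GPU-limited
print(fps(cpu_ms=5.0, gpu_ms=22.0))   # still ~45 FPS: a 2x faster CPU doesn't help
print(fps(cpu_ms=10.0, gpu_ms=15.0))  # ~66 FPS: only a faster GPU raises the cap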
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, while the GPU was being stressed by running a game, they ran a CPU benchmark in the background: the SoC quickly went up to 8W, but the CPU was soon throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. That explains why the Nexus 10 didn't benchmark as well as we had hoped.
Back to the 5450, which should beat the A6X: the trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that the system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that might not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' its way to GPU domination, and as an Android fan that makes me sad:crying:
Click to expand...
Click to collapse
Typical "isheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X CPU and PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 PPI, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
MrPhilo said:
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra because Nvidia is pushing developers to make better games for Android, unlike the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's a 28nm chip (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), with a dual-channel memory interface instead of the single-channel one on Tegra 2/3.
Click to expand...
Click to collapse
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA, is that even an insult? Next you'll be asking what I deadlift:laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations for a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), omg Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though both now have to drive similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle and physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a six-fold increase in shader count does not automatically mean that games and benchmarks will scale linearly, as other factors such as TMU/ROP throughput can bottleneck performance (a toy illustration follows below the video link). Nvidia's Technical Marketing Manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times Tegra 3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
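A minimal way to picture that non-linear scaling: treat the GPU as a pipeline whose frame rate is capped by its slowest stage. The per-stage numbers below are invented purely for illustration; the point is that multiplying shader throughput by six moves nothing once the TMU/ROP (or bandwidth) stage becomes the limiter.
Code:
# Toy pipeline model: achievable frame rate is capped by the slowest GPU stage.
# All per-stage limits are hypothetical, chosen only to illustrate the argument.
def max_fps(shader_fps, tmu_fps, rop_fps, bandwidth_fps):
    return min(shader_fps, tmu_fps, rop_fps, bandwidth_fps)

# A "Tegra 3 class" baseline: frames/sec each stage could sustain on its own.
print(max_fps(shader_fps=30, tmu_fps=45, rop_fps=50, bandwidth_fps=40))   # 30 FPS, shader-limited

# Scale shaders by 6x and double bandwidth (dual-channel memory), leave TMU/ROP alone.
print(max_fps(shader_fps=180, tmu_fps=45, rop_fps=50, bandwidth_fps=80))  # 45 FPS, now TMU-limited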
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X CPU and PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 PPI, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
Click to expand...
Click to collapse
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
hamdir said:
Tegra optimisation porting no longer works using Chainfire; that's a myth now.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the Tegra 3 SDK for PhysX and other CPU-side graphics work can't be forced to run on other devices, and Chainfire is now outdated and no longer updated.
Click to expand...
Click to collapse
Looks like they haven't updated Chainfire 3D for a while; as a result only Tegra 3 games don't work, but others do (Riptide GP, Dead Trigger, etc.). It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I may have been unlucky too, but some Gameloft games lagged on the Tegra device I had, though root solved it to an extent.
I'm not saying one thing is superior to another, just describing my personal experience; I might be wrong, I might not be.
Tbh I think benchmarks don't matter much unless you see a difference in real-world usage, and in my experience that was the problem with Tegra.
But we'll have to see whether the final version can push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
Click to expand...
Click to collapse
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral between Google's and Apple's ecosystems, and I'm not wiping any scratches off my devices.

Samsung's Octa-Core Exynos 5 processor (vs) Nvidia Tegra 4

Which processor is better and why? I'm thinking about getting the Samsung Galaxy Tab S in July. But I'm also hearing great things about the Asus Infinity Transformer TF701 with the Tegra 4. Better graphics? Faster? Appreciate all the input guys.
Sent from my Tablet using Tapatalk
Tegra 4 has better graphics and probably better-optimised games than the Mali on the Exynos. CPU-wise, I think the Exynos is slightly better.
xRevilatioNx said:
Which processor is better and why? I'm thinking about getting the Samsung Galaxy Tab S in July. But I'm also hearing great things about the Asus Infinity Transformer TF701 with the Tegra 4. Better graphics? Faster? Appreciate all the input guys.
Click to expand...
Click to collapse
Since Tegra's dead I'd say go with the one that's got a future...
NVIDIA says the mainstream tablet and smartphone market is no longer their focus
May 22nd 2014 by Quentyn Kennemer
Once upon a time NVIDIA made plays to try and get into any smartphone or tablet they could. With stiff competition from Qualcomm and other chipset vendors, they’ve found that task to be very difficult. They credit their hard hurdles to MediaTek even more, because MediaTek’s value-positioned platform wins out for many mid-level or small OEMs.
So NVIDIA’s calling it quits… somewhat. In a recent interview, NVIDIA CEO Jen-Hsun Huang talked about their struggles in the market so far and what they’re doing to adapt. For starters, he says they realize that competing for the “mainstream” smartphone and tablet market is no longer a desire for them.
http://phandroid.com/2014/05/22/nvidia-ceo-interview/
Really? I heard there was a Tegra 5 (the 192 CUDA-core Tegra K1) coming...
http://www.theverge.com/2014/1/5/5278206/nvidia-debuts-tegra-k1-192-core-processor
Nvidia's new processor is the latest in the Tegra family, succeeding last year's Tegra 4. This processor now puts them in the same camp as Intel. They claim it has more raw computing power than the Playstation 4.
The Tegra K1 A15 variant will max out at 2.3GHz, while the Denver version will max out at 2.5GHz. The former is expected to hit devices in the first half of this year, while the latter will hit in the second half.
The K1 is offered in two versions: the first is a 32-bit quad-core ARM Cortex A15 processor, similar to the Tegra 4 but more efficient. According to an Nvidia whitepaper (PDF), it can use half the power for the same CPU performance, or get 40 percent more performance for the same power. The second variant is a long-awaited custom 64-bit dual-core "Denver" ARM CPU. Huang spoke at great length to demonstrate the K1's graphical capabilities, showing it capably render Unreal Engine 4:
Click to expand...
Click to collapse
So is it really dead?
It will also power Google's Project Tango Tablet
http://www.forbes.com/sites/greatsp...a-k1-powers-googles-project-tango-tablet-kit/
Sent from my Tablet using Tapatalk
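As a quick sanity check of the two whitepaper efficiency claims quoted above (purely arithmetic, no measured data): "same performance at half the power" and "40 percent more performance at the same power" are just two different points on the same performance-per-watt picture.
Code:
# The two efficiency claims quoted above, expressed as performance-per-watt ratios
# relative to Tegra 4 (illustrative arithmetic only, no measured data).
claim_a = 1.0 / 0.5    # same performance at half the power  -> 2.0x perf/W
claim_b = 1.4 / 1.0    # 40% more performance at same power  -> 1.4x perf/W

print("Claim A implies %.1fx perf/W" % claim_a)
print("Claim B implies %.1fx perf/W" % claim_b)
# The gap between the two is normal: efficiency depends on the operating point
# (voltage/frequency), so gains look larger at low clocks than at peak performance.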
system.img said:
Tegra 4 has better graphics and probably better-optimised games than the Mali on the Exynos. CPU-wise, I think the Exynos is slightly better.
Click to expand...
Click to collapse
What's the reasoning behind that? There are more Mali GPU devices out there so won't it be a bigger focus for app developers?
---------- Post added at 12:12 AM ---------- Previous post was at 12:11 AM ----------
xRevilatioNx said:
Really? I heard there was a Tegra 5 (the 192 CUDA-core Tegra K1) coming...
http://www.theverge.com/2014/1/5/5278206/nvidia-debuts-tegra-k1-192-core-processor
So is it really dead?
It will also power Google's Project Tango Tablet
http://www.forbes.com/sites/greatsp...a-k1-powers-googles-project-tango-tablet-kit/
Sent from my Tablet using Tapatalk
Click to expand...
Click to collapse
Yeah K1 is already out. It is in Xiaomi's new tablet.
I've now just read that the chip is going to "revolutionize gaming" and will power 4K easily.
The K1, along with the new Unreal Engine 4 will, Huang promised, bring "Next-Gen" "Photo-Real" gaming to tablets and mobile phones. "Unreal Engine is the most successful game engine of all time,"
According to Huang, the Tegra K1 mobile CPU offers almost 3x the performance of Apple's A7 chips.
Click to expand...
Click to collapse
http://mashable.com/2014/01/06/nvidia-tegra-k1/
Now I'm afraid to even pull the trigger on the Asus or Samsung offerings. Not until I see where this chip lands, in future devices, and how they spec out.
Sent from my Tablet using Tapatalk
xRevilatioNx said:
So is it really dead?
Click to expand...
Click to collapse
[Q] CNET - You delayed Tegra 4 for Tegra 4i. Did that turn out to be a mistake? Did you miss this whole design cycle?
[A] Nvidia's CEO: I would say that Tegra 4i didn't pan out. We learned a lot in the process. But there are many things in our company that didn't pan out. That's OK. If you want to be an innovative company, you have to fail. Look, we built a great chip. LG's shipping it in the rest of the world outside the United States. It's a fantastic processor. But from a business strategy, it wasn't a success. So I learned a lot from it. It's OK. I'm glad I did it, and now we're moving on.
[Q] CNET: Why did Tegra struggle in smartphones?
[A] Nvidia's CEO: Our focus as a company is still performance-oriented. The mainstream phone market commoditized so fast that really the...differentiators were price. And you can see the pressure that MediaTek is putting on Qualcomm, and you can see the pressure that MediaTek is putting on Marvell and Broadcom and all of these companies. Because guess what? They're the lowest-cost provider. I think that for mainstream phones, there's one strategy that really works right now, which is price. That's not our differentiator. That's not what we do for a living.
Sounds dead to me; at least in mainstream tablets and smartphones. Who's going to use it anyway? Qualcomm, Samsung, and MediaTek have better scale and produce equal or better chips so who needs Nvidia if they aren't price competitive?
This really isn't a relevant conversation for the SGS5 forum anyway. The Tab S models are still using the S800, so it appears all of Samsung's tablets starting with the Note 10.1 2014 are using the same hardware, which means the Exynos 5420 for the Tab S, and the 5420 doesn't have HMP enabled where the 5422 in the SGS5 does. The display area and sheer number of pixels in Samsung's 2560x1600 tablets also make this an irrelevant comparison. There are like 10 people in the TF701 forum and it's been marked down everywhere to $299. Between Nvidia's position and the TF701's, I'd say its day in the sun has passed. If it ever had one.
Barry
They aren't leaving the market. They just don't want to be mainstream. In order to do so they would have to be cost-efficient as well. That's not what Nvidia is about.
You left this tidbit out...
So NVIDIA's calling it quits… somewhat. In a recent interview, NVIDIA CEO Jen-Hsun Huang talked about their struggles in the market so far and what they're doing to adapt.
"Mainstream" could mean a lot of different things, but it sounds like he's talking about every other chipset vendor's need to hit every price point there is. He doesn't want the Tegra brand to conform to something they don't want it to be — their belief is that Tegra is a powerful line, and they don't want to sacrifice that standard of power for the sake of creating more cost-efficient chipsets.
Click to expand...
Click to collapse
Lastly, as another poster said, you're seeing Nvidia's latest chip rolling out in tablets. So, as I said before, I may wait it out and see what other manufacturers this chipset pops up in, and how the rest of the tablet specs out. 4K capability is also a plus, along with it being the best gaming tablet on the market with that chipset.
Barry, thank you for your opinion. If you were a gamer like me, who also enjoys multimedia consumption at the highest quality, what would you do?
Edit: Barry, disregard my quote. I see you have it up there and now understand your rationale. So you don't expect it to show up (in any prominent fashion) in the tablet market.
Sent from my Tablet using Tapatalk
