I'm very interested in the potential gaming performance of the upcoming Motorola devices, especially since many people are less than excited about their releases due to their non-cutting-edge specs.
The Droid MAXX, Droid Ultra and Droid Mini are all confirmed to be equipped with Motorola's new X8 Mobile Computing System, and the Motorola X is almost guaranteed to come with it as well.
The X8 consists of 8 cores, and they are categorized as:
- 2 application processor cores
- 4 graphics processor cores
- 1 contextual computing processor core
- 1 natural processor core
The overall X8 MCS is exciting for a variety of reasons, the main one being that the highly customised and optimised system should greatly extend battery life. The gaming capabilities, however, are the main focus of this thread.
This is what Motorola has to say about the graphics processor cores:
"Four powerful graphics processors each running at 400 MHz delivering 3.2 million pixel fill rate,16 shader units, 512kb dedicated cached memory and running the Egypt performance benchmark at a blazing 155 frames per second (FPS). Fully compliant with Android Project Butter."
Being (apparently) based on the Qualcomm Snapdragon S4 Pro, the X8 comes with an Adreno 320 GPU.
For a comparison, here are the results of other comparable devices tackling the Egypt GLBenchmark 2.5:
Galaxy S4 - 40 FPS; HTC One - 32 FPS; Optimus G Pro - 27 FPS; Nexus 4 - 44 FPS.
Just ignore the 1080p and 720p aspects here. True, the Nexus 4 cruises through thanks to its 720p display requiring less work. However, that only makes the X8 look stronger: it has an Adreno 320 just like every device listed above except the S4, yet it completely blows them out of the water, and all the upcoming Motorola devices will have 720p displays. I know we're talking on a synthetic level here and that real-world performance can differ, but surely this is too big a difference to brush aside?
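As a sanity check on Motorola's quoted fill rate, the "3.2 million pixel fill rate" figure only makes sense if it really means 3.2 Gpixels/s, which falls out of the published clock multiplied by an assumed pixels-per-clock figure. A rough sketch (the 8 pixels/clock is my assumption, not an official spec):

```python
# Rough sanity check of the quoted fill rate. Assumption: the GPU retires
# 8 pixels per clock in total; Motorola only publishes the 400 MHz clock.
CLOCK_HZ = 400e6        # 400 MHz, from Motorola's description
PIXELS_PER_CLOCK = 8    # assumed, not an official figure

fill_rate = CLOCK_HZ * PIXELS_PER_CLOCK          # pixels per second
print(f"{fill_rate / 1e9:.1f} Gpixels/s")        # 3.2 Gpixels/s

# Upper bound on 720p frames painted per second, ignoring overdraw/shading:
frame_pixels = 1280 * 720
print(f"{fill_rate / frame_pixels:.0f} fps ceiling at 720p")
```

The fill-rate ceiling is vastly higher than any benchmark score, which is expected: real frames are shader-bound, not fill-bound.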
What do you guys think? Is there really more to the Motorola X and the new Droid lineup than meets the eye, for those who look at them purely from a specs perspective?
For all the information, this is the link I used: http://www.androidauthority.com/mot...guage-processing-contextual-computing-247346/
Yeah, the thing about the upcoming Motorola phones is that their graphics chips are not midrange at all; based on benchmarks, they are elite. That's similar to Apple's strategy: people say their specs are nothing special, but in fact their graphics are far superior to most Android devices.
Sounds like what Amiga did in the '90s with dedicated chips for various tasks, alleviating the need for hefty CPUs as the load was shared between the specialist chips. Watching the news/reviews on this with interest.
What about the battery life?
I think it needs a 5000mAh battery :laugh:
ASMI1 said:
what about the battery life
I think it needs 5000mAh battery:laugh:
Hopefully the battery won't be too much of a problem. The Nexus 4's battery life is probably its weakest point (2100mAh), and the Moto X (as an example) is speculated to be coming with a 2200 mAh battery, and also has a 4.7 inch 720p display. Bear in mind, however, that Moto X has the custom X8 system optimised for battery life, and has a dual-core CPU as opposed to quad-core.
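To get a rough feel for what those capacities mean, screen-on time is approximately capacity divided by average current draw. A minimal sketch; the draw figures below are illustrative guesses, not measured values:

```python
# Back-of-the-envelope screen-on time: capacity (mAh) / average draw (mA).
# Both draw values below are assumed for illustration, not measurements.
def screen_on_hours(capacity_mah: float, avg_draw_ma: float) -> float:
    return capacity_mah / avg_draw_ma

nexus_4 = screen_on_hours(2100, 500)  # assume ~500 mA average under load
moto_x = screen_on_hours(2200, 400)   # assume the X8 offloading trims the draw

print(f"Nexus 4 ~ {nexus_4:.1f} h, Moto X ~ {moto_x:.1f} h")
```

The point of the model is that a small capacity bump plus a lower average draw compounds, which is exactly the argument for the X8's dedicated low-power cores.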
Beplexor said:
Hopefully the battery won't be too much of a problem. The Nexus 4's battery life is probably its weakest point (2100mAh), and the Moto X (as an example) is speculated to be coming with a 2200 mAh battery, and also has a 4.7 inch 720p display. Bear in mind, however, that Moto X has the custom X8 system optimised for battery life, and has a dual-core CPU as opposed to quad-core.
Yup, but I think with an app like Greenify battery life won't be a problem anymore (or you can reduce the screen density; that helps a lot too).
One thing I've been trying to figure out: does it come with four Adreno 320s? I haven't found any source that says one Adreno 320 has four cores. Does anyone have a source for this info?
Beplexor said:
I'm very interested in the potential gaming performances of the upcoming Motorola devices, in light of the fact many people are less than excited for their releases due to their non cutting-edge specs.
The Droid MAXX, Droid Ultra and Droid Mini are all confirmed to be equipped with Motorola's new X8 Mobile Computing System, and the Motorola X is almost guaranteed to come with it as well.
The X8 consists of 8 cores, and they are categorized as:
- 2 application processor cores
- 4 graphics processor cores
- 1 contextual computing processor core
- 1 natural processor core
The overall X8 MCS is exciting for a variety of reasons, with the main one being that because of the highly custom and optimized system, the battery will enjoy a heavily increased lifespan. The gaming capabilities, however, are of course the main focus of this thread.
This is what Motorola has to say about the graphics processor cores:
"Four powerful graphics processors each running at 400 MHz delivering 3.2 million pixel fill rate,16 shader units, 512kb dedicated cached memory and running the Egypt performance benchmark at a blazing 155 frames per second (FPS). Fully compliant with Android Project Butter."
Being (apparently) based off the Qualcomm Snapdragon S4 Pro, the X8 comes with an Adreno 320 GPU.
For a comparison, here are the results of other comparable devices tackling the Egypt GLBenchmark 2.5:
Galaxy S4 - 40 FPS; HTC One - 32 FPS; Optimus G Pro - 27 FPS; Nexus 4 - 44 FPS.
Just ignore the 1080p and 720p aspects here. True, the Nexus 4 cruises through thanks to its 720p display requiring less work. However, this only serves to enhance the X8's power; while it has an Adreno 320 just like all the devices listed above except for the S4, it completely blows them out of the water - all the upcoming Motorola devices will have a 720p display. I know we're talking on a synthetic level here and that real world performance can differ, but surely this is too big a difference to brush aside like that?
What do you guys think? Is there really more to the Motorola X and and new Droid lineup than meets the eye, for those who look at it purely from a specs perspective?
For all the information, this is the link I used: http://www.androidauthority.com/mot...guage-processing-contextual-computing-247346/
Independent benchmarks of the Moto X (same X8) indicate that the "155 fps Egypt benchmark" score has a typo. Specifically, a freaking extra 1.
The Moto X got 55 fps on that test. Sure, it's enough to beat the S4 and One on graphics, but that's still false freaking advertising. And those are old phones (in smartphones, 6 months is a whole product cycle).
Guess all the people *****ing about specs were correct.
Here are the benchmarks. Pathetic. http://arstechnica.com/gadgets/2013...he-moto-x-sports-a-great-gpu-respectable-cpu/
Related
When I read about the upcoming quad-core tablets a few months ago, I was thrilled - especially when the Transformer Prime hit the news. But then, after reading a lot more news, I began to hesitate. Here's some of my thoughts (in terms of hardware):
1) CPU: I realized that the biggest difference between Tegra 3 and Tegra 2 is just the core count. So I think that when other manufacturers (Samsung, Qualcomm, TI, etc.) release their quad-core chips, the Tegra 3 will mostly stay behind on the performance battlefield.
2) RAM: 1GB is now standard and enough for everyday use, but when MS releases Windows 8 (and I think many people would like to install it on their tablets), it's unclear whether 1GB of RAM will be enough. Better to be prepared with 2GB. Also, when I decided to buy a tablet, I intended to use it for at least 3 or 4 years. Who knows if a future Android OS will recommend 2GB of RAM to run smoothly?
3) GPU: Same as with the CPU, I'm looking forward to seeing how powerful the new ones are (especially the ones from Imagination Technologies).
4) Display: Since the iPhone 4, the high-res trend has risen in the phone arena, and in 2012 it'll happen with tablets. We all like to see smooth, detailed graphics on our tablets, don't we?
Feel free to criticize me; I'd really appreciate it.
Go to AnandTech for the review. I have the same problem... let's wait...
Sent from my SGS2
The Galaxy S 2 certainly slaughtered the Tegra 2 in benchmark numbers, but in actuality Tegra 2 has the best dev kit Android has to offer and delivers a better gaming experience.
The result is that THD (Tegra HD) enhanced games are vastly superior to the standard Android versions running on the Galaxy S 2, so in my eyes Tegra 2 won in the department of real-world performance. I expect GPU acceleration in ICS to be that much better on Tegra 2 devices.
The Tegra 3 is the latest and most powerful hardware available now, and unless you're European, you'll be waiting a full year to get Samsung's next offering in which Tegra 4 will be right around the corner by the time it actualizes in our market.
Sent from my Epic 4G using Tapatalk
Waiting for quad-core...
Sent from my GT-S5570
Dsparil said:
The Galaxy S 2 for sure slaughtered the Tegra 2 in benchmark numbers, but in actuality, Tegra 2 has the best dev kit Android has to offer and offers a better gaming experience.
The results are that THD (Tegra HD) enhanced games are vastly superior to the Galaxy S 2 running the standard Android version, so in my eyes, Tegra 2 won in the department of real world performance. I expect gpu acceleration from ICS to be that much better on Tegra 2 devices.
The Tegra 3 is the latest and most powerful hardware available now, and unless you're European, you'll be waiting a full year to get Samsung's next offering in which Tegra 4 will be right around the corner by the time it actualizes in our market.
Sent from my Epic 4G using Tapatalk
Indeed, Tegra 2 is quite good in games; however, I still think the PowerVR SGX GPU series is the most powerful. But why don't manufacturers use them in their SoCs (except TI, with the old SGX540)?
Although Samsung produces good-quality SoCs, if other competitors can release new chips sooner with the same performance, then we will have other choices. So let's wait for the full wave of quad-core chips, and then we can decide.
I think the big difference will be the manufacturing process of the new Snapdragon processor; it could really increase battery life. But, as the Engadget review of the Transformer Prime says, it gets 10 hours of battery life, which is not bad for a five-core chip.
Elwood_It said:
I think the big difference will be the manufacturing process of the new snapdragon processor, it could really increase the battery life. But, as the engadget review of the transformer prime says, it has 10 h of battery life, not bad for a 5 core.
It's obvious that in the Engadget battery test there are times when the usage wasn't too heavy, and that's when the fifth low-power core kicked in.
Thread moved to Q&A due to it being a question. Would advise you to read forum rules and post in correct section.
Failure to comply with forum rules will result in an infraction and/or ban depending on severity of rule break.
Wait....
Please list the five upcoming Android phones that interest you the most. I compiled a list of possibilities below, but other new phones may be accepted. Discuss why if you want, and add to the pros and cons.
Galaxy S3:
Pros – Exynos quad-core 32nm processor that's LTE capable, ceramic casing, bezel-less design, ICS (rumoured to be stock vanilla, not TW), large 4.6"+ screen.
Cons – It's all one big rumour for now. Plus it will likely have TW.
HTC One X:
Pros – Tegra3 quin core, large 4.7” screen, 32GB storage plus 23GB Dropbox, Bluetooth 4, 8MP camera with 1080p recording, ICS, available now. Excellent camera features.
Cons – No microSD support, the look of the camera, heard the Dropbox offer is only valid for 2 years. Debatable whether Sense 4 falls in here.
LG 4x HD:
Pros – Tegra3 quin core, large 4.7” screen, 8MP camera w/1080p recording, good 2140mAh battery, NFC, Bluetooth 4, microSD card, ICS
Cons – The MWC video demo seemed laggy.
Asus Pad phone:
Pros – Versatile new phone concept that can dock into a tablet, dual-core 1.5 GHz Krait 28nm processor, desirable 4.3” screen, Q2 release.
Cons – Possibly a pricey combo.
Droid Fighter:
Pros – Large 3300mAh battery, big 4.6” 720p screen, LTE, Bluetooth 4, ICS
Cons – Possibly just a rumour, nothing really known. Motoblur.
Motorola Atrix 3:
Pros – Tegra3 quin core, large 3300mAh battery, desirable 4.3” 720p screen, 10 MP camera. ICS
Cons – Almost all based on rumour with very little confirmed. Motoblur.
Fujitsu F12arc:
Pros – Tegra3 quin processor, highly water and dust resistant, big 4.6” screen, 13 MP camera, 4G LTE (??? Thought it was incompatible with Tegra 3 so far), ICS
Cons – Unknown battery life, still ironing out ICS development glitches, maybe Q3 release or later.
Panasonic Eluga Power:
Pros – A fairly good 1.5GHz dual-core Snapdragon S4 processor, big 5” screen, water and dust proof, 8 MP camera, fast charge, ICS
Cons – Mediocre 1800 mAh battery, global launch details still speculative.
Huawei Ascend D Quad:
Pros – Fast quad core, lovely 4.5” screen with 1080p recording, good-sized 2500mAh battery, ICS
Cons – Quality unproven, GPU unknown, might be a bit late to the game (Q3-Q4).
Because I'm on AT&T, I would have to pick the Galaxy S3, Atrix 3, HTC One X, LG 4X HD, and Asus Padfone, in that order. Nice gathering of all the rumors, btw!
Sent from my MB860 using XDA
Those are good choices. Tegra 3 is lovely and I hold a lot of respect for it, but the Exynos really intrigues me and 32nm will likely provide good power efficiency.
My list is similar Galaxy S3, Atrix 3, Fujitsu F12arc, HTC One X, Asus Padfone. Although the LG4x HD could easily be swapped for HTC One X. Optus didn't get any of the Atrix series though so I have to see what ends up here in Australia.
I'm eager to see the Eluga Power.
Some more pros and cons for it:
Pro: Small form factor (as big as a Galaxy Nexus but with a 5" display), on-screen buttons
Con: Apparently no LED flash, no MHL or other video out, fixed battery
I have to add that the device is IP57 certified; that means it is actually waterproof, not just water resistant.
shorty66 said:
Im eager to see the eluga power.
Some more pros and cons for it:
Pro: Small form factor: As big as a galaxy nexus but with 5" Display, On Screen Buttons
Con: apparantly no LED Flash, no MHL or other video out, fixed battery
I have to add, that the device is ip57 certified - that means it is actually waterproof not only water resistant.
Good to see others adding additional info.
So I'm voting for a tougher, waterproof version of the Galaxy Note with what should be a stronger dual-core S4 Snapdragon processor. A processor that is supposed to stand toe to toe with Tegra 3, or surpass it if certain benchmarks are to be believed.
The waterproof phones (I believe the F12arc is also said to be waterproof, but I don't list it until I know for sure) are now getting competitive processors, unlike the Defy. Phones with great builds and processors should always rank high on our lists. Glad to see someone voting that way. Just hope we get to see it; Aussie land hasn't seen the Razr Maxx yet, so I can only pray for the Eluga Power or F12arc.
Seems like every smartphone that comes to the USA gets some sort of Snapdragon processor by Qualcomm, and people do nothing but complain. So how does this Snapdragon S4 processor compare to every other dual-core processor out there, and even the Tegra 3? I looked up some benchmarks and both seem to have their advantages and disadvantages. But what I really want to know is which one is better for real-world performance, such as battery life, transition effects, and launching apps. A couple of people said Sense 4 is very smooth and "has little to no lag"? How does this processor handle web pages in Chrome?
Read the thread "Those of your who are waiting too compare GSIII to HTC One X" in this forum. It only has about 6 pages but has a ton of information. Short answer: the Qualcomm chip kicks serious ass.
Sent from my Desire HD using XDA
shaboobla said:
Short answer is that the Qualcomm chip kicks serious ass.
Sent from my Desire HD using XDA
+1
After reading through that thread I'm still not entirely clear. It seems the Tegra is better for gaming?
MattMJB0188 said:
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
Yes and no. The Tegra 3 does have a better GPU, so in theory, better games. However, game makers cater to the masses: most active Android devices are mid-range, run Android 2.2 or 2.3, have a 480x800 resolution, and use last year's (or older) processors. Although most games will be made to work on the T3 and S4, that will be about compatibility, not optimization. Nvidia will have a couple of "T3 only" games, but even those will be made to work on other phones. Now that ICS is cleaning up some of the splintering of apps, we'll see better options on both fronts.
In short, yes, the T3 is the better gaming chip. But for the battery life, games available, and current bugs, I would suggest the S4. I may change my mind when the refreshes come out in Q3-Q4; we'll see.
MattMJB0188 said:
After reading through that thread I'm still not entirely clear. Seems the Tegra is better for gaming?
Correct. However, most games are not optimized to utilize the Tegra to its fullest potential. That should change by the end of the year. The other point is that the S4 is just as good as the Tegra in terms of gaming performance. IMO, you should decide between these two processors by looking at the main area where the S4 truly has the advantage thus far, and that is battery life. Just read the battery life threads in this forum and for the international X. It took a few updates for the Transformer Prime to start having pretty good battery life. The One X will get better in that department with a couple more updates for battery optimization. The S4 starts with great battery life and will only improve.
Sent from my HTC Vivid using XDA app
I say the Snapdragon S4 is the better chip right now. The Tegra 3 GPU is great, and with the Tegra Zone games it really looks great. But the quad-core CPU is really for heavy multitasking, so you can divide the work between all four cores. They are A9 cores versus the custom Qualcomm cores, which are close to A15. It means that for single-threaded and multi-threaded tasks alike, the Snapdragon will whoop Tegra 3's ass: opening an app, scrolling through that app, etc. Browser performance is also slightly better on the Qualcomm chip. Basically, Tegra 3 can do lots of things at the same time at decent speed, versus the S4 chip, which can do one or a few things at lightning speed.
The S4 is almost 2x faster than any other dual core out there. AnandTech did a few nice articles on the S4, including benchmarks vs. Tegra 3.
In real use, the S4 should be much better, because not all apps are multithreaded for 4 cores. The S4 completely kicks the Tegra 3's ass in single-threaded benchmarks. I also expect the S4 to be better at power management, because it is made on a 28nm node instead of 40nm, so it's more compact and efficient.
About 23 I'd say
Sent from my SGH-I997 using xda premium
Here is a comparison benchmark by someone from Reddit.
Benchmark      | S4 Krait  | Tegra 3
Quadrant       | 5016      | 4906
Linpack Single | 103.11    | 48.54
Linpack Multi  | 212.96    | 150.54
Nenamark 2     | 59.7 fps  | 47.6 fps
Nenamark 1     | 59.9 fps  | 59.5 fps
Vellamo        | 2276      | 1617
SunSpider      | 1540.0 ms | 1772.5 ms (lower is better)
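One way to boil those numbers down to a single figure is the geometric mean of the per-benchmark ratios, inverting SunSpider because it reports time (lower is better). A quick sketch using the values from the table:

```python
from statistics import geometric_mean

# S4 Krait / Tegra 3 ratios from the table above; SunSpider is inverted
# because it reports time, where lower is better.
ratios = [
    5016 / 4906,       # Quadrant
    103.11 / 48.54,    # Linpack Single
    212.96 / 150.54,   # Linpack Multi
    59.7 / 47.6,       # Nenamark 2
    59.9 / 59.5,       # Nenamark 1
    2276 / 1617,       # Vellamo
    1772.5 / 1540.0,   # SunSpider (inverted)
]

print(f"Geometric-mean advantage: {geometric_mean(ratios):.2f}x")  # ~1.3x
```

The big single-threaded wins (Linpack Single is over 2x) dominate the story more than the headline mean does.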
The difference in DMIPS is where the S4 really whomps the T3. All the T3 has going for it at the moment is its GPU. If you don't care about some additional gaming prowess, the S4 is the way to go.
tehdef said:
Here is a comparison benchmark by someone from Reddit.
Benchmark S4 Krait Tegra 3
Quadrant 5016 4906
Linpack Single 103.11 48.54
Linpack Multi 212.96 150.54
Nenamark 2 59.7fps 47.6fps
Nenamark 1 59.9fps 59.5fps
Vellamo 2276 1617
SunSpider 1540.0ms 1772.5ms
Sadly, can't do much for the formatting. Enjoy.
The difference in DMIP's is where the S4 really whomps on the T3. All the T3 has going for it at the moment is it's GPU. If you don't care about some additional gaming prowess, the S4 is the way to go.
Just to add to that, and to be fair, the S4 scores around 7000 in the AnTuTu benchmark while the Tegra 3 is around 10000. I still prefer the S4.
Eh...
It wins in one benchmark specifically built to take advantage of more than 2 cores. So if you want to play Tegra Zone games and put up with some basic lag, the T3 is for you. If you want a near-flawless phone experience, with decreased graphical performance in some wannabe console games, then the S4 is the way to go.
Actually, you won't really notice the lack of graphics performance on the Snapdragon S4. It's about 10% slower in most benchmarks, but it outperforms the Tegra 3 in a few as well. I have a Sensation XL with the Adreno 205, which is only a quarter as fast as the Adreno 225, and all games, including Dead Space, Frontline, and Blood & Glory, run smoothly on it. To say the Snapdragon S4 is inferior because of the slower Adreno 225 is really nitpicking to me. For me, the bigger reason to choose one graphics chip over another is Flash performance, and this is where the Exynos's Mali-400 kicks the Adreno 225 in the balls: it handles 1080p YouTube videos in the browser without a hiccup, while the 225 chokes even on 720p content.
Let me answer this. How good is it? More than good enough. Almost all apps and games are catered to weaker phones, so the T3 and S4 are both more than capable.
And my two cents: the S4 beats the Tegra 3.
MattMJB0188 said:
Seems with every smartphone that comes to the USA it gets some sort of Snapdragon Processor by Qualcomm and people do nothing but complain. So how does this Snapdragon S4 processor compare to every other dual-core processor out there and even the Tegra 3? Looked up some benchmarks and both seem to have their advantages and disadvantages. But what I really want to know is which one is better for real world performance, such as battery life, transitional effects, and launching apps. Couple people said Sense 4 is very smooth and "has LITTLE to no lag"? How does this processor display web pages in Chrome?
Let me start by saying I'm not a pro when it comes to electronics, but I do have an understanding of the subject.
The thing to realize about these processors, and most other processors available today, is that the S4 is based on the Cortex-A15 while the Tegra 3, along with the new Samsung, are based on the A9. The A15, at the same clock and die size, is about 40% faster than the A9.
S4 = dual core Cortex A15 @ 1.5GHz - 28nm
Tegra3 = quad core Cortex A9 @ 1.5GHz - 40nm
Exynos 4 (Samsung) = quad core Cortex A9 @ 1.5GHz - 32nm
So far, in theory, the S4 is 40% faster per core, but has two fewer cores. Individual apps will run faster unless they utilize all four cores on the Tegra 3. Because the S4 is built on a smaller process node, it will consume less energy per core.
The actual technology the manufacturers build into these chips will also affect performance, but the general idea is there. Hope that helps you understand a little better how the two chips will differ in performance.
Sent from my shiny One XL
The S4 compared to the Tegra 3 says it all: a dual core that beats a quad core in almost everything.
Intel released the first native dual-core processor in 2006 and shortly thereafter released a quad core, which was basically two dual cores fused together (this is what current ARM quads are like).
That was 6 years ago, and these days pretty much all new desktop computers come with quad cores while laptops mostly stick with dual. Laptops make up the biggest share of PC sales, so for everyday PC usage you'll be more than comfortable with a dual core.
You really can't assume mobile SoCs will follow the same path, but it's definitely something to consider. I think dual core A15-based SoCs will still rule the day this year and next at the very least.
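The "two fast cores vs four slow cores" trade-off discussed above can be made concrete with a toy Amdahl's-law model. The 1.4x per-core speed ratio is the thread's ~40% claim, and the parallel fractions are assumptions for illustration:

```python
# Toy Amdahl's-law throughput model: a workload with parallel fraction p
# on n cores of relative per-core speed s. All numbers are illustrative.
def throughput(n_cores: int, core_speed: float, p: float) -> float:
    serial_time = (1 - p) / core_speed
    parallel_time = p / (n_cores * core_speed)
    return 1.0 / (serial_time + parallel_time)

for p in (0.3, 0.6, 0.9):
    two_fast = throughput(2, 1.4, p)   # 2 cores, ~40% faster each
    four_slow = throughput(4, 1.0, p)  # 4 baseline-speed cores
    winner = "2 fast" if two_fast > four_slow else "4 slow"
    print(f"p={p:.1f}: 2-fast={two_fast:.2f} 4-slow={four_slow:.2f} -> {winner}")
```

Under this toy model the two faster cores win until the workload is heavily parallel, which matches the "most apps aren't multithreaded for 4 cores" argument made in this thread.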
I was really on the fence about the X or the XL. But the S4 got me. Not having 32GB is already bugging me. But the efficiency (and my grandfathered unlimited data paired with Google Music) is definitely worth the sacrifice. Very happy so far! Streaming Slacker, while connected to my A2DP stereo, running GPS was great. I'm not a huge gamer though. I miss Super Mario Bros being the hottest thing!
krepler said:
Let me start by saying I'm not a pro when it comes to electronics but I do have an understanding on the subject.
The thing to realize about these processors, and most other processors available today, is that the s4 is based on the cortex a15 while the tegra 3 along with the new Samsung are based on the a9. The a15, at the same Hz and die size is 40% faster than the a9.
S4 = dual core Cortex A15 @ 1.5GHz - 28NM
Tegra3 = quad core Cortex A9 @ 1.5GHz - 40NM
Exynos 4(Samsung) = quad core Cortex A9 @ 1.5GHz - 32NM
S4 so far, in theory, is 40% faster per core, but having two less. Individual apps will run faster unless they utilize all four cores on the tegra3. Because the s4 has a smaller die size, it will consume less energy per core.
The actual technology behind these chips that the manufacturers come up with will also affect the performance output, but the general idea is there. Hope that helps to understand a little better how the two chips will differ in performance.
Sent from my shiny One XL
Correct me if I'm wrong, but all 3 are A9-based, including the S4. The first A15 will be the Exynos 5250, a dual core.
Tankmetal said:
correct me if im wrong but all 3 are A9 based including the S4. the first A15 will be the Exynos 5250, a dual core.
This is inaccurate.
The Exynos 4 and the Tegra 3 are based on the ARM A9 reference design.
The Qualcomm Snapdragon S4 is "roughly equivalent" to the A15, but not based on the A15. The same was true for Qualcomm's old S3 (which was equivalent to something between the A8 and A9 designs).
One thing that most people don't realize is that Qualcomm is one of the very few companies that designs its own processors based on the ARM instruction set, and while the S4's core is similar to the A15 in terms of architecture, it's arguably better than the ARM reference design (e.g. asynchronous clocking of each core, which is a better design than the big.LITTLE or 4-PLUS-1 approaches).
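A crude way to see why asynchronous per-core clocking helps: dynamic power scales roughly with C·f·V², and voltage tracks frequency, so power grows roughly with f³. A minimal sketch comparing one shared clock against independent per-core clocks (the per-core load figures are assumptions, not measurements):

```python
# First-order CMOS model: dynamic power ~ C*f*V^2 with V tracking f,
# so power ~ f^3. Purely illustrative, not measured silicon numbers.
def power(freq_ghz: float) -> float:
    return freq_ghz ** 3

core_demands = [1.5, 0.3]  # GHz actually needed: one busy core, one light core

# Synchronous design: every core must run at the fastest core's clock.
shared = sum(power(max(core_demands)) for _ in core_demands)

# Asynchronous design (per-core DVFS): each core clocks to its own demand.
independent = sum(power(f) for f in core_demands)

print(f"shared clock: {shared:.2f}, per-core clocks: {independent:.2f}")
```

With one busy core and one mostly-idle core, the shared clock roughly doubles the dynamic power in this toy model, which is the intuition behind Qualcomm's design choice.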
Just a harmless conversation based on what the next Samsung flagship phone might receive based on trends of the mobile industry. You can offer your own opinions and/or critiques.
With the Nexus 4 sporting a very powerful quad-core S4 Snapdragon processor, an Adreno 320 GPU, 2 GB of RAM and a standard HD display, it's sure to be the strongest thing we have until the HTC Butterfly (now dubbed the HTC DNA) arrives (reportedly on Verizon) sporting a 1080p screen, probably the same processor and RAM, and a bigger battery.
It's clear that competition within the Android ecosystem isn't as monopolized by Samsung as it was a year or two ago, at least in terms of quality products. The Galaxy S II stayed not only the top Android phone, but the top phone, period, for almost a whole year. HTC and LG have stepped up their game and are putting out functioning, competitive products. The Galaxy S III kind of fell off the performance radar within 4-5 months (comparatively speaking).
Samsung has reportedly been testing 3 GB of RAM on experimental phone models; if that made it into the final GS IV, it would be a trend-setter in that regard. It's also going to come with the new-generation 13-megapixel camera module, which is slated to outclass the popular 8 MP shooters on the market, such as the HTC One X, Galaxy S III (the worst of the bunch on the front-facing camera) or iPhone 5, in colour reproduction and low-light performance.
As for the processor, I hope they throw in the ultra-powerful Exynos 5450 (the quad-core Cortex-A15) rated at around 2 GHz; it would absolutely eat the already unbelievably fast quad-core S4 Krait in the upcoming flagships for breakfast. The Mali-658 GPU from the aforementioned chip would probably also be on par with, if not better than, the GPU in the iPad 4's A6X. Last but not least, is anyone else hoping we'll see a beautiful Super AMOLED HD Plus display (or whatever they'll call it) with full 1080p?
All of these rumors and speculation sound reasonable at this point, given what we've been seeing in the marketplace. If it all turns out to be true, the Galaxy S IV will be leaps and bounds above any smartphone upon its release, and maybe for the year after. I'm actually excited that the competition within Android is picking up steam. The harder these companies compete against each other, the more we, as consumers, win.
Most next-generation phones will be compatible with NFC payments, like Google Wallet or Isis.
I expect faster processors and a minimum of 2 GB of RAM.
Most likely, every phone will come in different screen sizes, so you can choose a particular phone with your preferred screen size.
Sent from my SAMSUNG-SGH-I747 using xda app-developers app
2GB RAM
Nvidia Tegra 3 graphics chipset
5.0-inch Super AMOLED screen, 12MP camera with burst mode
and maybe some new technology
deaddrg said:
2gb ram
nvidia tegra 3 graphic chipset
5.0inch screen super amoled screen 12mp camera,with burst mode
and maybe some new technolgy
Why would they use a Tegra 3 chip, which is only Cortex-A9 based? And the fact that Samsung develops its own chipsets means they would opt for their own before a chipset commonly found in its competitors.
deaddrg said:
2gb ram
nvidia tegra 3 graphic chipset
5.0inch screen super amoled screen 12mp camera,with burst mode
and maybe some new technolgy
Mhmmm future
megagodx said:
Why would they use Tegra 3 chips - which are only Cortex-A9's and the fact that Samsung develops their own chipset means they would opt for their own before another chipset commonly found in its competitors.
I sure hope they don't use the Tegra chipset. That would make them depend on nVidia for updates.
More battery life
It's great that a lot of attention is put into adding better specs (RAM, CPU, features like NFC, etc.), but my biggest concern is that they don't pay enough attention to battery life. I don't understand why they don't invest more in this field.
Here's hoping that Samsung will care more about this and innovate in this area.
Who else has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC on a Linux kernel site, which I'd never heard of. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch: he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the 2 board names: Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. The results had already been deleted, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the System spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
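To put the two offscreen scores above side by side, here is a quick sketch in plain Python (the numbers are taken from the cached results; the device labels in the dict are my own shorthand):

```python
# Offscreen (1080p) GLBenchmark 2.5 Egypt HD scores from the cached results above.
scores = {
    "Nvidia Dalmore (suspected Tegra 4 dev board)": 32.6,
    "iPad 4 (A6X, SGX554MP4)": 49.6,
}
dalmore = scores["Nvidia Dalmore (suspected Tegra 4 dev board)"]
for name, fps in scores.items():
    print(f"{name}: {fps} fps ({fps / dalmore:.2f}x the Dalmore board)")
# The iPad 4 comes out roughly 1.52x ahead of the dev board's score.
```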
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand-new GPU they would have shouted about it at CES. The results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, although the evidence is strong that it is a T4 development board. If it is, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 544MP4 in the A6X is still the faster GPU, with Tegra 4 and Mali-T604 in an almost equal second place. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: for CPU benchmarks I can see that being true, but not for graphics.
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I have posted the webcache links to the GLBenchmark pages below; if they disappear from the cache, I've saved a copy of the webpages, which I can upload. Enjoy!
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320, Mali-T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has already been announced. The only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when that would release, and I'm not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320, Mali-T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has already been announced. The only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when that would release, and I'm not sure it will be better).
Looks like Apple will hold the crown in the future too.
Click to expand...
Click to collapse
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. Trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad:crying:
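The throttling behaviour described in that Anandtech test can be sketched with a toy model in plain Python. To be clear, the linear power-vs-frequency assumption and the example wattages are mine, purely for illustration; real silicon scales worse than linearly:

```python
def throttled_cpu_freq(cpu_max_ghz, cpu_tdp_w, gpu_draw_w, soc_cap_w=4.0):
    """Toy model of a shared SoC power budget: whatever the GPU draws
    is subtracted from the cap, and the CPU frequency is scaled down to
    fit the remainder (assuming, simplistically, that CPU power grows
    linearly with frequency)."""
    cpu_budget_w = max(soc_cap_w - gpu_draw_w, 0.0)
    return cpu_max_ghz * min(cpu_budget_w / cpu_tdp_w, 1.0)

# GPU idle: the CPU can run flat out.
print(throttled_cpu_freq(1.7, 4.0, 0.0))   # 1.7
# GPU pulling 3 W of a 4 W cap: the CPU gets squeezed hard.
print(throttled_cpu_freq(1.7, 4.0, 3.0))   # 0.425
```

The exact numbers don't matter; the point is that under a fixed budget, loading the GPU necessarily steals headroom from the CPU, which is why a paper-spec SoC can disappoint in mixed workloads.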
32 fps is a no-go... let's hope it's not final
hamdir said:
32 fps is a no-go... let's hope it's not final
Click to expand...
Click to collapse
It needs to improve, but it would be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched
Sent from my Nexus 4 using Tapatalk 2
Click to expand...
Click to collapse
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
Click to expand...
Click to collapse
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
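For what it's worth, the gap quoted above is easy to put in percentage terms (plain Python, using the scores exactly as given in the post):

```python
iphone5_fps, nexus4_fps = 91, 88  # offscreen GLBenchmark scores quoted above
gap_pct = 100 * (iphone5_fps - nexus4_fps) / nexus4_fps
print(f"iPhone 5 leads the Nexus 4 by {gap_pct:.1f}%")  # -> 3.4%
```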
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimisation through Tegra Zone. Not even the A6X is as fully optimised.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimisation through Tegra Zone. Not even the A6X is as fully optimised.
Sent from my SAMSUNG-SGH-I727 using xda premium
Click to expand...
Click to collapse
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire3D; I use it, Tegra games work without lag, with effects, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant graphics provider.
In the first, Tegra 2, the GPU was a little bit better in benchmarks than the SGX540 of the Galaxy S, but it lacked NEON support.
In the second, Tegra 3, the GPU is nearly the same as the old Mali-400MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 as well; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices, and equally Chainfire is now outdated and no longer updated.
Now, about PowerVR: they are only better in a real multicore configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multiple cores each with its own subcores/shaders. If Tegra were used in a real multicore configuration, it would destroy all.
Finally, this is really funny, all this doom and gloom because of an early, discarded development board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any kind of final device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe T4 will manage only 2x the T3 scores with six times the shaders?!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true: Samsung has licensed PowerVR, it isn't just stuck to Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone with a Tegra 4 that will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make better games for Android, unlike the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80 mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90 mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. Trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad:crying:
Click to expand...
Click to collapse
Well said mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become the bottleneck, and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2 with a 1.2 GHz Exynos 4 and the Mali GPU overclocked to 400 MHz.
In my experience, most modern games like MC4 and NFS:MW aren't running at 60 FPS at all. That's because the GPU always has a 100% workload while the CPU is relaxing, sitting at 50-70% of its total capacity.
I know some games aren't optimised for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T-604 with some tweaks instead of the mighty T-658, which is itself still slower than the iPad 4.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that both the CPU and GPU had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. Trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems to be happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad:crying:
Click to expand...
Click to collapse
Typical "isheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 ppi, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimised to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true: Samsung has licensed PowerVR, it isn't just stuck to Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone with a Tegra 4 that will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make better games for Android, unlike the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80 mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90 mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Click to expand...
Click to collapse
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks? On XDA, is that even an insult? Next you'll be asking what I deadlift:laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations of a new product on mainstream sites, e.g. "Nvidia will use Kepler tech" (which was false), "omg Kepler is like the GTX 680, Tegra 4 will own the world". People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle & physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem; as Chainfire's app showed, most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games / benchmarks will scale in linear fashion, as other factors such as TMU / ROP throughput can bottleneck performance. Nvidia's technical marketing manager, when interviewed at CES, said that the overall improvement in games / benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
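That non-linear scaling point can be sketched with an Amdahl-style model in plain Python. The workload split below is invented purely for illustration (nobody outside Nvidia knows the real fractions), but it shows why 6x the shaders need not mean anywhere near 6x the frames:

```python
def estimated_speedup(shader_x, other_x=1.0, shader_fraction=0.7):
    """Amdahl-style sketch: only shader_fraction of the frame time
    scales with the shader count; the rest (TMU/ROP, bandwidth, etc.)
    scales with other_x. The 0.7 split is made up for illustration."""
    frame_time = shader_fraction / shader_x + (1.0 - shader_fraction) / other_x
    return 1.0 / frame_time

# Six times the shaders but unchanged TMU/ROP throughput: nowhere near 6x overall.
print(f"{estimated_speedup(6.0):.2f}x")  # -> 2.40x
```

Under this toy split, a 6x shader increase alone yields only ~2.4x; lifting the other units too is what pushes the overall figure toward Nvidia's claimed 3-4x.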
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Click to expand...
Click to collapse
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
hamdir said:
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices, and equally Chainfire is now outdated and no longer updated.
Click to expand...
Click to collapse
Looks like they haven't updated Chainfire3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I might have been unfortunate too, but some Gameloft games lagged on the Tegra device that I had, though root solved it to an extent.
I am not saying one thing is superior to another, just relating my personal experience; I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
Click to expand...
Click to collapse
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral in Google's/Apple's ecosystems, and I'm not wiping any scratches off my devices.