Dual-Core Snapdragon BLOWS Nvidia's Dual-Core Tegra 2 OUT OF THE WATER

The new dual-core Snapdragon makes Nvidia's Tegra 2 look like a single-core CPU!
And it's not even out of development yet; this review is of pre-release hardware (the Qualcomm Mobile Development Platform, or MDP), which means it's not even optimized yet!
This is massively impressive!
Some highlights:
Qualcomm Mobile Development Platform (MDP)
SoC: 1.5 GHz 45nm MSM8660
CPU: Dual-core Snapdragon
GPU: Adreno 220
RAM: (?) LPDDR2
NAND: 8 GB integrated, microSD slot
Cameras: 13 MP rear-facing with autofocus and LED flash; front-facing (? MP)
Display: 3.8" WVGA TFT-LCD with capacitive touch
Battery: 3.3 Wh, removable
OS: Android 2.3.2 (Gingerbread)
...............................................................................................
the LG Optimus 3D
The LG Optimus 3D also uses a dual-core CPU:
Dual-core 1 GHz ARM Cortex-A9 processor, PowerVR SGX540 GPU, TI OMAP4430 chipset
................................................................................................
the LG Optimus 2X
The LG Optimus 2X also uses a dual-core CPU:
Dual-core 1 GHz ARM Cortex-A9 processor, ULP GeForce GPU, Tegra 2 chipset
................................................................................................
the Nexus S
The Nexus S uses a single-core CPU:
Single-core 1 GHz ARM Cortex-A8 processor, PowerVR SGX540 GPU
................................................................................................
GLBenchmark 2.0 Egypt (fps)
38 - Qualcomm MDP
31 - LG Optimus 3D
25 - LG Optimus 2X
21 - Nexus S
GLBenchmark 2.0 Pro (fps)
94 - Qualcomm MDP
55 - LG Optimus 3D
51 - LG Optimus 2X
42 - Nexus S
Quake 3 (frames per second)
80 - Qualcomm MDP
50 - LG Optimus 2X
52 - Nexus S
N/A - LG Optimus 3D
Quadrant (overall / 3D / 2D)
2851 / 1026 / 329 - Qualcomm MDP
2670 / 1196 / 306 - LG Optimus 2X
1636 / 588 / 309 - Nexus S
N/A - LG Optimus 3D
NOTE: take the Quadrant scores with a grain of salt. Here's what Anand has to say about it:
"What all Quadrant is putting emphasis on with its 2D and 3D subtests is something of a mystery to me. There isn't a whole lot of documentation, but again it's become something of a standard. The 1.5 GHz MSM8660 leads in overall score and the 2D subtest, but trails Tegra 2 in the 3D subtest. If you notice the difference between Hummingbird (SGX540) from 2.1 to 2.3, you can see how Quadrant's strange 3D behavior on Android 2.3 seems to continually negatively impact performance. I saw the same odd missing texture and erratic performance back when I tested the Nexus S as I did on the MDP. Things like this and lack of updates are precisely why we need even better testing tools to effectively gauge performance"
Source: Anandtech.com
http://www.anandtech.com/show/4243/...ormance-1-5-ghz-msm8660-adreno-220-benchmarks
Hope you enjoyed this.
Ric H. (a1yet)
PS: Don't rule out Nvidia yet. Their dual core may have gotten blown out of the water, BUT
will their quad-core (four) CPU AND 12-core GPU be better?
NVIDIA's Project Kal-El: Quad-Core A9s Coming to Smartphones/Tablets This Year
Link:
http://www.anandtech.com/show/4181/...re-a9s-coming-to-smartphonestablets-this-year

a1yet said:
PS: don't rule out Nvidia yet their dual core may have gotten blown out of the water BUT
will their 12 core cpu be better ?
If you're one of those benchmark nut-riders, at least take some time to understand what it is that you're reading. It's a 12-core GPU, a big difference from a 12-core CPU, which doesn't even exist on desktop computers yet (unless you're talking about multi-socket server-class mobos), let alone on a mobile phone.
And the second point, which 99% of the people who tend to lust after benchmarks don't have a damn clue about: screen size and resolution. But I'm sure you don't care to know much about it, OP.

I don't see the point of benchmarks if they don't tell the real-world story.

Not sure if the information is accurate; however, it will be nice to have competition so there are always better CPUs coming out.

GREAT, cause the iPad is killing Tegra 2 already!

I think mobile processors are similar to desktop processors: there's just too much going on to accurately benchmark. My OG Droid with a 1.25 GHz overclock doesn't even come close to touching my HTC Thunderbolt on stock, yet technically it's 250 MHz faster, right? The HTC's updated 1 GHz processor is faster than other 1 GHz processors, yet rated at 1 GHz. I don't see logic in all the hype.

lude219 said:
And the second point, which 99% of the people who tend to lust after benchmarks don't have a damn clue about: screen size and resolution. But I'm sure you don't care to know much about it, OP.
WELL my "PS:" was added in hast and I Made a typo. My whole post was about "GRAPHICS" performance so the typo did not impact the heart of my post!
sad day for you
because with your 2 brain cells u obviously have NO CLUE what u are talking about. "Screen SIZE" has no bearing on performance ! none, zero, zip, zilch!
talk to me about screen size next time I'm playing Angry Birds on my 52 inch HDTV!
the only thing that has ANY bearing on performance IS "resolution"
so to explain it in a way that u can understand
the only impact screen size has is it sometimes allows you (Depending on how the manufactures implement it) to have a higher ....
WAIT FOR IT ...........
WAIT FOR IT ...........
"Resolution"
WOW SAD Day for you !
Go bash someones post, who can tolerate your Ignorance! and leave mine alone
Sincerely
Ric H. (a1yet)

ngarcesp said:
GREAT, cause the iPad is killing Tegra 2 already!
And the iPad 2's processor is made by Samsung.
Sent from HTC EVO

a1yet said:
WELL my "PS:" was added in hast and I Made a typo. My whole post was about "GRAPHICS" performance so the typo did not impact the heart of my post!
sad day for you
because with your 2 brain cells u obviously have NO CLUE what u are talking about. "Screen SIZE" has no bearing on performance ! none, zero, zip, zilch!
talk to me about screen size next time I'm playing Angry Birds on my 52 inch HDTV!
the only thing that has ANY bearing on performance IS "resolution"
so to explain it in a way that u can understand
the only impact screen size has is it sometimes allows you (Depending on how the manufactures implement it) to have a higher ....
WAIT FOR IT ...........
WAIT FOR IT ...........
"Resolution"
WOW SAD Day for you !
Go bash someones post, who can tolerate your Ignorance! and leave mine alone
Sincerely
Ric H. (a1yet)
I like you, pal! That's the spirit!
Forget the haters, dude; there are many around!
r916 said:
And the iPad 2's processor is made by Samsung.
Sent from HTC EVO
I don't know about it being made by Samsung, but the CPU (the CPU itself, not the whole chip) is larger than the other CPUs, thus having more space for more transistors. That significantly boosts performance.

Related

[INFO/Q] HTC Sensation only 1900 points with

Smartbench 2011 Productivity test
http://smartphonebenchmarks.com/ind...11:Productivity&filter_cpu=all&filter_gpu=all
The GPU score I might understand why it's low, because of the high resolution, but why is the Productivity score so low?
I guess HTC didn't put in faster NAND ROM.
The Evo 3D did 2000.
Does someone maybe know what the problem or cause is?
Proz00 said:
Smartbench 2011 Productivity test
http://smartphonebenchmarks.com/ind...11:Productivity&filter_cpu=all&filter_gpu=all
The GPU score I might understand why it's low, because of the high resolution, but why is the Productivity score so low?
I guess HTC didn't put in faster NAND ROM.
The Evo 3D did 2000.
Does someone maybe know what the problem or cause is?
The reason is...
The CPU is a Cortex-A8.
Tegra 2 and the new Samsung processors are Cortex-A9.
The Cortex-A9 is a PRETTY big improvement over the Cortex-A8.
Once again HTC is going for garbage hardware.
What is in the Sensation is two Desire HD CPUs OC'd to 1.2 GHz + a better GPU.
What is in the SGS2 is two MUCH better Hummingbird CPUs OC'd to 1.2 + a MUCH better GPU.
The CPU is neither a Cortex-A8 nor a Cortex-A9. It will provide plenty of performance and will be competitive with other dual cores.
The Adreno 220 GPU that comes with the Sensation is faster than the Mali GPU that comes with the SGS2, looking at preliminary tests done by AnandTech.
Whether it will be the fastest or slowest dual-core SoC will have to wait until it's released, and benchmarks often only tell part of the story. But it will certainly provide far more performance than any of the single-core SoCs we have right now and will give its owners much satisfaction.
kaiserkannon said:
The CPU is neither a Cortex-A8 nor a Cortex-A9. It will provide plenty of performance and will be competitive with other dual cores.
The Adreno 220 GPU that comes with the Sensation is faster than the Mali GPU that comes with the SGS2, looking at preliminary tests done by AnandTech.
Whether it will be the fastest or slowest dual-core SoC will have to wait until it's released, and benchmarks often only tell part of the story. But it will certainly provide far more performance than any of the single-core SoCs we have right now and will give its owners much satisfaction.
Huh? I'm confused.
Is the CPU not based on ARM's Cortex-A8, just a slightly modified version? It is identical to the single-core Snapdragon in the Desire HD.
The benchmarks so far don't make it seem to be as competitive as the Tegra 2 OR Orion.
Samsung has said that the Mali-400 is MUCH faster than the current Hummingbird GPU. Current benchmarks say that it is in fact SLOWER...
I doubt Samsung would release the Orion with a GPU SLOWER than its previous gen... that just makes no sense. If that is the case then Tegra might be king. If the Mali-400 IS much better though, Samsung will have the best SoC.
The CPU in the Sensation is ROUGHLY... 2.4 GHz (1.2 GHz x 2 cores). Compare that to the Desire HD's stable OC of 1.8 GHz.
What is left to be seen is how much the CPU can be OC'd.
I would think that it would be less than 1.8 GHz per core. But that's yet to be seen.
Regardless of what you think... the HTC Sensation's CPU will be slower than the competition's.
EDIT: Forgot to mention that the Sensation's CPU should have the same battery life as the current single-core Snapdragon... however it is pushing more pixels, so...
Samsung should have mated its Orion to the Hummingbird's GPU. Hummingbird was great.
Sent from my MB860 using XDA App
Maedhros said:
The benchmarks so far don't make it seem to be as competitive as the Tegra 2 OR Orion.
Dunno where you got your information from, but it's very competitive with the Tegra 2 (the 8660 is the CDMA version of the Sensation's 8260). From these benchmarks, we also know that an overclock of at least 1.5 GHz will be perfectly viable; the chip was designed for that anyhow.
Debating A8 vs A9 is a trivial matter, because it's a tiny fraction of the entire picture.
Wondering if CM7 can help the score.
First, that AnandTech benchmark is not a good measuring stick. AnandTech benched the MDP, which had the 8660 running at 1.5 GHz and 800x480, so the results are higher than what the Sensation can achieve, because the Sensation runs at a lower clock and a higher resolution.
Second, the Qualcomm 8260/8660 is a Cortex-A8. Tegra 2, OMAP4 and Exynos are Cortex-A9 based. Claims that Qualcomm doesn't use the ARM architecture are a lie.
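To put a rough number on that first point, here is a back-of-the-envelope sketch (illustrative Java, not from the thread). It assumes fps scales linearly with clock and inversely with pixel count, which is only a crude approximation; real GPUs don't scale that cleanly, and the Adreno 220's clock isn't strictly tied to the CPU clock, so treat the output as a ballpark.

Code:
public class FpsScaling {
    public static void main(String[] args) {
        double mdpFps = 38.0;          // MDP's GLBenchmark 2.0 Egypt score (from this thread)
        double mdpClock = 1.5;         // GHz, MDP
        double senClock = 1.2;         // GHz, Sensation's shipping clock
        double mdpPixels = 800 * 480;  // WVGA
        double senPixels = 960 * 540;  // qHD

        // Crude model: fps ~ clock / pixels.
        double estimate = mdpFps * (senClock / mdpClock) * (mdpPixels / senPixels);
        System.out.printf("Rough Sensation estimate: %.1f fps%n", estimate); // ~22.5
    }
}

Under those (generous) assumptions, the Sensation would land in the low twenties on Egypt rather than the MDP's 38 fps, which is exactly why the MDP numbers flatter it.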
Never trust Smartbench. Period.
GLBenchmark is more trustworthy.
Sent via psychic transmission.
t-mizzle said:
First, that AnandTech benchmark is not a good measuring stick. AnandTech benched the MDP, which had the 8660 running at 1.5 GHz and 800x480, so the results are higher than what the Sensation can achieve, because the Sensation runs at a lower clock and a higher resolution.
Second, the Qualcomm 8260/8660 is a Cortex-A8. Tegra 2, OMAP4 and Exynos are Cortex-A9 based. Claims that Qualcomm doesn't use the ARM architecture are a lie.
The Scorpion core in Snapdragon SoCs uses the ARMv7 instruction set that both the A8 and A9 use, but it is not an A8 or an A9; it is Qualcomm's own design.
And personally I like comparing the different chips in these phones at the same resolution, to see which chip has better performance on a level playing field. But yeah, the Sensation will have a bit worse performance thanks to the higher resolution, like the Atrix vs the Optimus 2X. To me, though, the higher resolution is completely worth the hit in performance.
TeroZ said:
Never trust Smartbench. Period.
Would you care to elaborate on this please?
GLBenchmark is more trustworthy.
Sent via psychic transmission.
GLBench is a decent 3D benchmark app, but it is just that: it tests only the GPU. Smartbench was designed to test both the CPU (including dual-core ones) and the GPU, hence reporting two numbers. IMO, you are not comparing apples to apples unless you were only referring to the GPU portion of the test.
kaiserkannon said:
The Scorpion core in Snapdragon SoCs uses the ARMv7 instruction set that both the A8 and A9 use, but it is not an A8 or an A9; it is Qualcomm's own design.
And personally I like comparing the different chips in these phones at the same resolution, to see which chip has better performance on a level playing field. But yeah, the Sensation will have a bit worse performance thanks to the higher resolution, like the Atrix vs the Optimus 2X. To me, though, the higher resolution is completely worth the hit in performance.
Stop spreading FUD. The MSM8260/8660 is not capable of out-of-order execution. The Cortex-A9 supports this feature; the A8 does not.
The MSM8260/8660's pipeline depth is 13 stages, therefore it's clearly a Cortex-A8.
The A9 was the successor to the A8, and it's a significant improvement over it.
t-mizzle said:
Stop spreading FUD. The MSM8260/8660 is not capable of out-of-order execution. The Cortex-A9 supports this feature; the A8 does not.
The MSM8260/8660's pipeline depth is 13 stages, therefore it's clearly a Cortex-A8.
The A9 was the successor to the A8, and it's a significant improvement over it.
Qualcomm disagrees with you though. They state that it is not based on the A8, and that it has partial out-of-order execution. It also has a 128-bit-wide NEON data path for NEON instructions, in comparison to the 64-bit-wide path in A8 and A9 designs. While there are some similarities to the A8, as you pointed out, Scorpion is not Qualcomm's implementation of an A8, and it has some advantages over both the A8 and A9, and some disadvantages against the A9. Overall the A9 will probably be a bit faster clock for clock, but the Scorpion cores in the dual-core Snapdragons are clocked faster.
This is very much the same as AMD and Intel: they both use the same instruction set (x86), but their processors are not the same. Qualcomm simply licenses the instruction set (ARMv7) and builds its own processor, while other companies like Nvidia, TI, and Samsung license the Cortex-A8 or A9 design from ARM and build an implementation of it.
Acei said:
Would you care to elaborate on this please?
GLBench is a decent 3D benchmark app, but it is just that: it tests only the GPU. Smartbench was designed to test both the CPU (including dual-core ones) and the GPU, hence reporting two numbers. IMO, you are not comparing apples to apples unless you were only referring to the GPU portion of the test.
You are right. But Smartbench ranking Scorpion + Adreno 205 lower than the DX with its SGX530 is definitely nonsense.
For GPU, go with GLBenchmark or NenaMark or An3DBench, anything but Smartbench.
For CPU, crunching pi or Linpack is more reliable.
Smartbench does not reflect any real-world performance.
Sent via psychic transmission.
Thracks said:
Dunno where you got your information from, but it's very competitive with the Tegra 2 (the 8660 is the CDMA version of the Sensation's 8260). From these benchmarks, we also know that an overclock of at least 1.5 GHz will be perfectly viable; the chip was designed for that anyhow.
Debating A8 vs A9 is a trivial matter, because it's a tiny fraction of the entire picture.
Based on the GLBenchmark score, the Anand tests might be suspect. It scored 6% higher than Tegra 2, not double like Anand's test. Or Qualcomm might be monkeying with things. If that is the case, I am going to have a big problem with Qualcomm products.
Maybe Smartbench is right and the NAND quality is poor?
The Sense experience on it wasn't done. It would have to score higher than the myTouch and previous devices; it's dual core. Most likely a crappy engineering build on it.
Sent from my HTC Glacier using XDA Premium App
TeroZ said:
You are right. But Smartbench ranking Scorpion + Adreno 205 lower than the DX with its SGX530 is definitely nonsense.
There are other benchmark apps that rank your combo in the same order as Smartbench in graphical tests. Plus, please do look at the productivity tests for Smartbench 2011 more carefully: typical Scorpion-based phones score slightly higher than the DX. Even a game like Dungeon Defenders (a graphically heavy game) ranks both as "mid-range", while ranking the Galaxy S series as "high-end".
For GPU, go with GLBenchmark or NenaMark or An3DBench, anything but Smartbench.
For CPU, crunching pi or Linpack is more reliable.
Smartbench does not reflect any real-world performance.
Calculating pi is a very, very simple, narrow, one-dimensional test. Linpack is heavy on floating-point calculations. If that is what you want to know, then I have no issues with that. But do your day-to-day tasks translate to pure floating-point calculations on your phone? They don't. That's why I've included several tests, and will be including more as new versions are released in the future. Plus, I believe none of them uses more than one core.
I'm open to suggestions and criticisms - but please do provide more details.
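That last point is easy to demonstrate. Below is a minimal sketch (hypothetical code, nothing to do with Smartbench's internals) of why a single-threaded test cannot see a second core: split the same busy-work across two threads and the wall-clock time roughly halves on a dual-core chip, while a one-thread run reports the same number on single- and dual-core phones alike.

Code:
import java.util.concurrent.*;

public class CoreScaling {
    // Deliberately simple CPU-bound busy work.
    static double spin(long iterations) {
        double x = 0;
        for (long i = 1; i <= iterations; i++) x += Math.sqrt(i);
        return x;
    }

    static long timeWithThreads(int threads, long totalIters) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        long start = System.nanoTime();
        Future<?>[] futures = new Future<?>[threads];
        for (int t = 0; t < threads; t++)
            futures[t] = pool.submit(() -> spin(totalIters / threads));
        for (Future<?> f : futures) f.get();
        pool.shutdown();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        long iters = 200_000_000L;
        // On a dual-core CPU the second run takes roughly half as long;
        // a single-threaded benchmark would only ever see the first number.
        System.out.println("1 thread:  " + timeWithThreads(1, iters) + " ms");
        System.out.println("2 threads: " + timeWithThreads(2, iters) + " ms");
    }
}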
The latest benchmarks, made by a retail GSII which has an Orion Exynos, speak for themselves:
http://forum.xda-developers.com/showpost.php?p=13096662&postcount=383
The Exynos at "only" 1.2 GHz is even better than the 1.5 GHz Scorpion chip with the Adreno 220, as it scores 41 fps whereas the latter scores 38 fps in the GLBenchmark Egypt standard test.
http://images.anandtech.com/graphs/graph4243/36161.png
http://nsa25.casimages.com/img/2011/04/21/110421112944690206.png
So the HTC Sensation, which is clocked lower at 1.2 GHz and has a higher resolution, will look like shayt; the SGSII with Exynos will rule for a long, long time...
touness69 said:
The latest benchmarks, made by a retail GSII which has an Orion Exynos, speak for themselves:
http://forum.xda-developers.com/showpost.php?p=13096662&postcount=383
The Exynos at "only" 1.2 GHz is even better than the 1.5 GHz Scorpion chip with the Adreno 220, as it scores 41 fps whereas the latter scores 38 fps in the GLBenchmark Egypt standard test.
http://images.anandtech.com/graphs/graph4243/36161.png
http://nsa25.casimages.com/img/2011/04/21/110421112944690206.png
So the HTC Sensation, which is clocked lower at 1.2 GHz and has a higher resolution, will look like shayt; the SGSII with Exynos will rule for a long, long time...
Thanks for this.
Looks like this is another HTC phone with a disappointing CPU & GPU

[Q] Mali-400 MP vs Adreno 220 vs ULP GeForce vs SGX 540?

In the market for an upgrade. I currently have an HTC Desire running CM7 and it's great, no lag in everyday use, OC'd to 1.1 GHz; the only thing is that the graphics aren't great, due to the outdated Adreno 200 GPU. I get higher Quadrant figures than my gf's Galaxy S, but my phone is a joke in the graphics test compared to the Samsung.
So for me the real limitation is the GPU, and I want the best available product this time.
The phones I have in mind (in order of preference) are:
Samsung Galaxy S II (Mali-400 MP) (480x800)
HTC Evo 3D (Adreno 220) (540x960)
LG Optimus 3D (SGX 540) (480x800)
Motorola Atrix 4G (ULP GeForce) (540x960)
AnandTech benchmark on OC'd Adreno 220, SGX 540 & GeForce
AnandTech benchmark on Mali-400 MP
Based partly on the benchmarks and lots and lots of forum/thread searching, my guess is Mali-400 MP > SGX 540 > Adreno 220 > GeForce ULP. Is that about right?
What do you guys think is the best, and what should I go for?
Extra Q: is the SGX 543 expected in any Android devices yet? Google brings up nothing.
Nope... wrong.
SGX 543 = Adreno 220 >> Mali-400 >> GeForce ULP [Tegra 2] >> SGX 540 >> Adreno 205 >> SGX 535 >> Adreno 200 = SGX 530
I'm not sure, but as per my knowledge the Adreno 220 is more powerful, and it is going to be released in the Evo 3D and Sensation...
I'm planning to buy the HTC Sensation soon....
EDIT: Corrected the Mali-400's position..... [and even I am not sure about the Adreno 220]
[Even my Xperia runs on the Adreno 200..... craving for a good GPU too]
neelpatel007 said:
Adreno 220 >> GeForce ULP [Tegra 2] >> SGX 540 = Mali-400 >> Adreno 205 >> SGX 535 >> Adreno 200 = SGX 530
I'm not sure, but as per my knowledge the Adreno 220 is more powerful, and it is going to be released in the Evo 3D and Sensation...
I'm planning to buy the HTC Sensation soon....
[Even my Xperia runs on the Adreno 200..... craving for a good GPU too]
Yep, that's about right. I have a Desire HD and am going to buy the Sensation when it hits the market.
A couple of things though. First, the SGX 540 in the LG Optimus 3D is an overclocked version of the one in the Galaxy S. Second, looking at these benchmarks is unfair to the GeForce and the Adreno 220, because the Atrix and Evo 3D/Sensation have a resolution of 960x540 as opposed to the 800x480 of the others, as is written in the OP. That is 35% more pixels, which makes things harder for the GPU.
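The 35% figure checks out; here is a quick verification (illustrative snippet, not from the thread):

Code:
public class PixelMath {
    public static void main(String[] args) {
        int qhd = 960 * 540;   // Atrix, Evo 3D, Sensation: 518,400 pixels
        int wvga = 800 * 480;  // Galaxy S II, Optimus 3D: 384,000 pixels
        // 518400 / 384000 = 1.35, i.e. qHD pushes 35% more pixels per frame.
        System.out.printf("qHD/WVGA pixel ratio: %.2f%n", (double) qhd / wvga);
    }
}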
Sorry for the question, but what does a GPU do?
Just said:
Sorry for the question, but what does a GPU do?
It accelerates graphics and lets you play games smoothly (GPU = Graphics Processing Unit). Unless you're an avid mobile gamer, any of the newer ones are just fine.
But why would Samsung switch from the SGX540 used in the Galaxy S to the Mali-400 MP in the Galaxy S2 if it wasn't better? I know comparing qHD screens to WVGA screens is slightly unfair, but I've heard the Galaxy S2's S-AMOLED+ display is one of the best on the market.
In addition, why would Samsung use the Exynos SoC as opposed to the Tegra 2 if it weren't as good?
The Optimus 3D beats the Optimus 2X in the benchmarks, so I think it's fair to say the SGX540 is better than the Tegra's GeForce.
viva.fidel said:
But why would Samsung switch from the SGX540 used in the Galaxy S to the Mali-400 MP in the Galaxy S2 if it wasn't better? I know comparing qHD screens to WVGA screens is slightly unfair, but I've heard the Galaxy S2's S-AMOLED+ display is one of the best on the market.
In addition, why would Samsung use the Exynos SoC as opposed to the Tegra 2 if it weren't as good?
The Optimus 3D beats the Optimus 2X in the benchmarks, so I think it's fair to say the SGX540 is better than the Tegra's GeForce.
That's exactly my point: Samsung didn't change to a worse GPU. The Optimus 3D uses an overclocked version of the SGX540. I think that, if they were running at the same frequencies, the Mali would outperform it, but we can't be sure.
Bottom line? If you want hardcore gaming, go for either the Adreno 220 or the GeForce ULP. The others are just fine.
Remember, benchmarks aren't everything, and Samsung is upping to 1.2 GHz per core. Anyway, I like HTC, so I'll stick to myTouches; the myTouch 4G Slide for me.
http://pocketnow.com/android/t-mobile-mytouch-4g-slide-htc-doubleshot-full-specs
Sent from my HTC Glacier using Tapatalk
tolis626 said:
That's exactly my point: Samsung didn't change to a worse GPU. The Optimus 3D uses an overclocked version of the SGX540. I think that, if they were running at the same frequencies, the Mali would outperform it, but we can't be sure.
Bottom line? If you want hardcore gaming, go for either the Adreno 220 or the GeForce ULP. The others are just fine.
Hey, yeah, I looked into this, and the Optimus 3D uses the SGX540 @ 300 MHz with an updated driver, compared to the SGS @ 200 MHz.
Guess I'm just gonna have to wait until a few more benchmarks of the Mali emerge, as AnandTech's were from a prototype device.
Wow, they're fast. The 220 is hitting 90 fps when my Z only gets 25.
viva.fidel said:
Hey, yeah, I looked into this, and the Optimus 3D uses the SGX540 @ 300 MHz with an updated driver, compared to the SGS @ 200 MHz.
Guess I'm just gonna have to wait until a few more benchmarks of the Mali emerge, as AnandTech's were from a prototype device.
See? Told 'ya!
Anyway, I tend to believe that the new generation's strongest beast (they are all beasts, after all) is the Adreno 220. Couple that with the fact that the GPU gets overclocked together with the CPU (it's integrated); if you overclock it at a healthy frequency (1.8 GHz anyone? Yeah, I'm THAT mad!) you will get dreamy performance.
Anyway, the Sensation's benchmarks are just from prototypes, so we have yet to see its true potential.
Sure thing is they are all very damn fast, and everything will be even faster when multi-core-optimized apps hit the market. Until then we can only sit back and enjoy the show. I for one am torn between the Galaxy S 2 and the Sensation. Which one I'll get? Dunno yet. If it wasn't for the Super AMOLED Plus screen I wouldn't even be thinking about it, but the screen on those things is damn beautiful. On the other hand, the Sensation has a qHD screen... Why, oh why are they so expensive? I'd get both otherwise!
tolis626 said:
See? Told 'ya!
Anyway, I tend to believe that the new generation's strongest beast (they are all beasts, after all) is the Adreno 220. Couple that with the fact that the GPU gets overclocked together with the CPU (it's integrated); if you overclock it at a healthy frequency (1.8 GHz anyone? Yeah, I'm THAT mad!) you will get dreamy performance.
Anyway, the Sensation's benchmarks are just from prototypes, so we have yet to see its true potential.
Sure thing is they are all very damn fast, and everything will be even faster when multi-core-optimized apps hit the market. Until then we can only sit back and enjoy the show. I for one am torn between the Galaxy S 2 and the Sensation. Which one I'll get? Dunno yet. If it wasn't for the Super AMOLED Plus screen I wouldn't even be thinking about it, but the screen on those things is damn beautiful. On the other hand, the Sensation has a qHD screen... Why, oh why are they so expensive? I'd get both otherwise!
As a Desire HD owner myself, I'm in the same boat as you, my friend. My only gripes with the SGS2 are the non-qHD screen and the comparatively weak GPU, especially since their last-gen SGS GPU was miles ahead of everyone else's. There's also the option to get the Tegra 2-powered SGS2, but I think that one does not have the SAMOLED+ display, and I'm also not sure if the Tegra 2 unit will be OC'd to 1.2. I'm seriously considering the Evo 3D GSM now, but not only does it have no built-in memory, it's also coming out later, and the SGS 2 is already out.
Benchmark: Mali 400 (Samsung S II) > Adreno 220 (HTC Sensation)
"Smartbench 2011 is one of the few synthetic benchmark tests that are able to measure multicore performance, so these results are for both CPU cores at work. The Samsung Galaxy S II steadily beats the HTC Pyramid in both the productivity (3732 vs 1898 points), and GPU tests (2431 vs 1426 points)."
But it also notes:
"Bear in mind that the Galaxy S II that the test was run on, seems to be the final version, with the 1.2GHz Exynos chipset, and the HTC Pyramid was the internal name of the HTC Sensation when it was still a prototype, so we’ll wait for the retail version to pass judgement on the 1.2Ghz Snapdragon chipset inside."
The Mali seems to be doing pretty damn well.
http://www.glbenchmark.com/phonedet...2&testgroup=overall&benchmark=glpro20&var=top
And what matters is real-life performance. See any vid of the Galaxy S 2 and it's buttery smooth, even while playing HD Flash in the browser.
tolis626 said:
See? Told 'ya!
Anyway, I tend to believe that the new generation's strongest beast (they are all beasts, after all) is the Adreno 220. Couple that with the fact that the GPU gets overclocked together with the CPU (it's integrated); if you overclock it at a healthy frequency (1.8 GHz anyone? Yeah, I'm THAT mad!) you will get dreamy performance.
Anyway, the Sensation's benchmarks are just from prototypes, so we have yet to see its true potential.
Sure thing is they are all very damn fast, and everything will be even faster when multi-core-optimized apps hit the market. Until then we can only sit back and enjoy the show. I for one am torn between the Galaxy S 2 and the Sensation. Which one I'll get? Dunno yet. If it wasn't for the Super AMOLED Plus screen I wouldn't even be thinking about it, but the screen on those things is damn beautiful. On the other hand, the Sensation has a qHD screen... Why, oh why are they so expensive? I'd get both otherwise!
The qHD display isn't really that much sharper than the Super AMOLED Plus display, but it does decrease performance.
And the Super AMOLED also has: better colors, true blacks (which also save power), 1 ms response (good when you watch movies), a 180-degree viewing angle, and it responds better to your touches.
Did I mention that it was thin?
Also, the Exynos chip is stronger.
I believe that the CPU is customized by Intrinsity, but I can't confirm that.
And the Mali-400MP will probably get better drivers. (Not that it would lag anyway.)
The Galaxy S 2 is also thinner.
I think it's the best choice.
The HTC Sensation is awesome, but the Galaxy S 2 is the best phone on the market.
But on one final note: I recommend that you wait for the Nexus 3 that comes around Christmas.
It's rumored to have the Tegra 3 chip inside (quad-core).
tolis626 said:
On the other hand, the Sensation has a qHD screen...
But it's a PenTile display...
Before I knew this, the Sensation was also a candidate for me, along with the S2.
But now the S2 will be my next phone after my Desire, I guess.
Hi,
Even I changed my mind from the HTC Sensation to the SGS2... after watching such low benchmark performance from the HTC Sensation. See it here:
http://www.youtube.com/watch?v=6RHziztN2gs
And after watching the superb clarity of the Super AMOLED display against HTC's blurry screen... I'm in love with the SGS2......
See it here... http://www.youtube.com/user/SlashGear?ob=5#p/u/0/p3axSZX1R_s
I don't know what the HTC people did with the Adreno 220, and I have no clue why its benchmarks are low......
So an all-time Sensation lover has now moved to the SGS2.... SGS2 rocks....
See the FPS in the second video; the SGS2 is far better in 2D and 3D performance too..... HTC is s**king big time.... big disappointment....
Mali-400 < SGX 540 < ULP < Adreno 220
The Mali is best, but the problem is some games are only compatible with Tegra's ULP.
Any PC modder knows resolution is a huge tax on graphics. Benchmarks don't mean much at different resolutions, not to mention frame-rate caps. Mali might have a slight edge on Adreno in a fair test, but not enough to swing a decision, I don't think. Screen and build quality will be the biggest variance. IMO HTC wins build quality, and the higher-res screen is better for text/browsing. Samsung's lower-res (and crazy-good-looking) screen will always lead to higher frame rates, so they win for games/vids. I'll be getting the Sensation the day it's released (I work for T-Mob; the decision was kinda made for me).
bobbymokie said:
Any PC modder knows resolution is a huge tax on graphics. Benchmarks don't mean much at different resolutions, not to mention frame-rate caps. Mali might have a slight edge on Adreno in a fair test, but not enough to swing a decision, I don't think. Screen and build quality will be the biggest variance. IMO HTC wins build quality, and the higher-res screen is better for text/browsing. Samsung's lower-res (and crazy-good-looking) screen will always lead to higher frame rates, so they win for games/vids. I'll be getting the Sensation the day it's released (I work for T-Mob; the decision was kinda made for me).
Can you give details on the myTouch 4G Slide?
Sent from my HTC Glacier using Tapatalk

[INFO] Mali-400MP GPU vs Adreno 220 GPU

Mali-400 MP is a GPU (Graphics Processing Unit) developed by ARM in 2008. Mali-400 MP supports a wide range of uses, from mobile user interfaces to smartbooks, HDTV and mobile gaming. Adreno 220 is a GPU developed by Qualcomm in 2011, and it is a component of the MSM8260/MSM8660 SoC (System-on-Chip) powering the upcoming HTC EVO 3D, HTC Pyramid and HP's TouchPad tablet.
Mali™-400 MP
Mali™-400 MP is the world's first OpenGL ES 2.0 conformant multi-core GPU. It provides support for vector graphics through OpenVG 1.1 and 3D graphics through OpenGL ES 1.1 and 2.0, thus providing a complete graphics acceleration platform based on open standards. Mali-400 MP is scalable from 1 to 4 cores. It also provides the AMBA® AXI interface industry standard, which makes the integration of Mali-400 MP into SoC designs straightforward; this also provides a well-defined interface for connecting Mali-400 MP to other bus architectures. Further, Mali-400 MP has a fully programmable architecture that provides high-performance support for both shader-based and fixed-function graphics APIs. Mali-400 MP has a single driver stack for all multi-core configurations, which simplifies application porting, system integration and maintenance. Features provided by Mali-400 MP include advanced tile-based deferred rendering and local buffering of intermediate pixel states, which reduce memory bandwidth overhead and power consumption; efficient alpha blending of multiple layers in hardware; and Full Scene Anti-Aliasing (FSAA) using rotated-grid multisampling, which improves graphics quality and performance.
Adreno 220
In 2011 Qualcomm introduced the Adreno 220 GPU as a component of their MSM8260/MSM8660 SoC. The Adreno 220 supports console-quality 3D graphics and high-end effects such as vertex skinning, full-screen post-processing shader effects, dynamic lighting with full-screen alpha blending, real-time cloth simulation, advanced shader effects like dynamic shadows, god rays, bump mapping and reflections, and 3D animated textures. Qualcomm also claims that the Adreno 220 can process 88 million triangles per second and offers twice the processing power of its predecessor, the Adreno 205, boosting performance to a level that is competitive with console gaming systems. The Adreno 220 is also claimed to allow running games, the UI, navigation apps and the web browser at the largest display sizes with the lowest power levels.
Difference between Mali-400MP GPU and Adreno 220 GPU
Based on research done by Qualcomm using an average of industry benchmarks composed of Neocore, GLBenchmark, 3DMM and NenaMark, they claim that the Adreno 220 GPU in Qualcomm's dual-core Snapdragon MSM8660 offers twice the performance of the GPU in other leading dual-core Cortex-A9-based chips. Also, the site AnandTech has run several tests on the Adreno 220 GPU. One of them was GLBenchmark 2.0, which records the performance of OpenGL ES 2.0 compatible devices (such as Mali™-400 MP parts) using two long suites that include a combination of different effects such as direct lighting; bump, environment and radiance mapping; soft shadows; vertex-shader-based texturing; deferred multi-pass rendering; and texture noise. The test showed the Adreno 220 GPU to be 2.2 times faster than other existing devices, such as those using the Mali-400 MP GPU.
What do you guys think of this?? I've been with HTC since they started doing Android... and I have to say Android has come a long way, and so has the hardware....
thanks
Thanks, I've been looking for feedback regarding the Adreno 220 vs Mali-400 GPUs. Do you have any link or source to back this up?
I am looking forward to buying the Sensation, and my only concern is the Adreno 220 GPU and whether it is better than, equal to, or worse than the Mali-400.
If it is better then I'm definitely buying the Sensation; if not, then I might consider the Galaxy S II.
Thanks
Interesting read... the SGS2 fans will debate this and say that the benchmarks were done with an overclocked processor... so as of now it's all just a good read.
I don't know if this video helps, but the GPU upscaling and running on a larger screen with a higher resolution without falling behind is kind of amazing.
http://www.youtube.com/watch?v=RBBMVc9-fuk
boostedb16b said:
I don't know if this video helps, but the GPU upscaling and running on a larger screen with a higher resolution without falling behind is kind of amazing.
http://www.youtube.com/watch?v=RBBMVc9-fuk
Nice post! I'm completely impressed. Amazing that that was being upconverted from a friggin' cell phone to a large HDTV!
I'm sold!
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
boostedb16b said:
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
Actually, they went with the Adreno 205 in the Xperia Play, which is the same GPU as in the Desire HD.
Sent from my HTC Desire HD using XDA Premium App
boostedb16b said:
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
Isn't it the Adreno 205?
http://www.gsmarena.com/sony_ericsson_xperia_play-3608.php
The Adreno 220 is faster than the Mali-400MP any day, and compared to Tegra 2 it's better in some cases and worse in others. I didn't get a Galaxy S2 due to the fact that the Mali-400MP is so antiquated...
According to reports on the SGS 2 forum, the Mali-400 doesn't support textures? This seems a big blow to the SGS 2's gaming capabilities to me. What would be an interesting comparison would be the Adreno 220 vs the (LG Optimus 3D's) OMAP4 PowerVR SGX540 GPU, which is clocked at 300 MHz instead of the 200 MHz found in the Samsung Hummingbird processor....
Ian
Beaker491 said:
According to reports on the SGS 2 forum, the Mali-400 doesn't support textures? This seems a big blow to the SGS 2's gaming capabilities to me. What would be an interesting comparison would be the Adreno 220 vs the (LG Optimus 3D's) OMAP4 PowerVR SGX540 GPU, which is clocked at 300 MHz instead of the 200 MHz found in the Samsung Hummingbird processor....
Ian
The Mali-400 DOES support textures; it's texture compression that it does not support.
Elchemist said:
The Mali-400 DOES support textures; it's texture compression that it does not support.
It does support texture compression, just not the proprietary formats of other GPU vendors. Developers that only code for proprietary formats get locked in and lose out when something better comes along.
I surely was waiting for an idiot response like yours, i9100. Just face it: Samsung made a mistake with the Mali... And to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation... And the rush has shown what I expected: lots of unknown problems in the phone, and then returns for phones with faulty screens and so on. So don't be a troll; state facts rather than just talk because you have the device.
The Mali-400 is just an outdated chip; why someone wouldn't include texture compression in a MOBILE GPU is beyond me. What the hell were they thinking?!! The PS3 only gets away without it because it has Blu-ray-sized storage media! Ridiculous.
As stated, it DOES support texture compression. There are just a few formats, and it only supports one of those. It won't prove a problem, as it seems the SGS2 is going to be extremely popular, and all game devs will support it eventually (most do already).
boostedb16b said:
I surely was waiting for an idiot response like yours, i9100. Just face it: Samsung made a mistake with the Mali... And to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation... And the rush has shown what I expected: lots of unknown problems in the phone, and then returns for phones with faulty screens and so on. So don't be a troll; state facts rather than just talk because you have the device.
Wow, that was a slight overreaction! Do you work for HTC or something? Because that was a very defensive reply.
i9100 actually speaks some truth. There is a very informed topic over on the SGS2 forum about this, which I suggest you guys read before making any more statements which are incorrect.
Still, looking forward to seeing this phone released, to see if it's as impressive as the S2 is. I'm hoping it will be.
boostedb16b said:
I surely was waiting for an idiot response like yours, i9100. Just face it: Samsung made a mistake with the Mali... And to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation... And the rush has shown what I expected: lots of unknown problems in the phone, and then returns for phones with faulty screens and so on. So don't be a troll; state facts rather than just talk because you have the device.
I wouldn't be surprised if that was the case. To Samsung, that's business as usual.
I'm biased, and this is not Samsung home turf, so I'll just keep to correcting facts. A little flaming is fine and not unexpected, but I won't go there. I'm enjoying my phone, and you should enjoy yours ;-)
In OpenGL ES 2.0 there is only one standard texture compression format: ETC. It's the only one you can rely on in all conformant GPUs. Others like ATITC, PVRTC and S3TC/DXTC are proprietary formats, not suitable if you want your app to run on new devices.
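For anyone curious how this plays out in practice, below is a minimal sketch of how an Android app can probe what a device actually supports: the proprietary formats via the GL extensions string, with ETC1 as the guaranteed fallback. The helper and extension names are real Android/GL identifiers, but the selection logic is illustrative glue code, not from any particular game.

Code:
import android.opengl.ETC1Util;
import android.opengl.GLES20;

public final class TextureSupport {
    // Must be called on a thread with a current GL context.
    public static String pickFormat() {
        String ext = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        // Prefer a vendor format when present (they handle alpha better than ETC1).
        if (ext.contains("GL_IMG_texture_compression_pvrtc")) return "PVRTC"; // PowerVR SGX
        if (ext.contains("GL_AMD_compressed_ATC_texture"))    return "ATC";   // Adreno
        if (ext.contains("GL_EXT_texture_compression_s3tc"))  return "S3TC";  // Tegra
        // ETC1 is the only format guaranteed by OpenGL ES 2.0 conformance.
        if (ETC1Util.isETC1Supported()) return "ETC1";
        return "uncompressed";
    }
}

A game that ships only PVRTC assets, for example, would hit the uncompressed (or broken) path on a Mali or Adreno device, which is the lock-in being described here.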
At the moment, since my HD2 has been retired, I opted to buy a Samsung Galaxy S 4G; worst mistake I have made in a long time... I bought a Sidekick 4G for my gf, and that phone was another example of how poor quality control is at Samsung, or how they rush to get a head start in the market... The phone is riddled with problems.
Currykiev said:
Wow, that was a slight overreaction! Do you work for HTC or something? Because that was a very defensive reply.
i9100 actually speaks some truth. There is a very informed topic over on the SGS2 forum about this, which I suggest you guys read before making any more statements which are incorrect.
Still, looking forward to seeing this phone released, to see if it's as impressive as the S2 is. I'm hoping it will be.
Lol, it's not as bad as what I have read in the SGS2 forum, where people reporting their problems and trying to get legitimate help are being bashed and called trolls, even though their phone honestly has a problem.

TEGRA 4 - 1st possible GLBenchmark! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch: he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the two board names: Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. The original poster had already deleted the results, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
AnandTech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, although the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali-T604 being an almost equal 2nd. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
UPDATE - Just to add to the feeling that this is legit: the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line."
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy!
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
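The throttling behaviour AnandTech observed is essentially a shared-budget control loop. Here is a toy sketch of one; the 4W cap and the 1700-to-800 MHz endpoints come from the article's numbers, while the frequency ladder, the f-squared power model and all the names are invented for illustration.

Code:
public class PowerCapGovernor {
    static final double BUDGET_W = 4.0;                          // shared cap from the article
    static final int[] CPU_FREQS_MHZ = {1700, 1400, 1100, 800};  // illustrative ladder

    // Toy model: CPU power ~ f^2, a crude stand-in for f*V^2 scaling.
    static double cpuPowerAt(int freqMhz) {
        double r = freqMhz / 1700.0;
        return 4.0 * r * r;
    }

    // Pick the highest CPU frequency whose draw, added to the GPU's, fits the budget.
    static int throttle(double gpuPowerW) {
        for (int f : CPU_FREQS_MHZ) {
            if (cpuPowerAt(f) + gpuPowerW <= BUDGET_W) return f;
        }
        return CPU_FREQS_MHZ[CPU_FREQS_MHZ.length - 1]; // floor
    }

    public static void main(String[] args) {
        System.out.println("GPU idle:    CPU at " + throttle(0.0) + " MHz"); // 1700
        System.out.println("GPU gaming:  CPU at " + throttle(3.0) + " MHz"); // 800
    }
}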
Back to the 5450, which should beat the A6X. Trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality; and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip; the iPhone 5's SoC is 96mm2 and the A6X is 123mm2. Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' its way to GPU domination. As an Android fan it makes me sad :crying:
32 fps is a no-go... let's hope it's not final.
hamdir said:
32 fps is a no-go... let's hope it's not final.
It needs to be, but it would be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5's GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5's GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire3D; I use it, and Tegra games work, with effects and without lag, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly, for the world's dominant GPU provider.
The first, Tegra 2: its GPU was a little bit better than the Galaxy S's SGX540 in benchmarks, but it lacked NEON support.
The second, Tegra 3: its GPU is nearly the same as the old Mali-400MP4 in the Galaxy S2/original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 also; only Sony, who has cross-licensing with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimization porting no longer works using Chainfire3D; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to work on other devices; equally, Chainfire3D is now outdated and no longer updated.
Now, about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita, and which eats a large die area, i.e. actual multi-core, each with its own subcores/shaders. If Tegra were used in a real multi-core configuration it would destroy all.
Finally, this is really funny, all this doom and gloom over an early, discarded development board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any final device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe T4 will only score 2x the T3 scores with 6x the shaders!!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much below expectations until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
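For reference, the widely reported shader counts behind that arithmetic: the ULP GeForce has 8 cores in Tegra 2, 12 in Tegra 3, and a reported 72 in Tegra 4. A quick sketch of the naive scaling (illustrative only; real GPUs never scale linearly with shader count, since clocks, memory bandwidth and drivers all intervene):

Code:
public class ShaderScaling {
    public static void main(String[] args) {
        int t2 = 8, t3 = 12, t4 = 72;      // ULP GeForce shader core counts
        double t3SpeedupObserved = 2.0;    // hamdir's figure: T3 ~2x T2

        // T3 delivered 2x with only 1.5x the shaders, so per-shader
        // throughput also improved by about 2.0 / 1.5 = 1.33x.
        double perShaderGain = t3SpeedupObserved / ((double) t3 / t2);

        // Even with zero per-shader improvement, 6x the shaders makes
        // "only 2x faster than T3" look pessimistic on paper.
        System.out.println("T4/T3 shader ratio: " + ((double) t4 / t3)); // 6.0
        System.out.println("T3 per-shader gain: " + perShaderGain);     // ~1.33
    }
}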
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR; it isn't just stuck with Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs. If Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus the single-channel on Tegra 2/3.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. Trouble is, it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality; and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip; the iPhone 5's SoC is 96mm2 and the A6X is 123mm2. Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' its way to GPU domination. As an Android fan it makes me sad :crying:
Well said mate!
I can understand how you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become a bottleneck and you will not be able to use the CPU at its full potential (at least when gaming).
I have a Galaxy S2 with a 1.2 GHz Exynos 4 and a Mali GPU overclocked to 400 MHz.
In my analysis, most modern games like MC4 and NFS:MW aren't running at 60 FPS at all. That's because the GPU always has a 100% workload while the CPU is coasting, putting out only 50-70% of its total workload.
I know some games aren't optimized for all Android devices, as opposed to Apple devices, but even so, high-end Android devices have slower GPUs (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a tweaked T-604 instead of the mighty T-658, and even that is still slower than the iPad 4's GPU.
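That GPU-bound pattern falls out of a simple frame-time model: the slower unit paces the frame while the other one idles. A rough Python sketch with hypothetical per-frame costs:

# Frame rate is set by the slower side of the pipeline; the other side waits.
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)      # the slower unit paces the frame
    fps = 1000.0 / frame_ms
    cpu_util = cpu_ms / frame_ms        # fraction of the frame the CPU is busy
    return fps, cpu_util

# Hypothetical GPU-bound case: GPU needs 25 ms/frame, CPU only 15 ms/frame.
fps, cpu_util = frame_stats(cpu_ms=15.0, gpu_ms=25.0)
print(f"{fps:.0f} FPS, CPU busy {cpu_util:.0%}")   # -> 40 FPS, CPU busy 60%

A 40 FPS cap with the CPU at 60% is exactly the signature described above: a faster CPU changes nothing until the GPU side gets quicker.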
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm² chip, the iPhone 5's A6 is 96mm², and the A6X is 123mm²; Apple can pack in more transistors and reap the GPU performance lead. Their chosen graphics supplier, Imagination Technologies, also has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as ARM's A15 but seems less power hungry; Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Click to expand...
Click to collapse
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing: what he stated isn't true. Samsung has licensed PowerVR, it isn't exclusive to Apple, it's just that Samsung prefers ARM's GPU solution. Another thing I dislike is how everyone compares a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will arrive in a phone. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make better games for Android, unlike the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just four more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm², just a bit bigger than Tegra 3 and smaller than the A6's 90mm²), along with dual-channel memory versus the single channel on Tegra 2/3.
Click to expand...
Click to collapse
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA, is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations of a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), OMG Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though both now have to drive similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle and physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, being one of the few titles to use the A6X's GPU fully. The downside of this relationship is the further fragmentation of the Android ecosystem; as Chainfire's app showed, most of the extra effects can run on non-Tegra devices.
Now, a six-times increase in shaders does not automatically mean that games and benchmarks will scale in linear fashion, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's Technical Marketing Manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times Tegra 3 (sketched below, after the video link). Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
From 50 seconds onwards:
https://www.youtube.com/watch?v=iC7A5AmTPi0
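As a back-of-the-envelope illustration of that non-linear scaling, treat each GPU stage as a throughput limit and take the minimum. The per-unit scaling factors below are placeholders, not Tegra 4's real TMU/ROP figures:

# A GPU-bound frame is paced by whichever unit runs out of throughput first.
def overall_speedup(shader_x, tmu_x, rop_x, bandwidth_x):
    """Estimated speed-up vs. the old chip, given per-unit scaling factors."""
    return min(shader_x, tmu_x, rop_x, bandwidth_x)

# Hypothetical Tegra 3 -> Tegra 4 factors: shaders x6, but texture units,
# ROPs and memory bandwidth grow far less, so games land nearer 3x,
# in line with the 3-4x figure quoted at CES.
print(overall_speedup(shader_x=6.0, tmu_x=4.0, rop_x=3.0, bandwidth_x=3.5))  # -> 3.0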
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Click to expand...
Click to collapse
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
hamdir said:
Tegra optimization porting no longer works using Chainfire; that is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices. Equally, Chainfire is now outdated and no longer updated.
Click to expand...
Click to collapse
Looks like they haven't updated Chainfire 3D for a while; as a result only T3 games don't work, while others like Riptide GP and Dead Trigger do. So it's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I might have been unfortunate too, but some Gameloft games lagged on the Tegra device I had, though root solved it to an extent.
I am not saying one thing is superior to another, just giving my personal experience. I might be wrong, I might not be.
TBH I think benchmarks don't matter much unless you see some difference in real-world usage, and that was my problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
Click to expand...
Click to collapse
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral between Google's and Apple's ecosystems, and I'm not wiping any scratches off my devices.

Adreno 418 GPU turn off

Why did they go with the Adreno 418 GPU when the Nexus 6, which came out a year ago, already has the Adreno 420?
Will you really be able to tell the difference? I doubt it. It's just a numbers game really.
Sent from my GT-I9300 using Tapatalk
They had no choice. They could choose the 805, the 808, or the 810. If they chose the 805, everyone would complain that it's a processor from 2014. If they chose the 810, everyone would complain that it will overheat and get crappy battery life. The 808 is the best choice for the least number of complaints. Yeah, it has a slightly slower GPU than the 805, but the CPU is much faster than the 805's, and even faster than the 810's in demanding situations, because the 810 will completely turn off its big cores if it gets too warm, whereas the 808 doesn't get hot enough to need to turn off its big cores and switch to the LITTLE ones.
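That big-core shutdown is essentially a thermal trip point; a toy Python version of the policy (the threshold is invented, real trip points are board-specific):

# Toy thermal policy: past a trip point, park the big (A57) cluster
# and fall back to the LITTLE (A53) one. Threshold is made up.
BIG_CLUSTER_TRIP_C = 70.0

def active_clusters(temp_c):
    if temp_c >= BIG_CLUSTER_TRIP_C:
        return ["LITTLE (A53)"]                  # big cores switched off when hot
    return ["big (A57)", "LITTLE (A53)"]

print(active_clusters(55.0))   # an 808 that stays below the trip keeps both clusters
print(active_clusters(78.0))   # an 810 under sustained load crosses it, losing the big cores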
Geordie Affy said:
Will you really be able to tell the difference? I doubt it. It's just a numbers game really.
Sent from my GT-I9300 using Tapatalk
Click to expand...
Click to collapse
Cool story. If I use that logic, my old LG G2 should be enough.
Sent from my LG-D800
gtg465x said:
They had no choice. They could choose the 805, the 808, or the 810. If they chose the 805, everyone would complain that it's a processor from 2014. If they chose the 810, everyone would complain that it will overheat and get crappy battery life. The 808 is the best choice for the least number of complaints. Yeah, it has a slightly slower GPU than the 805, but the CPU is much faster than the 805's, and even faster than the 810's in demanding situations, because the 810 will completely turn off its big cores if it gets too warm, whereas the 808 doesn't get hot enough to need to turn off its big cores and switch to the LITTLE ones.
Click to expand...
Click to collapse
Yeah, but it sucks that the whole Android ecosystem has to depend on Qualcomm. Imagine if next year they screw up again... It seems like Samsung's CPUs rock this year, and Apple's too.
Sent from my LG-D800
ambervals6 said:
Cool story. If I use that logic, my old LG G2 should be enough.
Sent from my LG-D800
Click to expand...
Click to collapse
My point exactly, lol. Whatever phone you buy, it will be an upgrade in some way... all this numbers game is becoming a tad OTT.
Sent from my GT-I9300 using Tapatalk
ambervals6 said:
Yeah, but it sucks that the whole Android ecosystem has to depend on Qualcomm. Imagine if next year they screw up again... It seems like Samsung's CPUs rock this year, and Apple's too.
Sent from my LG-D800
Click to expand...
Click to collapse
Yep, it does suck. And it's a shame, because Qualcomm could have made one great SoC instead of two meh ones. If they were smart, they would have put the Adreno 430 GPU in the 808 and marketed that as their flagship phone SoC, and marketed the 810 as a tablet-only SoC, because tablets can better dissipate the heat. But none of that is Motorola's fault. I think Motorola chose wisely among the not-so-great choices they had.
Sent from my Nexus 6 using XDA Forums Pro.
gtg465x said:
Yep, it does suck. And it's a shame, because Qualcomm could have made one great SoC instead of two meh ones. If they were smart, they would have put the Adreno 430 GPU in the 808 and marketed that as their flagship phone SoC, and marketed the 810 as a tablet-only SoC, because tablets can better dissipate the heat. But none of that is Motorola's fault. I think Motorola chose wisely among the not-so-great choices they had.
Sent from my Nexus 6 using XDA Forums Pro.
Click to expand...
Click to collapse
Well, as long as there is no serious competition out there, Qualcomm will continue not to give a single **** and unfortunately upgrades will come in lame increments.
Sent from my LG-D800
I think it's funny how all the new 810 SoCs have their cores downclocked to 1.8 GHz.
Sent from my HTC6525LVW using Tapatalk
ambervals6 said:
Well, as long as there is no serious competition out there, Qualcomm will continue not to give a single **** and unfortunately upgrades will come in lame increments.
Sent from my LG-D800
Click to expand...
Click to collapse
Competition is coming. Qualcomm should be worried. http://www.androidpolice.com/2015/0...qualcomm-begun-a-long-slow-fall-from-the-top/
gtg465x said:
They had no choice. They could choose the 805, the 808, or the 810. If they chose the 805, everyone would complain that it's a processor from 2014. If they chose the 810, everyone would complain that it will overheat and get crappy battery life. The 808 is the best choice for the least number of complaints. Yeah, it has a slightly slower GPU than the 805, but the CPU is much faster than the 805's, and even faster than the 810's in demanding situations, because the 810 will completely turn off its big cores if it gets too warm, whereas the 808 doesn't get hot enough to need to turn off its big cores and switch to the LITTLE ones.
Click to expand...
Click to collapse
This isn't the full story, and it's a little misleading. Here are the technical details:
The 418 is as good as, if not better than, the 420 for the following reasons:
1. The 418 has the same "system specs" as the 420, minus the down-throttling.
2. The 418 was fabbed on a smaller process (20nm) vs. the 420 (28nm). This means greater power savings and less heat.
3. The 418/420 is to the 430 what the NVIDIA GTX 960 is to the GTX 980, but you won't get the 430 unless you get the 810.
Source: https://en.wikipedia.org/wiki/Adreno#Variants
640k said:
This isn't the full story, and it's a little misleading. Here are the technical details:
The 418 is as good as, if not better than, the 420 for the following reasons:
1. The 418 has the same "system specs" as the 420, minus the down-throttling.
2. The 418 was fabbed on a smaller process (20nm) vs. the 420 (28nm). This means greater power savings and less heat.
3. The 418/420 is to the 430 what the NVIDIA GTX 960 is to the GTX 980, but you won't get the 430 unless you get the 810.
Source: https://en.wikipedia.org/wiki/Adreno#Variants
Click to expand...
Click to collapse
I'm not sure I trust that Wikipedia article. There are no references cited for the 418 information. Looking at Anandtech, the Adreno 418 is slower in EVERY graphics benchmark than the Adreno 420, even though it has the advantage of being paired with a faster CPU.
Here's a quote from Anandtech: "In GFXBench, we can see that the Adreno 418 GPU is a definite step up from the Adreno 330 in the Snapdragon 801, but not quite at the level of the Snapdragon 805's Adreno 420."
Look at the benchmarks for yourself here; the Nexus 6 and Note 4 (SD 805 / Adreno 420) both beat the LG G4 (SD 808 / Adreno 418) in every single graphics and gaming test performed: http://www.anandtech.com/show/9379/the-lg-g4-review/7
So I think it's safe to say the 420 is a little better than the 418. I don't think they would have named it the 418 if it was just a die-shrunk 420. Usually a die shrink allows for faster clock speeds, and if a die shrink were the only difference, you would expect the 418 to match the performance of the 420, or even surpass it because the clock speed could go higher. That isn't the case, so I think there are some architectural differences as well that aren't shown in the Wiki article. I think Qualcomm naming it the 418 instead of the 422, even though it's newer, is a pretty good indication that Qualcomm knows it isn't as good as the 420.
gtg465x said:
I'm not sure I trust that Wikipedia article. There are no references cited for the 418 information. Looking at Anandtech, the Adreno 418 is slower in EVERY graphics benchmark than the Adreno 420, even though it has the advantage of being paired with a faster CPU.
Here's a quote from Anandtech: "In GFXBench, we can see that the Adreno 418 GPU is a definite step up from the Adreno 330 in the Snapdragon 801, but not quite at the level of the Snapdragon 805's Adreno 420."
Look at the benchmarks for yourself here; the Nexus 6 and Note 4 (SD 805 / Adreno 420) both beat the LG G4 (SD 808 / Adreno 418) in every single graphics and gaming test performed: http://www.anandtech.com/show/9379/the-lg-g4-review/7
So I think it's safe to say the 420 is a little better than the 418. I don't think they would have named it the 418 if it was just a die-shrunk 420. Usually a die shrink allows for faster clock speeds, and if a die shrink were the only difference, you would expect the 418 to match the performance of the 420, or even surpass it because the clock speed could go higher. That isn't the case, so I think there are some architectural differences as well that aren't shown in the Wiki article. I think Qualcomm naming it the 418 instead of the 422, even though it's newer, is a pretty good indication that Qualcomm knows it isn't as good as the 420.
Click to expand...
Click to collapse
Did anyone else notice how high the 2014 Moto X placed in those benchmarks? Motorola must really optimize the kernel.
Sent from my HTC6525LVW using Tapatalk
Positive spin time!
The 808's GPU handles games fine and consumes less power than the 7420's GPU (S6 & Note 5). I would rather have a GPU that handles games as-is than one that drains more battery; for a portable device I prefer the more power-economical GPU. There is a reason you see a lot of complaints about the S6's battery life and not about others': most of it correlates with people running GPU-heavy apps.
rushless said:
Positive spin time!
The 808's GPU handles games fine and consumes less power than the 7420's GPU (S6 & Note 5). I would rather have a GPU that handles games as-is than one that drains more battery; for a portable device I prefer the more power-economical GPU. There is a reason you see a lot of complaints about the S6's battery life and not about others': most of it correlates with people running GPU-heavy apps.
Click to expand...
Click to collapse
Lmao this guy
Sent from my A0001
What was comedic, besides my own awareness that it's spin? It's true that games perform fine on the 808 and that the 7420's GPU consumes more power. As for bigger, fancier games that need even more power: they're not very practical on a portable device anyway, so it's kind of moot with a small battery.
Sent from my SM-N910V using Tapatalk
If it's a concern, you should wait for the Nexus to drop with its rumored Snapdragon 820 and next-gen Adreno.
Also, on the issue of this year's Qualcomm products sucking, remember that market pressure forced them to release chips with generic ARM cores because their in-house 64-bit designs weren't ready. The 820 ditches the octa-core big.LITTLE architecture for a quad-core Qualcomm design. Lots to look forward to.
And I think the 808 is probably the best chip they could have picked for the X this year.
ambervals6 said:
Cool story. If I use that logic, my old LG G2 should be enough.
Sent from my LG-D800
Click to expand...
Click to collapse
It is, but you're all too spoiled to see it.
SchmidtA99 said:
I think it's funny how all the new 810 SoCs have their cores downclocked to 1.8 GHz.
Sent from my HTC6525LVW using Tapatalk
Click to expand...
Click to collapse
I think the 810 is way better than the 808: Adreno 430 vs 418, and the 430 is WAY better. And if the 810 gets too hot, you can always turn off two of the high-performance cores. But you can never have an Adreno 430 in the 808.
gtg465x said:
I'm not sure I trust that Wikipedia article. There are no references cited for the 418 information. Looking at Anandtech, the Adreno 418 is slower in EVERY graphics benchmark than the Adreno 420, even though it has the advantage of being paired with a faster CPU.
Here's a quote from Anandtech: "In GFXBench, we can see that the Adreno 418 GPU is a definite step up from the Adreno 330 in the Snapdragon 801, but not quite at the level of the Snapdragon 805's Adreno 420."
Look at the benchmarks for yourself here; the Nexus 6 and Note 4 (SD 805 / Adreno 420) both beat the LG G4 (SD 808 / Adreno 418) in every single graphics and gaming test performed: http://www.anandtech.com/show/9379/the-lg-g4-review/7
So I think it's safe to say the 420 is a little better than the 418. I don't think they would have named it the 418 if it was just a die-shrunk 420. Usually a die shrink allows for faster clock speeds, and if a die shrink were the only difference, you would expect the 418 to match the performance of the 420, or even surpass it because the clock speed could go higher. That isn't the case, so I think there are some architectural differences as well that aren't shown in the Wiki article. I think Qualcomm naming it the 418 instead of the 422, even though it's newer, is a pretty good indication that Qualcomm knows it isn't as good as the 420.
Click to expand...
Click to collapse
It's slower because Qualcomm halved the memory bus from 128-bit to 64-bit. The S810/A430 has the same bandwidth as the S805 because they doubled the speed of the RAM, so 128-bit LPDDR3-800 (1600 MHz effective) equals 64-bit LPDDR4-1600 (3200 MHz effective): 25.6 GB/s.
Unfortunately, Qualcomm limited the S808 to 64-bit LPDDR3-933 (1866 MHz effective): 14.9 GB/s.
The 418 and 420 are the same GPU architecturally. The 418 could probably be slightly faster in non-bandwidth-limited scenarios (low-resolution 3D).
Memory bandwidth dropped from 25.6 GB/s to 14.9 GB/s, a cut of roughly 40%, which lines up with the 418 trailing the 420 in high-resolution tests. Hence, it's a 418.
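Those figures follow directly from bus width times effective transfer rate; a quick sanity check in Python:

# Peak bandwidth (GB/s) = bus width in bytes x effective transfer rate (MT/s) / 1000.
def peak_bw_gbs(bus_bits, mts):
    return bus_bits / 8 * mts / 1000

print(peak_bw_gbs(128, 1600))  # S805: 128-bit LPDDR3-800  -> 25.6 GB/s
print(peak_bw_gbs(64, 3200))   # S810:  64-bit LPDDR4-1600 -> 25.6 GB/s
print(peak_bw_gbs(64, 1866))   # S808:  64-bit LPDDR3-933  -> ~14.9 GB/s, roughly 40% less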
