Why do Qualcomm still focus on increasing core count instead of IPC improvements? - General Questions and Answers

Hey guys.
Was recently reading up on Apple's new A9 processor and was slightly surprised to see it still uses just 2 cores. However, the single-core performance of these bad boys is monstrous! And in my opinion, far more important than the multi-core scores when compared to 8-core processors from Qualcomm and Samsung. It's no secret that iOS generally does a lot of things much quicker (including generally better graphical performance) and also generally uses less power to do so than an Android device. I generally don't like Apple's products anymore, and this won't change anything, but they do things so darn well it's difficult not to give them credit where credit is due. The fact that the iPhone 6S has a battery of about 1,715 mAh is insane when you consider that Android phones with comparable battery life generally have batteries around 2,500-2,800 mAh. That's a big difference. Is it largely down to iOS vs Android rather than the actual SoCs themselves? Or are Apple just amazing chip designers?
It reminds me of the AMD vs Intel debate, where AMD go for octa-core CPUs with super-high clock speeds that tick all the boxes on paper, but in reality are beaten by even dual-core offerings from Intel at lower or comparable clock speeds. Or when an 8 MP camera on an iPhone beats a 21 MP camera on an Android phone. Time and time again, manufacturers are more interested in the technical specs than in real-life performance.
Any ideas?
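To put a rough number on why single-core grunt can beat core count, here is a toy Java sketch. It treats single-thread speed as IPC times clock and applies Amdahl's law for the multi-core part; every figure in it (IPC values, clocks, the 50% parallel fraction) is an invented assumption purely for illustration, not a measurement of any real chip.
Code:
public class CoreCountVsIpc {
    // Toy first-order model: single-thread speed ~ IPC x clock, and multi-core
    // gains follow Amdahl's law for a given parallel fraction of the workload.
    // Every number here is a made-up assumption, not a real chip's spec.
    static double amdahlSpeedup(double parallelFraction, int cores) {
        return 1.0 / ((1.0 - parallelFraction) + parallelFraction / cores);
    }

    public static void main(String[] args) {
        double ipcWide = 3.0, clockWide = 1.8;     // hypothetical wide dual-core
        double ipcNarrow = 1.5, clockNarrow = 2.3; // hypothetical narrow octa-core
        double p = 0.5;  // assume half of a typical app's work parallelises

        double wide = ipcWide * clockWide * amdahlSpeedup(p, 2);
        double narrow = ipcNarrow * clockNarrow * amdahlSpeedup(p, 8);
        System.out.printf("wide 2-core:   %.2f (relative throughput)%n", wide);
        System.out.printf("narrow 8-core: %.2f (relative throughput)%n", narrow);
        // Prints ~7.20 vs ~6.13: with p = 0.5 the extra cores add little, so
        // the high-IPC design wins despite fewer cores and a lower clock.
    }
}

With only half the work parallelisable, the wide dual-core design comes out ahead despite having a quarter of the cores and a lower clock; the more of the workload that parallelises, the more the core count pays off.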

Apple doesn't have to care about specs to succeed, and they had Jim Keller (designer of the Athlon 64 and the Apple Cyclone, now back at AMD designing Zen). Their IPC is impressive.

Related

[Q] Snapdragon Vs Hummingbird

It seems like every review I read tells me the Hummingbird processor is a much better and more capable processor than the Snapdragon. If this is true, why hasn't HTC produced a phone with a Hummingbird CPU? HTC is way ahead in the Android game experience-wise, but they haven't seemed to make any major hardware changes other than upping the ROM, RAM and CPU speed. Would someone please tell me that Samsung's new CPU isn't as good as all the reviews say it is?
I can't comment on the CPUs, but I would just like to mention that cost, failure rate etc. all play a role in the decision about which CPU to use. There may be reasons for HTC not using the Hummingbird yet other than performance.
The Hummingbird is a little faster CPU-wise, which does surprise me as the Snapdragon has higher quoted instructions per clock. However, its GPU is really in a different league to that in the current Snapdragons and that's where the big difference lies in benchmarks (though that will, of course, only help GPU-accelerated things).
HTC don't necessarily use the chip that benchmarks the best on any given day - they have a relationship with Qualcomm and presumably get preferential rates from them, and they have an established platform. The good news for them is that Qualcomm are bringing out new Snapdragons with higher clockspeeds and better GPUs (and dual cores, though it remains to be seen if/when they arrive on phones) so they should equalise and quite probably turn the tables soon.
Competition is good - we were stuck with 500-600 MHz XScales and ARM11s for ages, and now things are finally moving along.

Dual core processor?

Why would a phone need it? Wouldn't battery life just suck?
Sent from the key to my world.
Sure, if you want a portable console lol.
The response speed would be great though, and the camera would be able to record in full HD without trouble. But the software will need to be programmed to take advantage of the dual-core processor.
As for the battery, not necessarily. The CPU will throttle back its speed a lot, and a dual-core might be able to drop really low and remain fully operational, which requires less battery. Also, the new dual-core CPUs will most probably be built on a smaller process node, which means better power consumption, but at full load (like when playing graphically intensive games) the battery probably won't last long. Still, new battery technology will need to be developed soon to keep up with this new phone technology. Next you'll see dual-GPU phones lol
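On the "software needs to be programmed for it" point, here is a minimal Java sketch of the kind of change that matters: pushing independent work onto worker threads so a dual-core chip can actually run the pieces at the same time. Class and method names are made up for illustration, and the "work" is just a dummy loop.
Code:
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelWork {
    // Hypothetical example (names made up): split an independent job in two so
    // a dual-core CPU can run both halves at the same time instead of
    // time-slicing them on one core.
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());

        Callable<Long> firstHalf = () -> crunch(0, 500_000);
        Callable<Long> secondHalf = () -> crunch(500_000, 1_000_000);

        Future<Long> a = pool.submit(firstHalf);
        Future<Long> b = pool.submit(secondHalf);

        System.out.println("checksum = " + (a.get() + b.get()));
        pool.shutdown();
    }

    // Stand-in for real per-item work (decoding, filtering, etc.).
    private static long crunch(int from, int to) {
        long sum = 0;
        for (int i = from; i < to; i++) {
            sum += (i * 31L) ^ (i >> 3);
        }
        return sum;
    }
}

A single-threaded app gains nothing from the second core; one written like this can roughly halve the busy time for the parallel part, which also lets the CPU race back to a low-power idle state sooner.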
I'm waiting for the 2011 CES to see if anything dual-core will be announced before dropping $800 on a phone as I would love such a device, just for fun.
CES is just next week right?
They've already announced one phone to run it; I just think technology is getting crazy with portability. My computer still has a 1.6 GHz processor, and these new phones will undoubtedly surpass my poor system. Ha.
Sent from the key to my world.
One thing that the makers of these chips take into consideration is power usage. And it's easy to see that, too; I'll use desktop CPUs and laptop CPUs as an example. Intel's and AMD's 6-core designs both have a TDP of under 125 W. Old single-core Pentiums had a TDP higher than that, and were built on a much larger process. Laptop CPUs now use at most about a quarter of the TDP of a desktop CPU (not as fast, though).
Other than that, right now I can bet that there are no multi-threaded apps available, and is Android really able to take advantage of a multi-core system? Probably not on its own.
HAPPY NEW YEAR people!!
Yeah, CES is just next week. I know they announced some phone but I would like to know when they are coming so I know if I should buy the best thing right now or not.
I wouldn't have a clue whether Android can handle multi-core processors, but maybe the new Honeycomb version of Android will enable this? If that's the case then maybe these phones will come in March/April... sigh.
And yeah, the TDP of these chips will be lower than that of current chips. I bet they are working hard to make the best use of the battery.
ceg1792 said:
Why would a phone need it? Wouldn't battery life just suck?
Sent from the key to my world.
A multi-core CPU does not necessarily use more power than a single-core CPU; it's mostly dependent on the architecture.
NVIDIA talks about benefits of dual core:
http://www.engadget.com/2010/12/08/nvidia-touts-the-benefits-of-multi-core-processors-for-smartphon/
I think there is a definite need for dual-core processors in phones. Gaming is making a mainstream shift from dedicated handheld gaming consoles to smartphones. In order for developers to make more robust and graphically appealing games, they are going to need more processing power. Another point is that dual-core processors will help browser rendering speeds. With HSPA+, WiMAX and LTE we are getting some serious downlink on our devices. But if you notice, a smartphone getting 3 Mbps down and one getting 10 Mbps down render a webpage at about the same speed. Right now the processor bottlenecks webpage rendering, not our data connection. Faster processors help eliminate that bottleneck and provide a gratifying web experience to the end user.
It'll help if the application has multi-thread support. But if the app can only use one core/thread, then that's where dual-core is useless. Also, gaming isn't the main focus of smartphones; it's probably only a small minority of people using their smartphone as a serious gaming machine compared to people who use it for work, calls, texts or other multimedia.
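A crude way to see that bottleneck argument in numbers: model page load as download time plus a fixed CPU render time. The page size and render time below are made-up assumptions, only there to show the shape of the curve.
Code:
public class PageLoadEstimate {
    // Toy model: total load time = download time + CPU render time.
    // Page size and render time are illustrative assumptions, not measurements.
    public static void main(String[] args) {
        double pageMegabits = 8.0;   // roughly a 1 MB page
        double renderSeconds = 2.5;  // CPU-bound parse/layout/paint time

        for (double mbps : new double[]{3, 10, 50}) {
            double total = pageMegabits / mbps + renderSeconds;
            System.out.printf("%4.0f Mbps link -> %.2f s total%n", mbps, total);
        }
        // Prints ~5.17 s, ~3.30 s, ~2.66 s: once the link is fast, the render
        // time dominates, so a faster CPU helps more than faster data.
    }
}

Going from 3 to 10 Mbps saves a couple of seconds, but beyond that the CPU-bound render time dominates, which is exactly why a faster processor helps more than a fatter pipe.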

[Q] Most badass GPU and CPU in da world; Expert Knowledge please :)

I've been doing quite a bit of research on GPUs and CPUs in phones/tablets lately, and I have a few unanswered questions that I can't seem to find an answer for.
1: What's the best chipset available for mobile phones and tablets right now? This link cleared quite a bit up for me; it does a fairly in-depth comparison of both GPU and CPU performance between the Qualcomm S4, Tegra 3, OMAP 4470 and the Exynos 4212. And I don't want the 'Well this is better because it has more jiggahertz' answer. Shut up, that's not what I need. I need something more in-depth. If studies on individual GPU comparisons can be provided, please drop a link. I'd like to know these things very well.
2: What individual GPU is currently the best? I realize the iPad 3 came out with a graphics chip that's supposedly superior to the Xbox 360's/PS3's. However, I take anything Apple says with a grain of salt; they're notorious for shooting flaming BS out of their rear. Based on the little bit of searching I've done, the Adreno GPUs seem to be ahead of their time. I previously thought the Mali-400 GPU in the Exynos chipset was one of the best, but apparently it's outdated. Again, links to tests/studies/comparisons would be appreciated.
3: What's the deal with the ARM chips? Are the A5s, A6s, A11s (and whatever other A chips are out there) some standard CPU developed by ARM and licensed out to all manufacturers to use in their chipsets?
4: What alternatives are there to the ARM CPUs? Most chipsets I research seem to be using a Cortex-A9 chip.
5: What's the difference between the A5, A6, A9, etc.? From what I've seen the higher numbers are the newer models, but I feel like that's a very shallow definition. If that is true, why does the newest iPad only use an A5X chip for its quad core rather than an A9 or something of the sort?
6: Is the chipset in the iPad really the fastest out there? Personally, I can't really stand Apple products, let alone the rabid fanboys and the obnoxious advertisements they put out. I can recognize that they very often gloat about their products and exaggerate, like how they said the dual core in the iPhone 4S is the fastest out there, yet from what I've read the A5 is the worst-performing dual core out there. Is the GPU in the tablet really superior to the Xbox's? And is the processor really able to outdo the Tegra 3?
If you're able to answer any one of these, even exclusively, that would be appreciated. I just like knowledge
MultiLockOn said:
I've been doing quite a bit of research on GPUs and CPUs in phones/tablets lately, and I have a few unanswered questions that I can't seem to find an answer for.
1: What's the best chipset available for mobile phones and tablets right now? This link cleared quite a bit up for me; it does a fairly in-depth comparison of both GPU and CPU performance between the Qualcomm S4, Tegra 3, OMAP 4470 and the Exynos 4212. And I don't want the 'Well this is better because it has more jiggahertz' answer. Shut up, that's not what I need. I need something more in-depth. If studies on individual GPU comparisons can be provided, please drop a link. I'd like to know these things very well.
2: What individual GPU is currently the best? I realize the iPad 3 came out with a graphics chip that's supposedly superior to the Xbox 360's/PS3's. However, I take anything Apple says with a grain of salt; they're notorious for shooting flaming BS out of their rear. Based on the little bit of searching I've done, the Adreno GPUs seem to be ahead of their time. I previously thought the Mali-400 GPU in the Exynos chipset was one of the best, but apparently it's outdated. Again, links to tests/studies/comparisons would be appreciated.
3: What's the deal with the ARM chips? Are the A5s, A6s, A11s (and whatever other A chips are out there) some standard CPU developed by ARM and licensed out to all manufacturers to use in their chipsets?
4: What alternatives are there to the ARM CPUs? Most chipsets I research seem to be using a Cortex-A9 chip.
5: What's the difference between the A5, A6, A9, etc.? From what I've seen the higher numbers are the newer models, but I feel like that's a very shallow definition. If that is true, why does the newest iPad only use an A5X chip for its quad core rather than an A9 or something of the sort?
6: Is the chipset in the iPad really the fastest out there? Personally, I can't really stand Apple products, let alone the rabid fanboys and the obnoxious advertisements they put out. I can recognize that they very often gloat about their products and exaggerate, like how they said the dual core in the iPhone 4S is the fastest out there, yet from what I've read the A5 is the worst-performing dual core out there. Is the GPU in the tablet really superior to the Xbox's? And is the processor really able to outdo the Tegra 3?
If you're able to answer any one of these, even exclusively, that would be appreciated. I just like knowledge
1. Dunno right now, it's always changing. I hear the new Qualcomm processors with the new Adreno GPU are supposed to be the ****, but they're not out yet so who knows. The iPad 3 hasn't had any real-world tests done yet; we need to wait for release. It is basically the same A5 chip as the iPad 2 but with the PS Vita's GPU thrown in.
2. *sigh* The iPad 3 is not more powerful than an Xbox 360. It is better in, I believe, one aspect (more memory), but this has very little impact on performance/graphics quality. This is Apple shooting wads of **** out its arse, or whoever made the claim. It's actually using the same GPU found in the PS Vita, which we all know is not as powerful as a PS3/Xbox 360. However, the PS Vita is also using a quad-core CPU, whereas the iPad 3 is using the same dual-core A5 as the iPad 2, so technically the PS Vita is superior. You also have to consider how many more pixels the GPU has to push on the iPad 3's display (see the pixel-count sketch just after this post). While high res is nice, it takes more power to render it.
3. ARM creates a base chip design for companies to slap their own GPUs and name on. The naming structure is pretty self-explanatory.
4. All CPUs currently in tablets/cellphones are a variant of ARM. A Cortex-A9 is still an ARM chip. This will soon change when Intel releases their tablet/phone chips.
5. You're right, higher numbers do mean newer models. I don't know all the exact details, but with the newer ARM series you get higher and/or more efficient clocks, generally some battery savings, and in some series support for more cores. Apple's labelling of their chips has nothing to do with ARM's; it's their own naming scheme. The A5X is just what Apple calls their version of the ARM processor.
6. I believe atm the iPad 3 has the fastest chipset in a tablet... for now. It won't take long for it to be overtaken by other companies; there's so much in the works right now.
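To put rough numbers on the pixel point in answer 2, using the well-known panel resolutions; everything else here (equal frame-rate targets, ignoring scene complexity) is a simplifying assumption.
Code:
public class PixelBudget {
    // Rough comparison of how many pixels each device's GPU must fill per frame.
    // Resolutions are the well-known panel / render-target sizes; equal frame
    // rates and scene complexity are simplifying assumptions.
    public static void main(String[] args) {
        long ipad2 = 1024L * 768;     //   786,432 px
        long ipad3 = 2048L * 1536;    // 3,145,728 px
        long hd720 = 1280L * 720;     //   921,600 px (typical console target)

        System.out.printf("iPad 3 vs iPad 2: %.1fx more pixels per frame%n",
                (double) ipad3 / ipad2);
        System.out.printf("iPad 3 vs 720p:   %.1fx more pixels per frame%n",
                (double) ipad3 / hd720);
        // ~4.0x and ~3.4x: a 'retina' panel eats a lot of whatever extra GPU
        // power the new chip brings.
    }
}

Roughly 4x the pixels of the iPad 2 and about 3.4x a 720p console render target, so even a beefier GPU ends up with less headroom per pixel.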
speedyink said:
1. Dunno right now, it's always changing. I hear the new Qualcomm processors with the new Adreno GPU are supposed to be the ****, but they're not out yet so who knows. The iPad 3 hasn't had any real-world tests done yet; we need to wait for release. It is basically the same A5 chip as the iPad 2 but with the PS Vita's GPU thrown in.
2. *sigh* The iPad 3's GPU is not more powerful than an Xbox 360's. It is better in, I believe, one aspect (more memory), but this has very little impact on performance/graphics quality. This is Apple shooting wads of **** out its arse, or whoever made the claim. It's actually using the same GPU found in the PS Vita, which we all know is not as powerful as a PS3/Xbox 360. However, the PS Vita is also using a quad-core CPU, whereas the iPad 3 is using the same dual-core A5 as the iPad 2, so technically the PS Vita is superior.
3. ARM creates a base chip design for companies to slap their own GPUs and name on. The naming structure is pretty self-explanatory.
4. All CPUs currently in tablets/cellphones are a variant of ARM. A Cortex-A9 is still an ARM chip. This will soon change when Intel releases their tablet/phone chips.
5. You're right, higher numbers do mean newer models. I don't know all the exact details, but with the newer ARM series you get higher and/or more efficient clocks, generally some battery savings, and in some series support for more cores. Apple's labelling of their chips has nothing to do with ARM's; it's their own naming scheme. The A5X is just what Apple calls their version of the ARM processor.
6. I believe atm the iPad 3 has the fastest chipset in a tablet... for now. It won't take long for it to be overtaken by other companies; there's so much in the works right now.
Thanks for the reply. It seems weird to me that Apple would name a CPU something so similar to one that already exists, A5X as opposed to A5.
MultiLockOn said:
Thanks for the reply. It seems weird to me that Apple would name a CPU something so similar to one that already exists, A5X as opposed to A5.
Because Apple is the type of company to step on someone's feet like that, and then sue them later on for copyright infringement. Damn the confusion, Apple starts with A, so will their processors.
speedyink said:
Because Apple is the type of company to step on someone's feet like that, and then sue them later on for copyright infringement. Damn the confusion, Apple starts with A, so will their processors.
Yeah, Apple just buy a technology, re-label it, patent it and troll others, so for comparison Apple doesn't count. Also, these handheld chipsets can't be compared with consoles; consoles have more processing power, like more RAM bandwidth and polygon throughput.
Anyway, based on my experience, the Mali-400 Exynos has buttery smooth performance for both UI and 3D graphics. I've tried both the Gingerbread Galaxy Note and my SGS2.
On the other hand, Google did a great job with the TI OMAP in its Galaxy Nexus: purely HW-accelerated 4.0.3 with very few glitches, and I believe those are software issues.
IMO if you wanna buy a fast and smooth device, follow the current Nexus spec (or at least something similar), like the GNexus, Motorola RAZR, etc. I've seen the Tegra 3 4+1 Transformer Prime but never gone hands-on with it; as far as I've seen, UI and 3D performance are stunning. The 1 extra core's advantage is a low-power mode for light processing and standby. Today's hardware is fast enough; drivers and OS optimisation are the most important things if you want everything to run smoothly.
cmiiw, sorry for bad english
lesp4ul said:
Yeah, Apple just buy a technology, re-label it, patent it and troll others, so for comparison Apple doesn't count. Also, these handheld chipsets can't be compared with consoles; consoles have more processing power, like more RAM bandwidth and polygon throughput.
Anyway, based on my experience, the Mali-400 Exynos has buttery smooth performance for both UI and 3D graphics. I've tried both the Gingerbread Galaxy Note and my SGS2.
On the other hand, Google did a great job with the TI OMAP in its Galaxy Nexus: purely HW-accelerated 4.0.3 with very few glitches, and I believe those are software issues.
IMO if you wanna buy a fast and smooth device, follow the current Nexus spec (or at least something similar), like the GNexus, Motorola RAZR, etc. I've seen the Tegra 3 4+1 Transformer Prime but never gone hands-on with it; as far as I've seen, UI and 3D performance are stunning. The 1 extra core's advantage is a low-power mode for light processing and standby. Today's hardware is fast enough; drivers and OS optimisation are the most important things if you want everything to run smoothly.
cmiiw, sorry for bad english
I know what you mean. I'm extremely happy with my Galaxy S2; I can't say I ever recall it lagging on me in any way whatsoever. I'm not sure what makes the Droid RAZR and Galaxy Nexus comparable to the S2. From what I've read, OMAP processors tend to lag and consume battery, and the Mali-400 is better than what either of those phones have. I'd say it's ICS, but the RAZR still runs Gingerbread.
I was hoping for some more attention in here :/
I agree, OMAPs are battery-hungry beasts. Like my previous Optimus Black, man... I only got 12-14 hours on EDGE (1 GHz undervolted with SmartassV2, also a ****ty LG kernel haha). Same issue as my friend's Galaxy SL. I dunno if newer SoCs behave better.
Sent from my Nokia 6510 using PaperPlane™

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel mailing list. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP: it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the two board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. The results had already been deleted, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1,836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between one and three times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, although the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 544MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali-T604 in an almost equal second place. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
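Quick arithmetic on those two offscreen numbers, with the ~10% Nexus 10 vs Arndale driver gap applied as a what-if (the 10% bump is purely an assumption borrowed from that comparison):
Code:
public class GlBenchRatios {
    // Arithmetic on the cached GLBenchmark numbers quoted above (Egypt HD,
    // offscreen 1080p). The 10% "driver maturity" bump is a what-if borrowed
    // from the Nexus 10 vs Arndale gap, not a prediction.
    public static void main(String[] args) {
        double dalmore = 32.6;  // suspected Tegra 4 dev board
        double ipad4 = 49.6;    // iPad 4 (A6X)

        System.out.printf("A6X lead over the dev board: %.0f%%%n",
                (ipad4 / dalmore - 1) * 100);

        double tuned = dalmore * 1.10;
        System.out.printf("Dev board + 10%% drivers: %.1f fps (iPad 4: %.1f)%n",
                tuned, ipad4);
        // ~52% lead, and 35.9 fps still trails 49.6 fps.
    }
}

So the A6X leads this leaked run by about 52%, and a driver-maturity bump of the size seen elsewhere would not close the gap on its own.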
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved copies of the webpages, which I can upload. Enjoy.
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too, so the only thing left is, I guess, the Mali-T658 coming with the Exynos 5450 (doubtful when that will release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320 and Mali-T604, and now Tegra 4, aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too, so the only thing left is, I guess, the Mali-T658 coming with the Exynos 5450 (doubtful when that will release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by a game and a CPU benchmark was run in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is, it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality; and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad :crying:
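The throttling behaviour described in that AnandTech test can be sketched as a tiny budget model. The power table and the shared 4 W cap below are illustrative assumptions, not measurements of any real SoC; the point is only that once the GPU takes most of the budget, the governor has to walk the CPU clock down.
Code:
public class SocPowerBudget {
    // Toy governor inspired by the AnandTech observation above: when CPU + GPU
    // would exceed the shared budget, the CPU frequency is walked down until
    // the total fits. All wattage and frequency values are invented.
    static final double BUDGET_W = 4.0;
    static final double[] CPU_FREQ_GHZ = {1.7, 1.4, 1.2, 1.0, 0.8};
    static final double[] CPU_POWER_W  = {3.9, 3.0, 2.4, 1.8, 1.3};

    static double throttledCpuFreq(double gpuPowerW) {
        for (int i = 0; i < CPU_FREQ_GHZ.length; i++) {
            if (CPU_POWER_W[i] + gpuPowerW <= BUDGET_W) {
                return CPU_FREQ_GHZ[i];  // highest frequency that still fits
            }
        }
        return CPU_FREQ_GHZ[CPU_FREQ_GHZ.length - 1];
    }

    public static void main(String[] args) {
        System.out.println("GPU near idle (0.1 W): CPU at "
                + throttledCpuFreq(0.1) + " GHz");
        System.out.println("GPU busy with a game (2.7 W): CPU drops to "
                + throttledCpuFreq(2.7) + " GHz");
    }
}

With the GPU near idle the CPU keeps its top bin, but once the GPU grabs a few watts the only frequency that still fits under the cap is the bottom one, which is the 1.7 GHz to 800 MHz pattern described above.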
32 fps is a no-go... let's hope it's not final
hamdir said:
32 fps is a no-go... let's hope it's not final
It needs to improve, but it would be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5's GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5's GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true. Check GLBenchmark: in the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Un-optimised games lag, and that's a big letdown, and Tegra effects can also be used on other phones with Chainfire3D. I use it and Tegra games work without lag, with the effects, and I don't have a Tegra device.
With a Tegra device I'd mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant GPU provider.
In the first one, Tegra 2, the GPU was a little bit better in benchmarks than the SGX540 of the Galaxy S, but the chip lacked NEON support.
In the second one, Tegra 3, the GPU is nearly the same as the old Mali-400MP4 in the Galaxy S2/original Note.
And now it's better, but still nothing special, and it will be outperformed soon (by the Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 as well; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices; equally, Chainfire is now outdated and no longer updated.
Now about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multiple cores each with its own sub-cores/shaders. If Tegra were used in a real multi-core setup it would destroy all.
Finally, this is really funny, all this doom and gloom because of one early, discarded development board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any kind of final device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe only 2x the T3 scores with 600% more shaders?!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR; it isn't just stuck with Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone with a Tegra 4 that will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80 mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90 mm2), along with dual-channel memory versus single on Tegra 2/3.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by a game and a CPU benchmark was run in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is, it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality; and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad :crying:
Well said, mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2 with the Exynos 4 at 1.2 GHz and the Mali GPU overclocked to 400 MHz.
In my analysis, most modern games like MC4 and NFS:MW aren't running at 60 FPS at all. That's because the GPU always has a 100% workload while the CPU is relaxing, putting out only 50-70% of its total capacity (a tiny illustration of this GPU-bound check follows after this post).
I know some games aren't optimised for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T604 with some tweaks instead of the mighty T658, which is still slower than the iPAddle 4.
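The "100% GPU, 50-70% CPU" observation is basically the standard way to tell which side is the bottleneck. A tiny rule-of-thumb sketch, with hypothetical utilisation numbers:
Code:
public class BoundCheck {
    // Rule-of-thumb classifier for the situation described above: if the GPU
    // sits near 100% while the CPU has headroom, a faster CPU won't raise fps.
    // The utilisation figures in main() are hypothetical examples.
    static String bottleneck(double cpuUtil, double gpuUtil) {
        if (gpuUtil > 0.95 && cpuUtil < 0.80) return "GPU-bound";
        if (cpuUtil > 0.95 && gpuUtil < 0.80) return "CPU-bound";
        return "mixed / something else (I/O, thermals, vsync...)";
    }

    public static void main(String[] args) {
        System.out.println(bottleneck(0.60, 1.00)); // the ~100% GPU, 60% CPU case
        System.out.println(bottleneck(1.00, 0.40)); // the opposite situation
    }
}

When a game is GPU-bound like this, a faster CPU (or more cores) changes nothing about the frame rate; only a faster GPU, a lower resolution or lighter effects will.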
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by a game and a CPU benchmark was run in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is, it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality; and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm2 chip, the iPhone 5's A6 is 96 mm2 and the A6X is 123 mm2; Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad :crying:
Typical "isheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and the PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 ppi, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
MrPhilo said:
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR; it isn't just stuck with Apple, it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone with a Tegra 4 that will arrive in a phone. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80 mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90 mm2), along with dual-channel memory versus single on Tegra 2/3.
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA, is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations of a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), OMG Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to drive similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle and physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games/benchmarks will scale in linear fashion, as other factors such as TMU/ROP throughput can bottleneck performance (a toy model of this is sketched just after this post). Nvidia's technical marketing manager, when interviewed at CES, said that the overall improvement in games/benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
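That non-linear scaling argument in toy form: frame rate tracks whichever unit runs out first, so a big jump in shader ALU only shows up fully in scenes that were shader-limited to begin with. The relative throughput factors below are invented for illustration.
Code:
public class GpuScalingModel {
    // Toy model of why "6x the shaders" rarely means 6x the frame rate: the
    // frame rate tracks whichever unit is the slowest (shader ALU, ROP/fill,
    // memory bandwidth). The relative throughput factors are invented.
    static double fps(double shaderX, double fillX, double bandwidthX, double baseFps) {
        double slowest = Math.min(shaderX, Math.min(fillX, bandwidthX));
        return baseFps * slowest;
    }

    public static void main(String[] args) {
        double baseFps = 10.0; // arbitrary baseline for the older chip
        // Hypothetical new part: 6x shader ALU, ~3x fill rate, ~2.5x bandwidth.
        System.out.printf("Shader-limited scene:         %.0f fps%n",
                fps(6.0, 99, 99, baseFps));
        System.out.printf("Fill/bandwidth-limited scene: %.0f fps%n",
                fps(6.0, 3.0, 2.5, baseFps));
        // 60 fps vs 25 fps: real games mix both cases, so the average gain
        // lands well under the headline shader multiplier.
    }
}

A shader-limited scene scales the full 6x, a fill- or bandwidth-limited one only about 2.5x, and real games sit somewhere in between, which is roughly why an overall "3 to 4 times" figure is plausible from a 6x shader increase.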
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
hamdir said:
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices; equally, Chainfire is now outdated and no longer updated.
Looks like they haven't updated Chainfire3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger, etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2-compatible games. I might have been unfortunate too, but some Gameloft games lagged on the Tegra device that I had, though root solved it to an extent.
I am not saying one thing is superior to another, just that this is my personal experience; I might be wrong, I may not be.
TBH I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers and marketing department; many of its users, less so.
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral in Google's/Apple's ecosystems, and I'm not wiping any scratches off my devices.

[INFO]Processor 101

New processors come out every day and you're like, oh my god, which one do I buy, which one??
Well, here's the answer to all your processor-related queries!!
Qualcomm Snapdragon
Qualcomm continues to do what Qualcomm does best – produce a range of high quality chips with everything that handset manufactures need already built in. This time last quarter, we were taking our first look at the upcoming Snapdragon 600 processors which would be replacing the older S4 Pro, another incredibly popular Qualcomm processor.
Qualcomm doesn’t use the exact specification for the Cortex A15, it licenses the architecture from ARM which it then implements into its own Krait CPU cores, the newest version of which, the Krait 300, has shown up in the new Snapdragon 600 SoC...
Since then, a range of handsets powered by Qualcomm’s newest chips have appeared on the market, the flagship Samsung Galaxy S4 and HTC One being the two most notable models which are both some of the best performing smartphones on the market. Performance wise, the Snapdragon 600 has proven to be a decent enough jump up from the previous generation, performing well in most benchmark tests.
We’ve also started to hear about a few devices featuring the lower end Snapdragon 400 and 200 chips, with a range of entry level processors using various ARM architectures heading to the market in the near future. So far this year high end smartphones have received the biggest performance improvements, but these new chips should give the midrange a much needed boost later in the year.
So whilst Snapdragon 600 is certainly the most popular high-end chip on the market right now, we’ve already started to see our first snippets at Qualcomm’s next big thing, the Snapdragon 800.
There’s been lots of official and unofficial data floating around over the past few months regarding this new chip, and from, what we can tell, it looks to be one powerful piece of tech. Qualcomm demoed some of the new chip’s improved 3D performance earlier in the year, and more recently we’ve seen a few benchmarks popping up for new devices, which place the Snapdragon 800 at the top of the benchmark scores come it’s release.
First, there was the Pantech IM-A880 smartphone, which scored an impressive 30133 in the popular Antutu benchmark, followed by the rumoured beefed up version of the Galaxy S4, and most recently the new Xperia Z Ultra which pulled in the most impressive score yet, a whopping 32173. We’ve also seen some more official looking benchmarks from AnandTech and Engadget which confirm the Antutu scores of above and around 30,000, and also gives us a good look at how the chip performs in a range of other tests. The conclusion — it’s a bit of a beast.
These notable benchmark scores are no doubt down to the new higher-clocked Krait 400 CPU cores and the new Adreno 330 GPU, which is supposed to offer around a 50% performance improvement over the already quick Adreno 320. The test results we've seen show that the Snapdragon 800's CPU is fine compared with the current crop of processors, but the chip really shines when it comes to GPU performance, which has proven to be even quicker than the Tegra 4 and iPad 4 chips.
We’ve already seen that Qualcomm is taking graphics extra seriously with its latest chip, as the Snapdragon 800 became the first processor to receive OpenGL ES 3 certification and is compliant with all the big graphics APIs.
Quite a few upcoming top-of-the-line handsets are rumored to be utilizing Qualcomm's latest processor, including the Galaxy S4 LTE-A, Oppo Find 7, and an Xperia Z refresh as well, so the Snapdragon 800 is perhaps the biggest chip to look out for in the coming months.
Exynos 5 Octa
Moving away from Qualcomm, there was certainly a lot of hype surrounding Samsung's octa-core monster of a processor. Upon release, the chip mostly lived up to expectations: the Exynos version of the Galaxy S4 topped our performance charts and is currently the fastest handset on the market. The SoC is the first to utilize the new big.LITTLE architecture, with four new Cortex A15 cores to provide top-of-the-line peak performance, and four older low-power Cortex A7s to keep idle and low-load power consumption to a minimum.
The chip is certainly one of the best when it comes to peak performance, but it has had its share of trouble when it comes to balancing power consumption and performance. If you're in the market for the fastest smartphone currently around, then the Galaxy S4 is the one to pick right now, provided that it's available in your region. It has the fastest CPU currently on the market, and its PowerVR SGX544 tri-core GPU matches that of the latest iPad. But with the Snapdragon 800 just around the corner, there could soon be a new processor sitting on the performance throne.
Looking forward, it’s difficult to see the Exynos retaining its top spot for much longer. Other companies are starting to look beyond the power-hungry Cortex A15 architecture, but Samsung hasn’t yet unveiled any new plans.
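To make the big.LITTLE idea above concrete, here is a conceptual Java sketch of a cluster-switching policy: light loads stay on the small A7 cores, heavy loads migrate to the big A15s. The 75% threshold and the simple two-way switch are illustrative assumptions; real implementations (cluster migration, and later global task scheduling) are considerably more sophisticated.
Code:
public class BigLittleGovernor {
    // Conceptual sketch of big.LITTLE cluster switching: light loads run on the
    // small, efficient cores; heavy loads migrate to the big cores. The 75%
    // threshold and the simple two-way switch are illustrative assumptions.
    enum Cluster { LITTLE_A7, BIG_A15 }

    static Cluster chooseCluster(double loadPercent) {
        return loadPercent < 75.0 ? Cluster.LITTLE_A7 : Cluster.BIG_A15;
    }

    public static void main(String[] args) {
        double[] loadSamples = {5, 30, 60, 90, 100}; // CPU load over time (%)
        for (double load : loadSamples) {
            System.out.printf("load %5.1f%% -> run on %s%n",
                    load, chooseCluster(load));
        }
    }
}

The win is that the chip spends most of its life on the frugal cores and only pays the A15s' power cost during the bursts that actually need them.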
Intel Clover Trail+ and Baytrail
Speaking of which, perhaps the biggest mover this year has been Intel, and although the company still isn't competing with ARM in terms of the number of design wins, Intel has finally shown off some products which will pose a threat to ARM's market dominance.
Although we've been hearing about Clover Trail+ since last year, the chip is now moving into full swing, with a few handsets arriving which are running it, and some of the benchmarks we've seen are really quite impressive. Clover Trail+ has managed to find the right balance between performance and power consumption, unlike previous Atom chips, which were far too slow to keep up with the top-of-the-line ARM-based processors.
Then there's Baytrail. Back at Mobile World Congress earlier in the year, Intel laid out its plans for Clover Trail+, but we've already heard information about the processor's successor. Intel claims that its new Silvermont cores will further improve on both energy efficiency and peak performance. It sounds great on paper, but we always have to take these unveilings with a pinch of salt. What we are most likely looking at with Baytrail is a decent performance improvement, which should keep the processor ahead of the current Cortex A15-powered handsets in the benchmarks, but energy improvements are likely to come in the form of idle power consumption and low-power states, rather than savings at peak performance levels.
But Intel isn’t just interested in breaking into the smartphone and tablet markets with its new line-up of processors. The company is still very much focused on producing chips for laptops. One particularly interesting prospect is the confirmed new generation of Android based netbooks and laptops powered by more robust Intel processors, which could give Microsoft a real run for their money.
Intel has clarified that it will also be assigning the additional Pentium and Celeron titles to its upcoming Silvermont architecture as well as using it in the new BayTrail mobile chips. What this potentially means is a further blurring of the line between tablets and laptops, where the same processor technology will be powering a range of Intel based products. I’m expecting the performance rankings to go from Baytrail for phones and tablets, to Celeron for notebooks, and Pentium chips for small laptops, but this naming strategy hasn’t been confirmed yet. It’s also interesting to see where this will stack up with Intel’s newly released Haswell architecture, which is also aimed at providing power efficient solutions to laptops.
Taking all that into consideration, Baytrail has the potential to be a big game changer for Intel, as it could stand out well ahead of Samsung’s top of the line Exynos chips and will certainly rival the upcoming Qualcomm Snapdragon 800 processor. But we’ll be waiting until the end of the year before we can finally see what the chip can do. In the meantime, we’ll look forward to seeing if Clover Trail+ can finally win over some market share.
Nvidia Tegra 4 and 4i
Nvidia, on the other hand, has had a much more subdued second quarter of the year. We already had many of the unveilings for its new Tegra 4 and Tegra 4i designs by the start of the year, and so far, no products have launched which are making use of Nvidia’s latest chips.
But we have seen quite a bit about Nvidia Shield, which will be powered by the new Tegra 4 chip, and it certainly looks to be a decent piece of hardware. There have also been some benchmarks floating around suggesting that the Tegra 4 is going to significantly outpace other Cortex A15-powered chips, but without a significant boost in clock speeds, I doubt that the chip will be much faster in most applications.
Nvidia's real strength obviously lies in its graphics technology, and the Tegra 4 certainly has that in spades. Nvidia, much like Qualcomm, has focused on making its new graphics chip compatible with all the new APIs, like OpenGL ES 3.0 and DirectX 11, which will allow the chip to make use of improved graphical features when gaming. But it's unclear whether that will be enough to win over manufacturers or consumers.
The Tegra 4i has been similarly muted, with no handsets yet confirmed to be using the chip, and we haven't really heard much about performance either. We already know that the Tegra 4i certainly isn't aiming to compete with top-of-the-line chips, as it only uses older Cortex A9s in its quad-core setup, but with other processors already offering LTE integration, it's tough to see smartphone manufacturers leaping at Nvidia's chip.
The Tegra 4 is set for release at the end of this quarter, with the Tegra 4i following later in the year. But such a delayed launch may see Nvidia risk missing the boat on this generation of processors as well, which may have something to do with Nvidia's biggest announcement so far this year: its plan to license its GPU architecture.
This change in direction has the potential to turn Nvidia into the ARM of the mobile GPU market, allowing competing SoC manufacturers, like Samsung and Qualcomm, to use Nvidia's graphics technology in their own SoCs. However, this will place the company in direct competition with the Mali GPUs from ARM and the PowerVR GPUs from Imagination, so Nvidia's Kepler GPUs will have to shine through the competition. But considering the problems that the company had persuading handset manufacturers to adopt its Tegra 3 SoCs, this seems like a more flexible and potentially very lucrative backup plan rather than spending more time and money producing its own chips.
MediaTek Quad-cores
But it’s not just the big powerhouse chip manufactures that have been introducing some new tech. MediaTek, known for its cheap lower performance processors, has recently announced a new quad-core chip named the MT8125, which will be targeted for use in tablets.
The new processor is built from four in-order ARM Cortex A7 cores clocked at 1.5Ghz, meaning that it’s not going to be an absolute powerhouse when it comes to processing capabilities. The SoC will also be making use of a PowerVR 5ZT series graphics chip, which will give it sufficient grunt when it comes to media applications as well, with support for full HD 1080p video playback and recording, as well as some power when it comes to games.
MediaTek is also taking a leaf out of Qualcomm's book by designing the SoC to be an all-in-one solution. It will come with built-in WiFi, Bluetooth, GPS and FM radio units, and will also be available in three versions, with built-in HSPA+, 2G, or WiFi only. This should make the chip an ideal candidate for emerging-market devices, as well as budget products in the higher-end markets.
Despite the quad-core CPU and modern graphics chip, the MT8125 is still aimed at being a power efficient solution for midrange and more budget oriented products. But thanks to improvements in mobile technologies and the falling costs of older components, this chip will still have enough juice to power through the most commonly used applications.
Early last month, MediaTek also announced that it has been working on its own big.LITTLE architecture, similar to that found in the Samsung Exynos 5 Octa. But rather than being an eight core powerhouse, MediaTek’s chip will just be making use of four cores in total.
The chip will be known as the MT8135 and will be slightly more powerful than the budget quad-core MT8125, as it will be using two faster Cortex A15 cores. These power-hungry units will be backed up by two low-power Cortex A7 cores, so it's virtually the same configuration as the Exynos 5 Octa but in a 2-by-2 layout (2 A15s and 2 A7s) rather than 4-by-4 (4 A15s and 4 A7s).
But in typical MediaTek fashion, the company has opted to clock the processor lower in order to make the chip more energy efficient, which is probably a good thing considering that budget devices tend to ship with smaller batteries. The processor will peak at just 1 GHz, which isn't super slow, but it is nearly half the speed of the A15s found in the Galaxy S4. But performance isn't everything, and I'm more than happy to see a company pursue energy efficiency over clock speed and core count for once, especially if it brings big.LITTLE to some cheaper products.
Looking to the future
ARM Cortex A57
If you fancy a look even further ahead into the future, then we have also received a little bit of news regarding ARM’s successor to the A15, the all new Cortex A57. This new top of the line chip recently reached the “tape out” stage of development, but it’s still a way off from being released in any mobile products.
Cortex A50 performance chart
The Cortex A50 series is set to offer a significant performance improvement. Hopefully the big.LITTLE architecture will help balance out the power consumption.
ARM has hinted that its new chip can offer up to triple the performance of the current top of the line Cortex-A15 for the same amount of battery consumption. The new Cortex-A57 will also supposedly offer five times the amount of battery life when running at the same speed as its current chips, which sounds ridiculously impressive.
We heard a while back that AMD was working on a Cortex A57/A53 big.LITTLE processor chip as well, which should offer an even better balance of performance and energy efficiency than the current Exynos 5 Octa. But we’ll probably be waiting until sometime in 2014 before we can get our hands on these chips.​
The age of x64
Speaking of ARM’s next line-up of processors, another important feature to pay attention to will be the inclusion of 64 bit processing technology and the new ARMv8 architecture. ARM’s new Cortex-A50 processor series will take advantage of 64 bit processing in order to improve the performance in more demanding scenarios, reduce power consumption, and take advantage of larger memory addresses for improved performance.
We’ve already seen a few mobile memory manufactures talk about production of high speed 4GB RAM chips, which can only be made use of with larger 64 bit memory addresses. With tablets and smartphones both in pursuit of ever higher levels of performance, x64 supported processors seem like a logical step.
So there you have it, I think that’s pretty much all of the big processor news over the past 3 months. Is there anything in particularly which has caught your eye, are you holding out for a device with a brand new SoC, or are the current crop of processors already plenty good enough for your mobile needs?
Reserved
Great thread, Again.:good:
This is better suited to the General forum. But good job anyway.
Good job, mate!
Nicely written. I enjoyed reading that.
Sent from my GT-I9500 using Tapatalk 4 Beta
Well done. Good read :thumbup:
TEAM MiK
MikROMs Since 3/13/11
