Question: CPU power - Asus Transformer TF700

I posted this in the general forum but did not get an answer, so I'm posting here hoping for a reply. Sorry if this is breaking forum rules, I don't think it is but...if it is, flame away and delete...anyways. I am just curious, with the introduction of quad-core tablets, how do they match up to similarly specced CPUs in raw power? I understand that Android, iOS, and Windows (in the future) are mobile OSes, so directly comparing them to a laptop is useless. I did however notice that the new T33, clocked at 1.6 GHz, is only 0.1 GHz slower than my laptop, which is running an AMD quad core at 1.7 GHz. So I'm just wondering, is it a direct comparison in processing power alone, or is the architecture so different between the tablet and the laptop/desktop that even at the same clock speed they win in the power category?

Totally different. Due to the ARM architecture, the CPU is a lot less powerful than comparably clocked CPUs using the x86 or x86_64 architecture.

Keep
jdeoxys said:
Totally different. Due to the ARM architecture, the CPU is a lot less powerful than comparably clocked CPUs using the x86 or x86_64 architecture.
Click to expand...
Click to collapse
Thank you for the reply. Do you have any idea of the scale? How fast would an ARM CPU have to be clocked to equal an x86 or x64?

fd4101 said:
Keep
Thank you for the reply. Do you have any idea of the scale? How fast would an ARM CPU have to be clocked to equal an x86 or x64?
Click to expand...
Click to collapse
There is literally no comparison. I have a crap old AMD Athlon 64 X2 clocked at around 3 GHz with DDR2 RAM (lolwut, in 2012?). It gets 3x better SunSpider scores than my Infinity does. I don't know if that's the browser or what, but I think an ARM CPU would have to be clocked at least 5-6 times higher to get similar performance to x86 CPUs. For modern-day ones, maybe even up to 10-20x. Of course, this is just me talking out of my ass here, I don't really know the exact numbers.

Well I guess my dreams of having a tablet that is truly as powerful as my laptop are still far off. But with the way tech is progressing I'm sure we'll have it someday.

fd4101 said:
Well I guess my dreams of having a tablet that is truly as powerful as my laptop are still far off. But with the way tech is progressing I'm sure we'll have it someday.
Click to expand...
Click to collapse
Why? You didn't mean to play Diablo 3 on it, did you? The apps for tablets take this difference into account, so it is really a question of what apps suit your needs.
(BTW, my i3 laptop is only 4 times faster than Chrome on the Infinity running ICS; it will probably be only 3 times faster when JB for the Infinity shows up. And we don't really need all the CPU power of an i3/i5/i7 for casual web browsing.)

fd4101 said:
Well I guess my dreams of having a tablet that is truly as powerful as my laptop are still far off. But with the way tech is progressing I'm sure we'll have it someday.
Click to expand...
Click to collapse
MS Surface. Core i5. Although it won't have quite the god-tier 16:10 resolution of the Infinity.
I got b& at /g/ for sh!tposting ;_;

The ARM Cortex-A9 (same as in Tegra 3) sits in between Intel's Atom and their desktop x86 CPUs.
A dual-core Cortex-A9 is considerably faster than an Intel Atom N270 in some operations. However, it is difficult to really compare, as few benchmarks are optimized for both ARM *and* x86.
The ARM architecture's primary focus is low power while being inexpensive, so it will be slower than Intel's x86 by design.
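If you want a rough feel for that gap yourself, here's a minimal micro-benchmark sketch (my own illustrative example, not one of the established benchmarks): compile the same C file on the x86 laptop and on the ARM device (e.g. in a Linux chroot or with an NDK standalone toolchain) and compare the iterations-per-second it prints. It only exercises simple single-core integer work, so treat the result as one data point, not a verdict.

Code:
/* clockcmp.c - crude single-core integer benchmark, illustrative only.
 * Build the same file on both machines, e.g.:
 *   gcc -O2 clockcmp.c -o clockcmp          (x86 laptop)
 *   arm-linux-gnueabi-gcc -O2 clockcmp.c    (ARM device / chroot)
 */
#include <stdio.h>
#include <time.h>

int main(void)
{
    const long iterations = 200 * 1000 * 1000L;
    volatile long acc = 0;               /* volatile so the loop isn't optimized away */
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (long i = 0; i < iterations; i++)
        acc += i ^ (acc >> 3);           /* simple dependent integer work */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%.2f s -> %.1f million iterations/s\n", secs, iterations / secs / 1e6);
    return 0;
}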

Although I realize that the power of tablets has a long way to go before they are playing AAA games, I would like tablets to get to a point where they can run the same level of software (optimized for mobile, of course). Desktops will always be more powerful, but as it stands right now my laptop can play pretty much any game my desktop can, just on lower settings. I would like a tablet to replace this. The benefits of course are lower power requirements for better battery life and better mobility. I thought that with quad-core tablets reaching clock speeds closer and closer to laptops we were getting close, but I did not know enough about x86 and x64 to know it made so much of a difference. I need to take a computer class.
I know that the cloud can give the illusion of tablets having more power than they do, but the cloud has a long way to go before it can be fully realized; too many restrictions as it stands now. Even with tablets having a 4G connection, it still limits mobility through contracts, dead zones, and lag, and makes you pay more, multiple times over, to do what you want. Maybe in the future the cloud will make all this a wash and we'll all carry thin, low-power devices that only need to decode video and receive input, but I see that as a long way away.

fd4101 said:
Although I realize that the power of tablets has a long way to go before they are playing AAA games, I would like tablets to get to a point where they can run the same level of software (optimized for mobile, of course). Desktops will always be more powerful, but as it stands right now my laptop can play pretty much any game my desktop can, just on lower settings. I would like a tablet to replace this. The benefits of course are lower power requirements for better battery life and better mobility. I thought that with quad-core tablets reaching clock speeds closer and closer to laptops we were getting close, but I did not know enough about x86 and x64 to know it made so much of a difference. I need to take a computer class.
I know that the cloud can give the illusion of tablets having more power than they do, but the cloud has a long way to go before it can be fully realized; too many restrictions as it stands now. Even with tablets having a 4G connection, it still limits mobility through contracts, dead zones, and lag, and makes you pay more, multiple times over, to do what you want. Maybe in the future the cloud will make all this a wash and we'll all carry thin, low-power devices that only need to decode video and receive input, but I see that as a long way away.
Click to expand...
Click to collapse
You better hope this never happens. The cloud = gigantic botnet. Google will take ALL your information and beam ideas directly to your head.

Lol, well I'm not sure anything can stop it, but I'll start stocking up on tin foil. I'll make you a hat and ship it to you.

It's hard to say, but in terms of gaming we are seeing some quite interesting developments. For example, Max Payne and GTA 3 on a tablet are quite impressive if you think about what kind of PC you had to own when these games were released.
Sent from my Galaxy Nexus using xda premium

And there's Baldur's Gate for Android coming

d14b0ll0s said:
And there's Baldur's Gate for Android coming
Click to expand...
Click to collapse
That's already been possible for years.
https://play.google.com/store/apps/details?id=net.sourceforge.gemrb
Sent from my Galaxy Nexus using xda premium

That is true, games have come a long way through optimization. Maybe games will just be better optimized and hardware won't be such a concern.

Nebucatnetzer said:
That's already been possible for years.
https://play.google.com/store/apps/details?id=net.sourceforge.gemrb
Click to expand...
Click to collapse
Looks cool, but the comments say otherwise. I meant the version intended for Android tablets.

d14b0ll0s said:
Looks cool, but the comments say otherwise. I meant the version intended for Android tablets.
Click to expand...
Click to collapse
I tried it quite a while ago so I don't know if anything has changed.
Sent from my Galaxy Nexus using xda premium

Basically they say it's slow even on some fast phones, gets FCs, and you can't go into some menus without doing some crazy tricks. But it's nice to see some development like this. I'm still waiting for the official version, with some fright too, as it's pretty time-consuming.

Related

Dual core processor?

Why would a phone need it? Wouldn't battery life just suck?
Sent from the key to my world.
Sure, if you want a portable console lol.
The response speed would be great though, and the camera will be able to record in full HD without trouble. But the software will need to be programmed to take advantage of the dual-core processor.
As for the battery, not necessarily. The CPU will throttle back its speed a lot, and a dual-core might be able to drop really low and remain fully operational, which will require less battery. Also, the new dual-core CPUs will most probably be on a smaller nanometer process, which means better battery consumption, but at full load (like when playing graphically intensive games) the battery probably won't last long. Still though, new battery technology will need to be manufactured soon to keep up with this new phone technology. Next you'll see dual-GPU phones lol
I'm waiting for the 2011 CES to see if anything dual-core will be announced before dropping $800 on a phone, as I would love such a device, just for fun.
CES is just next week right?
They've already announced one phone to run it, I just think technology is getting crazy with portability. My computer still has a 1.6ghz processor, these new phones will undoubtedly surpass my poor system. Ha.
Sent from the key to my world.
One thing that the makers of the chips take into consideration is power usage. And it's easy to see that too. I'll use desktop CPUs and laptop CPUs as an example. Intel and AMD's 6-core designs both have a TDP of under 125W. Old single-core Pentiums had a TDP higher than that, and were built on a much bigger process. Laptop CPUs now use at most 1/4 the TDP of a desktop CPU (not as fast though).
Other than that, right now I can bet that there are no multi-threaded apps available, and is Android really able to take advantage of a multi-core system? Probably not on its own.
HAPPY NEW YEAR people!!
Yeah, CES is just next week. I know they announced some phone, but I would like to know when they are coming so I know if I should buy the best thing right now or not.
I wouldn't have a clue if Android can handle multi-core processors, but maybe the new Honeycomb version of Android will enable this? If that is the case then maybe these phones will come March/April....sigh
And yeah, the TDP of these chips will be lower than current chips. I bet they are working hard to make the best use of the battery.
ceg1792 said:
Why would a phone need it? Wouldn't battery life just suck?
Sent from the key to my world.
Click to expand...
Click to collapse
A multi-core CPU does not necessarily use more power than a single-core CPU; it's mostly dependent on the architecture.
NVIDIA talks about benefits of dual core:
http://www.engadget.com/2010/12/08/nvidia-touts-the-benefits-of-multi-core-processors-for-smartphon/
I think there is a definite need for dual-core processors in phones. Gaming is making a mainstream shift from dedicated handheld gaming consoles to smartphones. In order for developers to make more robust and graphically appealing games, they are going to need more processing power. Another point is that dual-core processors will help browser rendering speeds. With HSPA+, WiMax, and LTE we are getting some serious downlink on our devices. But if you notice, a smartphone getting 3 Mbps down and one getting 10 Mbps down render a webpage at about the same speed. Right now the processor bottlenecks webpage rendering, not our data connection. Faster processors help eliminate that bottleneck and provide a gratifying web experience to the end user.
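Just to make the bottleneck point concrete, here's a tiny back-of-the-envelope sketch; the page size and render time below are invented, illustrative numbers, not measurements of any real phone or site.

Code:
/* bottleneck.c - toy estimate: does the radio or the CPU dominate page load?
 * All numbers are illustrative assumptions, not measurements. */
#include <stdio.h>

int main(void)
{
    const double page_megabits = 8.0;   /* ~1 MB page incl. scripts/images (assumed) */
    const double render_secs   = 3.0;   /* assumed CPU-bound parse/layout/JS time    */
    const double links_mbps[]  = { 3.0, 10.0 };

    for (int i = 0; i < 2; i++) {
        double download = page_megabits / links_mbps[i];
        printf("%4.0f Mbps link: download %.1f s + render %.1f s = %.1f s total\n",
               links_mbps[i], download, render_secs, download + render_secs);
    }
    /* 3 Mbps: 2.7 + 3.0 = 5.7 s; 10 Mbps: 0.8 + 3.0 = 3.8 s.
     * Once rendering dominates, a faster link barely moves total load time. */
    return 0;
}

With those made-up numbers, tripling the link speed only shaves a couple of seconds, because the CPU-bound render time doesn't move; that's the part a faster (or second) core can attack.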
It'll help if the application has multi-thread support. But if the app can only use 1 core/thread, then that's where dual core is useless. Also, gaming isn't the main focus of smartphones; there's probably only a small minority of people using their smartphones as a serious gaming machine compared to people who are using their smartphone for work, talk, text, or other multimedia.
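To show what "multi-thread support" actually means, here's a minimal sketch in C (illustrative only, not how any particular app is written): the same total work done by two threads, which a dual-core can run at the same time, versus one thread, which leaves the second core idle.

Code:
/* threads.c - why app support matters: same work, multi- vs single-threaded.
 * A second core only helps when the app splits its work like this.
 * Build (sketch): gcc -O2 -pthread threads.c -o threads */
#include <pthread.h>
#include <stdio.h>

#define N_THREADS 2
#define CHUNK     (100 * 1000 * 1000L)

static void *worker(void *arg)
{
    volatile long acc = 0;
    for (long i = 0; i < CHUNK; i++)     /* stand-in for decoding, layout, etc. */
        acc += i;
    (void)arg;
    return NULL;
}

int main(void)
{
    pthread_t tid[N_THREADS];

    /* Multi-threaded: both chunks can run at once on a dual-core CPU. */
    for (int i = 0; i < N_THREADS; i++)
        pthread_create(&tid[i], NULL, worker, NULL);
    for (int i = 0; i < N_THREADS; i++)
        pthread_join(tid[i], NULL);

    /* Single-threaded equivalent: the second core just sits there idle. */
    for (int i = 0; i < N_THREADS; i++)
        worker(NULL);

    puts("done");
    return 0;
}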

Nvidia quad-core CPU and 12-core GPU...WTF?!?!

Here it is folks...this is what I read...
http://www.androidcentral.com/nvidi...ampaign=Feed:+androidcentral+(Android+Central)
Yeah, it's cool and good for all the innovations, but battery life? If decent battery life is to be had, I think the optimization and battery size are going to be some kind of amazing lol
CTR01 said:
Here it is folks...this is what I read...
http://www.androidcentral.com/nvidi...ampaign=Feed:+androidcentral+(Android+Central)
Yeah, it's cool and good for all the innovations, but battery life? If decent battery life is to be had, I think the optimization and battery size are going to be some kind of amazing lol
Click to expand...
Click to collapse
They said 12 hours of HD video.
If it's going by their average designs, then it's not actual cores like you would see in a CPU. Most likely it's some type of shader core like you see in their GPU setups. More or less, a shader cluster from Fermi.
muyoso said:
They said 12 hours of HD video.
Click to expand...
Click to collapse
12 hours huh, that's freaking awesome lol!
vbetts said:
If it's going by their average designs, then it's not actual cores like you would see in a CPU. Most likely it's some type of shader core like you see in their GPU setups. More or less, a shader cluster from Fermi.
Click to expand...
Click to collapse
Hmm, well that does sound interesting indeed! We will see : )
Just to add, not saying it will be a shader cluster from Fermi, just an example of what it could be. Either way, it's looking pretty great. ARM has come a long way, and I bet soon we'll see competition to x86-64 platforms.
vbetts said:
Just to add, not saying it will be a shader cluster from Fermi, just an example of what it could be. Either way, it's looking pretty great. ARM has come a long way, and I bet soon we'll see competition to x86-64 platforms.
Click to expand...
Click to collapse
Yeah, that would be pretty cool...could have like some more in-depth or advanced phone-to-computer interaction or something lol
Small related update...
http://phandroid.com/2011/02/16/nvidia-quad-core-tablets-in-august-phones-at-christmas/
Benchmarks and possible release dates
lol?? fake?
Rumors, yes, but fake? Eh, I say plausible.
HTC HD2 w/ 2.3 : )
inubero said:
lol?? fake?
Click to expand...
Click to collapse
Those are benchmarks straight from Nvidia. That link even gives Nvidia as its source.
http://blogs.nvidia.com/2011/02/teg...-chip-worlds-first-quadcore-mobile-processor/
In all fairness to the benchmark, a T7200 (65nm) is a 5-year-old chip. So what you essentially have here are TWO extra cores, built probably on a much smaller ~30nm chip, to edge out a 5-year-old dual-core CPU.
So for a mobile chip to beat out something like a Q6600 (2.4 GHz quad-core, 45nm), it would need 8 cores (or octa-core) packed tightly into a 20nm or so chip to churn out an equivalent number. I don't think that's going to be available for another 5-7 years or so.
lude219 said:
In all fairness to the benchmark, a T7200 (65nm) is a 5-year-old chip. So what you essentially have here are TWO extra cores, built probably on a much smaller ~30nm chip, to edge out a 5-year-old dual-core CPU.
So for a mobile chip to beat out something like a Q6600 (2.4 GHz quad-core, 45nm), it would need 8 cores (or octa-core) packed tightly into a 20nm or so chip to churn out an equivalent number. I don't think that's going to be available for another 5-7 years or so.
Click to expand...
Click to collapse
Q6600 is 65nm
Not sure what the die size is of the cores inside Tegra, but it's definitely smaller and also uses a hell of a lot less power. If it were an x86 chip being compared to an x86 chip, then comparing it to a 5-year-old CPU would be a very bad comparison. But it's ARM, and ARM has come a long way in a short time. Nvidia is great at pushing limits too; Fermi may have used a ton of power and put out heat, but it performs great, and a lot of the Fermi-based cards are smaller than the competing HD5 and HD6 cards. Nvidia Ion is also easily cooled, and doesn't use a lot of power or put out a lot of heat.
How much horsepower does a mobile device really need? I voted for a combination of three, but I would be inclined to lean towards battery life.
A2Aegis said:
How much horsepower does a mobile device really need? I voted for a combination of three, but I would be inclined to lean towards battery life.
Click to expand...
Click to collapse
I'm sure you know, but after a certain point people buy things just cuz they can and to brag about it lol
...although if you have non-geeky/techy friends then it really means nothing to them, so yeah, I see it as at some point it's just no use
...unless you get like a 3D hologram/projector thing going on...then we can talk 4 cores or more.
vbetts said:
Q6600 is 65nm
Not sure what the die size is of the cores inside Tegra, but it's definitely smaller and also uses a hell of a lot less power. If it were an x86 chip being compared to an x86 chip, then comparing it to a 5-year-old CPU would be a very bad comparison. But it's ARM, and ARM has come a long way in a short time. Nvidia is great at pushing limits too; Fermi may have used a ton of power and put out heat, but it performs great, and a lot of the Fermi-based cards are smaller than the competing HD5 and HD6 cards. Nvidia Ion is also easily cooled, and doesn't use a lot of power or put out a lot of heat.
Click to expand...
Click to collapse
Thank you for the correction! You'd think I would remember my own CPU's die, right?
My previous post didn't have any ill intention at all. There are a lot of people who think these will replace desktops in the future (as noted in the recent article where sales of smartphones/tablets beat out home office computers), so my somewhat horrible analogy was to paint a picture of these two different architectures and where they're at. I can almost remember back in the day when people laughed at the idea of Nvidia branching out to making energy-efficient mobile processors. They won't be laughing now.
lude219 said:
Thank you for the correction! You'd think I would remember my own CPU's die, right?
My previous post didn't have any ill intention at all. There are a lot of people who think these will replace desktops in the future (as noted in the recent article where sales of smartphones/tablets beat out home office computers), so my somewhat horrible analogy was to paint a picture of these two different architectures and where they're at. I can almost remember back in the day when people laughed at the idea of Nvidia branching out to making energy-efficient mobile processors. They won't be laughing now.
Click to expand...
Click to collapse
I honestly wouldn't be surprised if ARM replaced x86-64 in at least mobile solutions like laptops. Windows has ARM support coming soon. But the real question here is, can companies like Nvidia or whoever makes the SoC make the chip cheap enough for mass production and retail? AMD sells quad-core CPUs for less than $100, and Intel sells CPUs with a GPU onboard the CPU chip for under $150.
Hmm, that's true, and AMD is coming out with their CPU/GPU integrated stuff soon-ish, and since they usually sell cheaper than Intel...at that price point it would be kinda hard to beat...at least for a little while, I think.

Dual Core = Overkill

I know I'm gonna get burned at the stake for this one, since this is a tech forum, but dual core is just overkill AT THE PRESENT MOMENT. It's like computers. They are all now dual-core, most come with almost 4 gigs of RAM. What in the hell would 95% of the population need AT THE MOMENT with something more powerful than that? Like a quad-core with 8 gigs? NOTHING. It's just a ploy to get more money. Our 1 GHz phones can run everything just fine. This isn't like the early days of Android where it always felt like more RAM and raw power was needed. We have hit a plateau where the current cellphone landscape fits MOST people's needs. Can I really be the only one who thinks that it's just unnecessary?
Remember, XDA only represents .0000000001% of actual real-world use. I am talking about the layman who is actually gonna fall for the "OMFG ITS GONNA DO EVERYTHING SO MUCH BETTER AND FASTER", um no it's not. Most people don't even max out their current hardware.
Edit: Seriously people, get a grip on reality. I'm not pushing my views on anyone. It's a ****ing forum, you know, one of those places where people discuss things??? The debate that has come out of this has been fantastic, and I have learned a lot of things I didn't know. I'm not gonna change my original post so as not to confuse people reading the whole topic, but I can now understand why dual core does make some sense. Quit attacking me and making stuff so personal, it's uncalled for and frankly I'm about to ask a mod to close this topic cause it's getting so ridiculous. Learn how to have a debate without letting all the emotion get in the way or GTFO. YOU'RE the one with the problem, not me.
XDA doesn't care. We like specs, maxing out our devices, and most of all, benchmarking
redbullcat said:
XDA doesn't care. We like specs, maxing out our devices, and most of all, benchmarking
Click to expand...
Click to collapse
Well, as do I! I'm talking about the uneducated masses.
more cores mean:
more threads
meaning better apps
meaning better FPS
meaning HD everything
meaning more capabilities
meaning more fun with less devices.
Do you remember the days you had a cell phone, a PDA, an MP3 player, a digital camera AND a laptop? All that was missing was your bat symbol and cape. I like not having to have a utility belt of gadgets on my person.
I would rather see them work on battery saving and density technologies to eventually allow for one week [heavy usage] times.
iamnottypingthis said:
I would rather see them work on battery saving and density technologies to eventually allow for one week [heavy usage] times.
Click to expand...
Click to collapse
Hard for you to believe, I know, but that's what having multiple cores does: it helps improve battery life (both in standby and in usage). Sure, it's not a definitive answer to our battery problems, but it's a start.
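For anyone wondering how that can be true, here's a toy sketch of the usual argument, using the rule of thumb that dynamic power scales roughly with C·V²·f. The capacitance, voltages and clocks below are made-up illustrative values, not figures for any real SoC.

Code:
/* power.c - toy "one fast core vs two slow cores" energy estimate,
 * using P_dyn ~ C * V^2 * f. All values are invented for illustration. */
#include <stdio.h>

static double energy(double cores, double volts, double ghz, double secs)
{
    const double c = 1.0;                 /* arbitrary capacitance unit */
    return cores * c * volts * volts * ghz * secs;
}

int main(void)
{
    /* Same fixed amount of work: 1 core at 1.0 GHz needs 10 s,
     * 2 cores at 0.5 GHz (at a lower voltage) also need 10 s in total. */
    double one_core  = energy(1, 1.10, 1.0, 10.0);
    double two_cores = energy(2, 0.85, 0.5, 10.0);

    printf("1 core  @ 1.0 GHz, 1.10 V: %.2f energy units\n", one_core);
    printf("2 cores @ 0.5 GHz, 0.85 V: %.2f energy units\n", two_cores);
    /* ~12.1 vs ~7.2 units: the lower voltage at the lower clock is what
     * makes the dual-core case cheaper, not the extra core by itself. */
    return 0;
}

The catch, of course, is that the saving only shows up if the workload can actually be split across both cores and the voltage really can be dropped at the lower clock.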
Hey Lude219, I thought I'd post this as I thought you gave a good explanation on battery life and usage (fifth one down).
It really all comes down to the person's requirements. If someone needs to run several apps at once, or wants to watch movies at a higher frame rate, or wants to have the 'best phone on the market', then they'll buy a dual-core phone; no-one else will care (much). Most people I talk to agree and think that dual-core in a phone is unnecessary ('dual-core phone', it even sounds ridiculous lol), but I must admit that I was surprised at how laggy my DHD was out of the packet. And don't get me wrong, I know once it's rooted it will be much better just because the SW is cleaner, but most people will not even contemplate rooting their phone, so if it's not an option for them, dual-core will surely help.
Dual-core procs don't have higher power consumption than single-core procs (or at least they won't if they design/implement them properly), so it shouldn't (fingers crossed) make power consumption any worse.
Personally, I'd also rather they put their time and effort into making better batteries and improving general power consumption.
It'll be the next marketing point after the dual-core hype has ebbed (Now with Three Days Standby!! YEY!!)
Well, I think most people who do buy these "powerful" devices have one important reason to buy, and that is to future-proof themselves. But hey, I'm looking at it from the perspective of a tech-savvy guy; I suppose the masses simply want the next best thing.
But you are right, it is a ploy to make money; then again, everything in business is, so there's no difference between dual core, one core, 8 MP camera, 5 MP, 720p, 1080p, it's all business. If there was no business then...well, where'd we get our smartphones?
lude219 said:
Hard for you to believe, I know, but that's what having multiple cores does: it helps improve battery life (both in standby and in usage). Sure, it's not a definitive answer to our battery problems, but it's a start.
Click to expand...
Click to collapse
I can easily go into why you're wrong, but I won't waste the calories. Other things besides just adding a core are done to get those gains. If more cores equaled more power savings, ULV CPUs would be octo-core.
Just a matter of time until they get battery life ironed out in smartphones, and to the OP, I would agree in some respects, but they are smartphones, why not just keep improving them? If no one ever thought outside the box we would still be stuck with dumb phones = no fun.
Here's a link for the next-gen Snapdragons, sounds promising.
I won't lie, right now dual core is overkill. But in time, like everything else computer-wise, it will become the norm and will be the way all devices go, and that's not just considering dual core. I'm talking pure multi-core threading. It's not just the number of cores you're buying either, it's the difference core to core when you compare, say, the ARM Cortex-A8 to the Tegra 2's ARM Cortex-A9: single core, the A9 will be faster and more efficient and also produce less heat thanks to the die shrink, which then also means less power draw per core. Right now for phones, dual core is future-proofing a bit for when we do have an Android that is fully multithreaded, and apps that are as well.
There's also something you need to remember: XDA isn't really a big fraction of the people using Android devices; not every Android user is on XDA. I also disagree that nobody maxes out their hardware; just running my Evo with a few of the AOSP live wallpapers, it runs terribly, and web browsing isn't the greatest either depending on the website.
Oh dude, you should so post this on overclock.net, the beatdown you would get would be hilarious. But anyway, back on topic. As for phones, well, for some people dual core is nice, for example me and my friends: when we head off to lecture, all we can do is browse the web on our phones, and all of us, for some odd reason, like to have at least 6-8 tabs open at the same time, and the phones we have (I have an iPhone 3GS, there's a couple of Captivates, a Droid Inc 2, and some others) sometimes tend to slow down with all of the tabs open. Also when you open up numerous applications, you sometimes have to close out of some of them because the one that is open starts to slow down. That's a couple of reasons that dual core is nice, with massive multitasking. But on the computer side, where you say that no one needs a quad-core processor, well think about it, there are a lot of people who want performance (not just XDA, there's overclock.net, techpowerup, EVGA, HardOCP, etc.) and just random people who want fast computers for reasons such as video processing, gaming (this is probably a big reason), ridiculous multitasking (I fall into this category cause I have over 125 tabs open in Chrome right now and I actually needed to upgrade to 8 GB of RAM because it was saying I was running out of RAM with only 4), and some people who just want plain snappiness from their computer. So I would not say that a quad-core processor is overkill for most people, as the demographic I mentioned above does include a decent number of people.
Oh, and I forgot to mention watching hi-def videos; your average Intel integrated graphics can't play a 1080p video without issues, so that's why you might need a faster processor and a faster GPU to play those videos in an HTPC.
But yes, for your average everyday Joe, a simple Nehalem-based dual core would suffice for everyday tasks such as web browsing and such, but it cannot do much else.
xsteven77x said:
I know I'm gonna get burned at the stake for this one, since this is a tech forum, but dual core is just overkill AT THE PRESENT MOMENT. It's like computers. They are all now dual-core, most come with almost 4 gigs of RAM. What in the hell would 95% of the population need AT THE MOMENT with something more powerful than that? Like a quad-core with 8 gigs? NOTHING. It's just a ploy to get more money.
Click to expand...
Click to collapse
Which is why netbooks took off for a while there (until people realized those were a bit too slow)
Our 1 GHz phones can run everything just fine. This isn't like the early days of Android where it always felt like more RAM and raw power was needed. We have hit a plateau where the current cellphone landscape fits MOST people's needs. Can I really be the only one who thinks that it's just unnecessary?
Click to expand...
Click to collapse
I completely disagree. The difference between dual and single core for mobile devices is *huge*. There is a *huge* difference between everything running "fine" and everything running "great". The biggest difference is for games and the web browser, which most people absolutely care about. There is also the wide range of more powerful apps it enables, which for now matters more on tablets, but that will come to phones as well.
Dual core is not overkill. For one, it's future-proofing your phone; most people buy their phones on contract, and in a couple of months dual cores will be the standard for high-end smartphones. Second, it allows for better GPU performance, which leads to better games and a better overall experience. There are many benefits to it, too many for me to list...
iamnottypingthis said:
I can easily go into why you're wrong, but I won't waste the calories. Other things besides just adding a core are done to get those gains. If more cores equaled more power savings, ULV CPUs would be octo-core.
Click to expand...
Click to collapse
Yea, it's better if you don't, because I don't think you have any substantial knowledge on the matter to go against the research and knowledge of all the computer engineers out there. The reason why it's not octo-cores yet is because it's called a BUSINESS. But I won't waste the calories telling you why that is until you go and read up on "economies of scale."
It'll be interesting at least to see what develops. See if they'll start doing proper separate GPU dies or if they'll dedicate GPU cores on the proc (i.e. a quad-core chip with 2 CPU cores and 2 GPU cores).
Hope people don't start to get burnt when they begin maxing out/overclocking their cores.
Funny, if you stop developing you get nothing, because you are satisfied with nothing.
We at XDA are techies; give us more cores, more RAM, more battery, and we will figure out what to create with the new abilities. That is how progress is made.
As far as the masses go, let marketing departments do their thing to them........we do not care, never did. As for me, I have a 12-core machine with 32 gigs of RAM, etc., and I push it to 85% load almost every day, and I am sure that there are very, very few computers that have these capabilities.
The funny thing is, more innovation makes for more efficiency; my computer under full load uses less power than most of the gaming rigs out there and has 50% more muscle.
On the phone, dual core allows one to create algorithms that will make battery use way more efficient.
More cores, more RAM === a win for everyone, but for us on XDA and other forums like this it is just great great great.......... don't worry, we will use whatever is created 110% and make it better.
Dual core in your Nokia 3210? Yes, that's overkill. Dual core in your CAD workstation? There it's the one being outclassed. It all depends on the user, the usage, and the design of the device.
Actually it's an arguable question whether dual-core CPUs are overkill today; they have several advantages, but most of those apply to netbooks and tablets rather than phones.
1. When there are several CPUs, multi-threaded applications can really be run concurrently (and basically, even if only one application is running, the scheduling overhead for a multi-core system is lower and background tasks like GUI/hardware drivers can be executed on a separate core).
2. Another use case (although this is a misuse and abuse of the CPU anyway) is the use of multi-core systems for encoding/decoding media. It brings absolutely no advantage to the end user, but when the CPU is powerful enough to handle the media stream, one may use it instead of a proper DSP, which Google will likely be doing for VP8/WebM.
3. SMPs can be useful in tablets and netbooks - for example, Tegra 2 will outperform Intel Atom in most cases (first of all, it is dual-core, and secondly, it has a very powerful GPU). I am personally using Debian on my tablet (in a chroot though) and many people are using Ubuntu on the Toshiba AC100 - ARM SoCs are fun to hack and give incredible battery life. But this is IMHO only acceptable for geeks like us, and I think dual-core (or x-whatever-core) ARM CPUs will be useful for consumers (hate this word, but whatever) if some vendor releases a device which will run a full-fledged Linux distro with LibreOffice, math packages like Octave/Maxima, and development environments like KDevelop, so that it can be used as an equal replacement for an x86 netbook.
As for the popular argument about power consumption - surprisingly, there is little correlation between the number of cores and power drain. Newer SoCs are more energy-efficient because they have improvements in the manufacturing process (literally the length of the wires inside the chip), more devices are integrated into one chip, and more processing blocks can be put into sleep states. Even if you compare a Qualcomm QSD8250 running at 1 GHz with the GPU enabled, it will use less power than an old 520 MHz Intel PXA270. Besides, as I have already mentioned, a multiprocessor system can execute tasks concurrently, which means that the computation will take less time and the processor will spend more time in a power-saving state.
Basically, multi-cores are a popular trend and a good way to make consumers pay for new toys. For me personally the reasons to change a device have always been either the age of the device (when it literally began to fall apart) or real improvements in hardware (I updated from an Asus P525 to an Xperia X1 because ever since I had my first PDA I was frustrated by the tiny 32 or 64 MB of RAM and awful screens with large pixels that really caused eye strain when used for long), but unfortunately the situation now is the same as it is in the desktop world - software quality is getting worse even faster than hardware improves. Hence we see crap like Java and other managed code on PDAs, and applications that require like 10 MB of RAM to perform simple functions (which took like 100 KB back in WinMo days). I do admit that using more RAM can allow for more efficient algorithms (to reduce their computational complexity) and managed code allows for higher portability - but hey, we know that commercial software is not developed with efficiency in mind - the only things corporations care about are writing the application as quickly as possible and hiding the source code.
lude219 said:
Yea, it's better if you don't, because I don't think you have any substantial knowledge on the matter to go against the research and knowledge of all the computer engineers out there. The reason why it's not octo-cores yet is because it's called a BUSINESS. But I won't waste the calories telling you why that is until you go and read up on "economies of scale."
Click to expand...
Click to collapse
That, and yields for 8-core Nehalems aren't so high. Bulldozer yields are working out okay so far, but then again it's not a real 8-core CPU...

Amazon tablet "with Intel innards"

http://technolog.msnbc.msn.com/_news/2011/08/26/7485225-is-this-what-amazons-tablet-will-look-like
The news today is that Amazon will soon be selling a new Android powered tablet at a very attractive price. According to this... http://www.technobuffalo.com/technobuffalo/opinion/noahs-top-5-gadgets-coming-soon/ "Amazon’s tablet computer will run Android but feature Intel-based innards, and not an Nvidia Tegra processor"
Forgive my ignorance, but are there any devices shipping right now running Android on an Intel platform? What do those of you who know much more than me about the subject think an Intel-powered Android tablet means to XDA?
Good price, I am looking forward to seeing it.
Spyvie said:
Forgive my ignorance, but are there any devices shipping right now running Android on an Intel platform?
Click to expand...
Click to collapse
There is no "Intel" platform. There's ARM and x86 and MIPS, and a few others. PCs (including Macs) use x86, phones and tablets like iPad and Android-based ones are ARM, routers are usually MIPS.
If this Amazon tablet will have Intel innards, it probably means an Atom processor, which would make it the first Android tablet using x86 instead of ARM (Atom is an x86 processor).
Spyvie said:
What do those of you who know much more than me about the subject think an Intel-powered Android tablet means to XDA?
Click to expand...
Click to collapse
Main thing, a much higher possibility to be able to run a traditional Linux distro on it. Beyond that, we'll have to see.
Hmm.. I'd rather see an ARM Cortex-A9. ARM seems to work really well in phones and tablets, and the A9 is just as fast as the Intel if I remember correctly.
sk8aseth said:
Hmm.. I'd rather see an ARM Cortex-A9. ARM seems to work really well in phones and tablets, and the A9 is just as fast as the Intel if I remember correctly.
Click to expand...
Click to collapse
It is, clock for clock, but Intel processors scale higher.
I thought the Intel mobile platform running Honeycomb wasn't performing too well, last I heard. Battery life is a huge concern too.
sk8aseth said:
Hmm.. I'd rather see an ARM Cortex-A9. ARM seems to work really well in phones and tablets, and the A9 is just as fast as the Intel if I remember correctly.
Click to expand...
Click to collapse
At lower clocks yes, but as mentioned before, once you start increasing the power usage and clocks, that's when ARM can't keep up and when x86 shines. ARM is normally only more powerful than x86 CPUs while the TDP is around a watt, which is the case for most mobile devices. However, try scaling an ARM CPU to over 125 watts (an average x86 TDP) and the scaling is horrible. They are two different platforms built for two different reasons. BUT what I'm thinking Intel is doing here is that they are going to maybe try and muscle their way into the ARM market. With the announcement that Windows 8 will support ARM, I think this chip will be their "test" run. If they do get into the market, it will open up an entire new horizon for them (hope that sounds right). This would especially help with the ultrabook concept they are working on.
Do the Android netbooks run x86?
Sent from my Galaxy Tab using Tapatalk
rustyshack3 said:
Do the Android netbooks run x86?
Click to expand...
Click to collapse
The one netbook with Android I saw a while ago was x86, yes. It was a dual-boot Android/Windows machine. Though Android on a netbook makes no sense. None whatsoever.
However, there are ARM-based Android netbooks too, like the Toshiba AC100.
dreadlord369 said:
At lower clocks yes, but as mentioned before, once you start increasing the power usage and clocks, that's when ARM can't keep up and when x86 shines. ARM is normally only more powerful than x86 CPUs while the TDP is around a watt, which is the case for most mobile devices. However, try scaling an ARM CPU to over 125 watts (an average x86 TDP) and the scaling is horrible. They are two different platforms built for two different reasons. BUT what I'm thinking Intel is doing here is that they are going to maybe try and muscle their way into the ARM market. With the announcement that Windows 8 will support ARM, I think this chip will be their "test" run. If they do get into the market, it will open up an entire new horizon for them (hope that sounds right). This would especially help with the ultrabook concept they are working on.
Click to expand...
Click to collapse
You can actually thank Apple for lighting a fire under Intel on the mobile platform when they told them that their processors weren't cutting it and that they would use ARM-based processors in their laptops.
For whatever reason, Apple is burning a lot of bridges and somewhat putting all of their eggs in one basket before confirming that TSMC's yields are good enough to meet their demands.
ph00ny said:
You can actually thank Apple for lighting a fire under Intel on the mobile platform when they told them that their processors weren't cutting it and that they would use ARM-based processors in their laptops.
For whatever reason, Apple is burning a lot of bridges and somewhat putting all of their eggs in one basket before confirming that TSMC's yields are good enough to meet their demands.
Click to expand...
Click to collapse
Yeah, I actually remember reading about that. I believe they mentioned that their next gen would use ARM-based CPUs, or is my head making that up? Either way, this should help Intel in the long run (if their ARM line is successful). Yeah, and they did the same thing with Samsung, they stopped using their chips and parts. I'm guessing they want to make everything in-house. I don't know how that's gonna work out, but hey, they will probably pull through judging from past actions and events.
EDIT: I realized something, they don't want to keep everything in-house, but rather, they don't want major competitors (such as Samsung) making parts for them.
dreadlord369 said:
Yeah, I actually remember reading about that. I believe they mentioned that their next gen would use ARM-based CPUs, or is my head making that up? Either way, this should help Intel in the long run (if their ARM line is successful). Yeah, and they did the same thing with Samsung, they stopped using their chips and parts. I'm guessing they want to make everything in-house. I don't know how that's gonna work out, but hey, they will probably pull through judging from past actions and events.
EDIT: I realized something, they don't want to keep everything in-house, but rather, they don't want major competitors (such as Samsung) making parts for them.
Click to expand...
Click to collapse
I don't think Intel is going to build an ARM processor; instead they'll focus heavily on power efficiency in their current low-power processor variants. Remember that ultrabooks will utilize various Core i processors.
As for Apple, as said many times before, they do not make anything in-house.
ph00ny said:
I don't think Intel is going to build an ARM processor; instead they'll focus heavily on power efficiency in their current low-power processor variants. Remember that ultrabooks will utilize various Core i processors.
As for Apple, as said many times before, they do not make anything in-house.
Click to expand...
Click to collapse
About the in-house part, yeah, I realized and corrected that with the edit. My bad. But for the ultrabooks, you're right, I forgot that they are going to use the Core i variants. But while you may be right and Intel might be releasing an ultra-efficient Atom architecture, I dunno, I still think that a move into the ARM market would be a smart move for them.

[Q] Most badass GPU and CPU in da world; Expert Knowledge please :)

I've been doing quite a bit of research on GPUs and CPUs in phones/tablets lately. And I have a few unanswered questions that I can't seem to find an answer for.
1: What's the best chipset available for mobile phones and tablets right now? This link cleared quite a bit up for me; it does a fairly in-depth comparison of both GPU and CPU performance between the Qualcomm S4, Tegra 3, OMAP 4470, and the Exynos 4212. And I don't want 'Well this is better because it has more jiggahertz'. Shut up, that's not what I need. I need something more in-depth. If studies on individual GPU comparisons can be provided, please drop a link. I'd like to know these things very well.
2: What individual GPU is currently the best? I realize the iPad 3 came out with a graphics chip that's supposedly superior to the Xbox/PS3's. However, I take anything Apple says with a grain of salt; they're notorious for shooting flaming BS out of their rear. However, based on the little bit of searching I've done, the Adreno GPUs seem to be ahead of their time. I previously thought the Mali-400 GPU in the Exynos chipset was one of the best, but apparently it's outdated. Again, links to tests/studies/comparisons would be appreciated.
3: What's the deal with the ARM chips? Are the A5s, A6s, A11s (and whatever other A chips are out there) some standard CPU developed by ARM and licensed out to all manufacturers to use in their chipsets?
4: What alternatives are there to the ARM CPUs? Most chipsets I research seem to be using a Cortex-A9 chip.
5: What's the difference between the A5, A6, A9, etc.? From what I've seen the higher numbers are the newer models, but I feel like that's a very shallow definition. If that is true, why does the newest iPad only use an A5X chip for its quad core rather than an A9 or something of the sort?
6: Is the chipset in the iPad really the fastest out there? Personally, I can't really stand Apple products, let alone the rabid fanboys and the obnoxious advertisements they put out. I can recognize that they very often gloat about their products and exaggerate, like how they said the dual core in the iPhone 4S is the fastest out there, yet from what I've read the A5 is the worst-performing dual core out there. Is the GPU in the tablet really superior to the Xbox's? And is the processor really able to outdo the Tegra 3?
If you're able to answer any one of these, even exclusively, that would be appreciated. I just like knowledge
MultiLockOn said:
I've been doing quite a bit of research on GPUs and CPUs in phones/tablets lately. And I have a few unanswered questions that I can't seem to find an answer for.
1: What's the best chipset available for mobile phones and tablets right now? This link cleared quite a bit up for me; it does a fairly in-depth comparison of both GPU and CPU performance between the Qualcomm S4, Tegra 3, OMAP 4470, and the Exynos 4212. And I don't want 'Well this is better because it has more jiggahertz'. Shut up, that's not what I need. I need something more in-depth. If studies on individual GPU comparisons can be provided, please drop a link. I'd like to know these things very well.
2: What individual GPU is currently the best? I realize the iPad 3 came out with a graphics chip that's supposedly superior to the Xbox/PS3's. However, I take anything Apple says with a grain of salt; they're notorious for shooting flaming BS out of their rear. However, based on the little bit of searching I've done, the Adreno GPUs seem to be ahead of their time. I previously thought the Mali-400 GPU in the Exynos chipset was one of the best, but apparently it's outdated. Again, links to tests/studies/comparisons would be appreciated.
3: What's the deal with the ARM chips? Are the A5s, A6s, A11s (and whatever other A chips are out there) some standard CPU developed by ARM and licensed out to all manufacturers to use in their chipsets?
4: What alternatives are there to the ARM CPUs? Most chipsets I research seem to be using a Cortex-A9 chip.
5: What's the difference between the A5, A6, A9, etc.? From what I've seen the higher numbers are the newer models, but I feel like that's a very shallow definition. If that is true, why does the newest iPad only use an A5X chip for its quad core rather than an A9 or something of the sort?
6: Is the chipset in the iPad really the fastest out there? Personally, I can't really stand Apple products, let alone the rabid fanboys and the obnoxious advertisements they put out. I can recognize that they very often gloat about their products and exaggerate, like how they said the dual core in the iPhone 4S is the fastest out there, yet from what I've read the A5 is the worst-performing dual core out there. Is the GPU in the tablet really superior to the Xbox's? And is the processor really able to outdo the Tegra 3?
If you're able to answer any one of these, even exclusively, that would be appreciated. I just like knowledge
Click to expand...
Click to collapse
1. Dunno right now, it's always changing. I hear the new Qualcomm processors with the new Adreno GPU are supposed to be the ****, but they're not out yet so who knows. The iPad 3 currently hasn't had any real-world tests done yet, we need to wait for release. It is basically the same A5 chip as the iPad 2 but with the PS Vita's GPU thrown in.
2. *sigh* The iPad 3 is not more powerful than an Xbox 360. It is better in I believe one aspect (more memory), but this has very little impact on performance/graphics quality. This is Apple shooting wads of **** out of its arse, or whoever made the claim. It's actually using the same GPU found in the PS Vita, which we all know is not as powerful as a PS3/Xbox 360. However, the PS Vita is also using a quad-core CPU, whereas the iPad 3 is using the same dual-core A5 as the iPad 2, so technically the PS Vita is superior. You also have to consider how many more pixels the GPU has to power on the iPad 3's display. While high res is nice, it takes more power to render it.
3. ARM creates a base chip design for companies to slap their own GPUs and name on. The naming structure is pretty self-explanatory.
4. All CPUs currently in tablets/cellphones are a variant of ARM. A Cortex-A9 is still an ARM chip. This will soon change when Intel releases their tablet/phone chips.
5. You're right, higher numbers do mean newer models. I don't know all the exact details, but with the newer ARM series you get higher and/or more efficient clocks, generally some battery savings, and in some series support for more cores. Apple's labeling of their chips has nothing to do with ARM's, it's their own naming scheme. The A5X is just what Apple calls their version of the ARM processor.
6. I believe atm the iPad 3 has the fastest chipset in a tablet..for now. It won't take long for it to be overtaken by other companies, there's so much in the works right now.
speedyink said:
1. Dunno right now, it's always changing. I hear the new Qualcomm processors with the new Adreno GPU are supposed to be the ****, but they're not out yet so who knows. The iPad 3 currently hasn't had any real-world tests done yet, we need to wait for release. It is basically the same A5 chip as the iPad 2 but with the PS Vita's GPU thrown in.
2. *sigh* The iPad 3 GPU is not more powerful than an Xbox 360. It is better in I believe one aspect (more memory), but this has very little impact on performance/graphics quality. This is Apple shooting wads of **** out of its arse, or whoever made the claim. It's actually using the same GPU found in the PS Vita, which we all know is not as powerful as a PS3/Xbox 360. However, the PS Vita is also using a quad-core CPU, whereas the iPad 3 is using the same dual-core A5 as the iPad 2, so technically the PS Vita is superior.
3. ARM creates a base chip design for companies to slap their own GPUs and name on. The naming structure is pretty self-explanatory.
4. All CPUs currently in tablets/cellphones are a variant of ARM. A Cortex-A9 is still an ARM chip. This will soon change when Intel releases their tablet/phone chips.
5. You're right, higher numbers do mean newer models. I don't know all the exact details, but with the newer ARM series you get higher and/or more efficient clocks, generally some battery savings, and in some series support for more cores. Apple's labeling of their chips has nothing to do with ARM's, it's their own naming scheme. The A5X is just what Apple calls their version of the ARM processor.
6. I believe atm the iPad 3 has the fastest chipset in a tablet..for now. It won't take long for it to be overtaken by other companies, there's so much in the works right now.
Click to expand...
Click to collapse
Thanks for the reply. It seems weird to me that Apple would name a CPU something so similar to one that already exists, A5X vs A5.
MultiLockOn said:
Thanks for the reply. It seems weird to me that Apple would name a CPU something so similar to one that already exists, A5X vs A5.
Click to expand...
Click to collapse
Because Apple is the type of company to step on someone's feet like that, and then sue them later on for copyright infringement. Damn the confusion, Apple starts with A, so will their processors.
speedyink said:
Because Apple is the type of company to step on someone's feet like that, and then sue them later on for copyright infringement. Damn the confusion, Apple starts with A, so will their processors.
Click to expand...
Click to collapse
Yeah, Apple just buys a technology and re-labels it, patents it, and trolls others, so for comparison, Apple doesn't count. Also, these handheld chipsets can't be compared with consoles; consoles have more processing power, like more RAM bandwidth and polygons.
Anyway.. based on my experience, the Mali-400 Exynos has buttery smooth performance for both UI and 3D graphics. I've tried both the Gingerbread GNote and my SGS2.
On the other hand, Google did a great job with the TI OMAP in its Galaxy Nexus, pure HW-accelerated 4.0.3.. with very few glitches, but I believe that's a software issue.
IMO if you wanna buy a fast and smooth device, follow the current Nexus spec (at least something similar), like the GNexus, Motorola RAZR, etc. I've seen the Tegra 3 4+1 Transformer Prime but never gone hands-on with it; as far as I've seen, UI and 3D performance are stunning. The 1 extra core's advantage is a low-power mode when doing light processing and in standby. Today's hardware is fast enough; drivers and OS optimisation are the really important things if you want everything to run smoothly.
CMIIW, sorry for bad English
lesp4ul said:
Yeah, Apple just buys a technology and re-labels it, patents it, and trolls others, so for comparison, Apple doesn't count. Also, these handheld chipsets can't be compared with consoles; consoles have more processing power, like more RAM bandwidth and polygons.
Anyway.. based on my experience, the Mali-400 Exynos has buttery smooth performance for both UI and 3D graphics. I've tried both the Gingerbread GNote and my SGS2.
On the other hand, Google did a great job with the TI OMAP in its Galaxy Nexus, pure HW-accelerated 4.0.3.. with very few glitches, but I believe that's a software issue.
IMO if you wanna buy a fast and smooth device, follow the current Nexus spec (at least something similar), like the GNexus, Motorola RAZR, etc. I've seen the Tegra 3 4+1 Transformer Prime but never gone hands-on with it; as far as I've seen, UI and 3D performance are stunning. The 1 extra core's advantage is a low-power mode when doing light processing and in standby. Today's hardware is fast enough; drivers and OS optimisation are the really important things if you want everything to run smoothly.
CMIIW, sorry for bad English
Click to expand...
Click to collapse
I know what you mean. I'm extremely happy with my Galaxy S2; I can't say I ever recall it lagging on me in any way whatsoever. I'm not sure what makes the Droid RAZR and Galaxy Nexus comparable to the S2. From what I've read, OMAP processors tend to lag and consume battery, and the Mali-400 is better than what either of those phones has. I'd say it's ICS, but the RAZR still runs Gingerbread.
I was hoping for some more attention in here :/
I agree, OMAPs are battery-hungry beasts. Like my previous Optimus Black, man... I only got 12-14 hours on EDGE (1 GHz UV Smartass V2, also ****ty LG kernel haha). Same issue as my friend's Galaxy SL. I dunno if newer SoCs behave better.
Sent from my Nokia 6510 using PaperPlane™
