We've all been seeing the latest devices from other manufacturers, like Moto's Droid X and Samsung's Galaxy S, that can perform better than our HTC devices with similar specs using the Snapdragon SoC, like the Desire in my case, or the Evo 4G, etc. I caught myself doing what many people do here and was ashamed of it. I was criticizing my device very strictly and almost got myself thinking that it's not good, almost trying and wanting to find problems in it. Yes, it doesn't make my coffee well :p, but wtf? It's a phone, not a damn supercomputer!
Then I came to think of it a little more seriously... Samsung and Moto brought out those devices more than a year after HTC, so they had more time to make the hardware better and more powerful. Now that Qualcomm has dual-core Snapdragons while the others still have single-core processors, they seem to be in the lead again. Can we count on that? Will they change what we define as high-end again? Or are they what some call the worst choice available? Should HTC cooperate with another SoC manufacturer, like Texas Instruments?
And something off-topic... Will the new Snapdragon 2 be based on ARM's Cortex-A9 or the A8?
Thanks in advance guys! And sorry if I'm becoming a nuisance with all these questions!
I'm just wondering when we'll see heatsinks and little fans for the processor on the back of our phones.
Yeah! Can you imagine carrying around a phone that has a water cooling system?
But no, newer phones are more power efficient, and less power consumption means less heat. No wonder my Desire doesn't get any hotter than my Hero!
I know I'm gonna get burned at the stake for this one, since this is a tech forum, but dual-core is just overkill AT THE PRESENT MOMENT. It's like computers: they're all dual-core now, and most come with almost 4 GB of RAM. What in the hell would 95% of the population need AT THE MOMENT with something more powerful than that? Like a quad-core with 8 GB? NOTHING. It's just a ploy to get more money. Our 1 GHz phones can run everything just fine. This isn't like the early days of Android, where it always felt like more RAM and raw power were needed. We've hit a plateau where the current cellphone landscape fits MOST people's needs. Can I really be the only one who thinks it's just unnecessary?
Remember, XDA only represents .0000000001% of actual real-world use. I am talking about the layman who is actually gonna fall for the "OMFG IT'S GONNA DO EVERYTHING SO MUCH BETTER AND FASTER". Um, no it's not. Most people don't even max out their current hardware.
Edit: Seriously, people, get a grip on reality. I'm not pushing my views on anyone. It's a ****ing forum, you know, one of those places where people discuss things??? The debate that has come out of this has been fantastic, and I have learned a lot of things I didn't know. I'm not gonna change my original post, to avoid confusing people reading the whole topic, but I can now understand why dual-core does make some sense. Quit attacking me and making things so personal; it's uncalled for, and frankly I'm about to ask a mod to close this topic because it's getting so ridiculous. Learn how to have a debate without letting all the emotion get in the way, or GTFO. YOU'RE the one with the problem, not me.
XDA doesn't care. We like specs, maxing out our devices, and most of all, benchmarking.
redbullcat said:
XDA doesn't care. We like specs, maxing out our devices, and most of all, benchmarking.
Well, as do I! I'm talking about the uneducated masses.
More cores mean:
- more threads,
- meaning better apps,
- meaning better FPS,
- meaning HD everything,
- meaning more capabilities,
- meaning more fun with fewer devices.
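To put the "more cores mean more threads" chain in concrete terms, here's a minimal Python sketch; everything in it (the worker function, the chunk counts) is invented purely for illustration. On a multi-core chip, independent chunks of CPU-bound work really do run side by side:

```python
# Minimal illustration: CPU-bound work split across worker processes
# finishes faster on a multi-core chip. CPython threads can't run
# CPU-bound work in parallel because of the GIL, so processes stand
# in for hardware threads here.
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """Stand-in for one chunk of a frame render or decode step."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def run(workers, chunks=4, n=2_000_000):
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(busy_work, [n] * chunks))
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"1 worker : {run(1):.2f}s")  # chunks run one after another
    print(f"4 workers: {run(4):.2f}s")  # chunks run side by side on 4+ cores
```

Real apps see smaller gains than this ideal case, since not everything parallelises, but that's the mechanism behind "better FPS" and "more capabilities".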
Do you remember the days when you had a cell phone, a PDA, an MP3 player, a digital camera AND a laptop? All that was missing was your bat symbol and cape. I like not having to carry a utility belt of gadgets on my person.
I would rather see them work on battery-saving and density technologies to eventually allow for one-week [heavy usage] battery life.
iamnottypingthis said:
I would rather see them work on battery-saving and density technologies to eventually allow for one-week [heavy usage] battery life.
Hard for you to believe, I know, but that's what having multiple cores does: it helps improve battery life (both in standby and in use). Sure, it's not a definitive answer to our battery problems, but it's a start.
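For what it's worth, the usual form of that argument is "race to idle": finish the work sooner, then drop the whole chip into deep sleep. Here it is as a toy calculation; every number below is invented for illustration and measured from no real SoC:

```python
# Toy race-to-idle model (all numbers invented for illustration).
# Two active cores draw more power, but finishing the same job
# sooner and then sleeping can cost less total energy.
ACTIVE_MW = {1: 500, 2: 900}  # assumed draw while computing, by core count
IDLE_MW = 20                  # assumed deep-sleep draw

def energy_mj(cores, job_s_on_one_core=10.0, window_s=10.0):
    busy = job_s_on_one_core / cores  # assume ideal parallel speedup
    idle = window_s - busy
    return ACTIVE_MW[cores] * busy + IDLE_MW * idle  # mW * s = mJ

print(f"1 core : {energy_mj(1):.0f} mJ")  # 500*10 + 20*0 = 5000 mJ
print(f"2 cores: {energy_mj(2):.0f} mJ")  # 900*5  + 20*5 = 4600 mJ
```

Whether the second core actually wins depends entirely on the real power numbers and on how parallel the workload is, which is why this is an argument and not a law.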
Hey Lude219, I thought I'd post this since you gave a good explanation of battery life and usage (fifth one down).
It really all comes down to the person's requirements. If someone needs to run several apps at once, or to watch movies at a higher frame rate, or to have the "best phone on the market", then they'll buy a dual-core phone; no one else will care (much). Most people I talk to agree that dual-core in a phone is unnecessary ("dual-core phone": it even sounds ridiculous, lol). But I must admit I was surprised at how laggy my DHD was out of the box, and don't get me wrong, I know once it's rooted it will be much better just because the software is cleaner. Most people will not even contemplate rooting their phone, though, so if that's not an option for them, dual-core will surely help.
Dual-core processors don't have higher power consumption than single-core ones (or at least they won't if they're designed and implemented properly), so it shouldn't (fingers crossed) make power consumption any worse.
Personally, I'd also rather they put their time and effort into making better batteries and improving general power consumption.
It'll be the next marketing point after the dual-core hype has ebbed (Now with Three Days of Standby!! YAY!!).
Well, I think most people who buy these "powerful" devices have one important reason to buy: to future-proof themselves. But hey, I'm looking at it from the perspective of a tech-savvy guy; I suppose the masses simply want the next best thing.
But you are right, however: it is a ploy to make money. Then again, everything in business is, so there's no difference between dual-core and single-core, an 8 MP camera and a 5 MP one, 720p and 1080p; it's all business. And if there were no business, then... well, where would we get our smartphones?
lude219 said:
Hard for you to believe, I know, but that's what having multiple cores does: it helps improve battery life (both in standby and in use). Sure, it's not a definitive answer to our battery problems, but it's a start.
I can easily go into why you're wrong, but I won't waste the calories. Other things besides just adding a core are done to get those gains. If more cores equaled more power savings, ULV CPUs would be octo-core.
Just a matter of time until they get battery life ironed out in smartphones. And to the OP: I agree in some respects, but they are smartphones, so why not keep improving them? If no one ever thought outside the box, we'd still be stuck with dumb phones = no fun.
Here's a link for the next-gen Snapdragons; sounds promising.
I won't lie: right now dual-core is overkill. But in time, like everything else in computing, it will become the norm and the way all devices go, and that's not just dual-core; I'm talking pure multi-core threading. It's not just the number of cores you're buying, either; it's the core-to-core difference when you compare, say, the ARM Cortex-A8 to the Tegra 2's Cortex-A9: core for core, the A9 will be faster and more efficient, and it also produces less heat thanks to the die shrink, which then also means less power draw per core. Right now for phones, dual-core is a bit of future-proofing for when we have an Android that is fully multithreaded, and apps that are as well.
There's also something you need to remember: XDA isn't really a big fraction of the people using Android devices; not every Android user is on XDA. I also disagree that nobody maxes out their hardware: just running a few of the AOSP live wallpapers makes my Evo run terribly, and web browsing isn't the greatest either, depending on the website.
Oh dude, you should so post this on overclock.net; the beatdown you would get would be hilarious. But anyway, back on topic. For some people dual-core is nice, for example me and my friends: when we head off to lecture, all we can do is browse the web on our phones, and all of us, for some odd reason, like to have at least 6-8 tabs open at the same time. On the phones we have (I have an iPhone 3GS, there are a couple of Captivates, a Droid Inc 2, and some others), they sometimes tend to slow down with all those tabs open. Also, when you open numerous applications, you sometimes have to close some of them because the one in the foreground starts to slow down. Those are a couple of reasons dual-core is nice for heavy multitasking. As for the computer part, where you say no one needs a quad-core processor: think about it. There are a lot of people who want performance (not just XDA; there's overclock.net, TechPowerUp, EVGA, HardOCP, etc.) and plenty of ordinary people who want fast computers for reasons such as video processing, gaming (probably a big one), ridiculous multitasking (I fall into this category, since I have over 125 tabs open in Chrome right now and actually needed to upgrade to 8 GB of RAM because it said I was running out with only 4), and some who just want plain snappiness from their computer. So I would not say a quad-core processor is overkill for most people, as the demographic I mentioned above includes a decent number of people.
Oh, and I forgot to mention watching hi-def videos: your average Intel integrated graphics chip cannot play a 1080p video without issues, so that's why you might need a faster processor and a faster GPU to play those videos on an HTPC.
But yes, for your average everyday Joe, a simple Nehalem-based dual-core would suffice for everyday tasks such as web browsing, but it can't do much else.
xsteven77x said:
I know I'm gonna get burned at the stake for this one, since this is a tech forum, but dual-core is just overkill AT THE PRESENT MOMENT. It's like computers: they're all dual-core now, and most come with almost 4 GB of RAM. What in the hell would 95% of the population need AT THE MOMENT with something more powerful than that? Like a quad-core with 8 GB? NOTHING. It's just a ploy to get more money.
Which is why netbooks took off for a while there (until people realized they were a bit too slow).
Our 1 GHz phones can run everything just fine. This isn't like the early days of Android, where it always felt like more RAM and raw power were needed. We've hit a plateau where the current cellphone landscape fits MOST people's needs. Can I really be the only one who thinks it's just unnecessary?
I completely disagree. The difference between dual and single core for mobile devices is *huge*. There is a *huge* difference between everything running "fine" and everything running "great". The biggest difference is in games and the web browser, which most people absolutely care about. There is also the wide range of more powerful apps it enables, which for now matters more on tablets, but that will come to phones as well.
Dual-core is not overkill. For one, it's future-proofing your phone: most people buy their phones on contract, and in a couple of months dual-core will be the standard for high-end smartphones. Second, it allows for better GPU performance, which leads to better games and a better overall experience. There are many benefits to it, too many for me to list...
iamnottypingthis said:
I can easily go into why you're wrong, but I won't waste the calories. Other things besides just adding a core are done to get those gains. If more cores equaled more power savings, ULV CPUs would be octo-core.
Yea, it's better if you don't, because I don't think you have any substantial knowledge on the matter to go against the research and knowledge of all the computer engineers out there. The reason it's not octo-cores yet is because it's a BUSINESS. But I won't waste the calories telling you why until you go and read up on "economies of scale."
It'll be interesting, at least, to see what develops: whether they'll start doing proper separate GPU dies, or dedicate GPU cores on the processor (i.e., a quad-core chip with 2 CPU cores and 2 GPU cores).
Hope people don't start to get burnt when they begin maxing out/overclocking their cores.
Funny; if you stop developing, you get nothing, because you are satisfied with nothing.
We at XDA are techies: give us more cores, more RAM, and more battery, and we will figure out what to create with the new abilities. That is how progress is made.
As for the masses, let the marketing departments do their thing to them... we do not care, never did. As for me, I have a 12-core motherboard with 32 GB of RAM, etc., and I push it to 85% load almost every day, and I am sure there are very, very few computers with these capabilities.
The funny thing is that more innovation brings more efficiency: my computer under full load uses less power than most of the gaming rigs out there and has 50% more muscle.
On the phone, dual-core allows one to create algorithms that will make battery use far more efficient.
More cores and more RAM === a win-win-win for everyone; for us on XDA and other forums like this it is just great, great, great... don't worry, we will use whatever is created 110% and make it better.
A dual-core in your Nokia 3210 is overkill; in your CAD workstation it's nowhere near enough. It all depends on the user, the usage, and the design of the device.
Actually, it's an arguable question whether dual-core CPUs are overkill today; they have several advantages, but most of those apply to netbooks and tablets rather than phones.
1. When there are several CPU cores, multi-threaded applications can run truly concurrently (and basically, even when only one application is working, the scheduling overhead of a multi-core system is lower, and background tasks like the GUI and hardware drivers can be executed on a separate core).
2. Another use case (although this is arguably a misuse and abuse of the CPU) is using multi-core systems for encoding/decoding media. It brings no inherent advantage to the end user, but when the CPU is powerful enough to handle the media stream, one may use it instead of a proper DSP, which is likely what Google will be doing for VP8/WebM.
3. SMP can be useful in tablets and netbooks: for example, Tegra 2 will outperform the Intel Atom in most cases (first of all, it is dual-core, and secondly, it has a very powerful GPU). I am personally using Debian on my tablet (in a chroot, though), and many people are using Ubuntu on the Toshiba AC100; ARM SoCs are fun to hack and give incredible battery life. But this is IMHO only acceptable for geeks like us, and I think dual-core (or whatever-core) ARM CPUs will be useful for consumers (hate this word, but whatever) if some vendor releases a device that runs a full-fledged Linux distro with LibreOffice, math packages like Octave/Maxima, and development environments like KDevelop, so that it can be used as an equal replacement for an x86 netbook.
As for the popular argument about power consumption: surprisingly, there is little correlation between the number of cores and power drain. Newer SoCs are more energy efficient because of improvements in the manufacturing process (literally the length of the wires inside the chip), because more devices are integrated into one chip, and because more processing blocks can be put into sleep states. Even a Qualcomm QSD8250 running at 1 GHz with the GPU enabled will use less power than an old 520 MHz Intel PXA270. Besides, as I have already mentioned, a multiprocessor system can execute tasks concurrently, which means the computation will take less time and the processor will spend more time in a power-saving state.
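To put a rough number on "the computation will take less time", here is a quick Amdahl's-law sketch; the serial fractions below are made-up examples, not measurements:

```python
# Amdahl's law: the speedup from N cores is capped by the serial
# fraction s of the workload. The fractions below are examples.
def speedup(n_cores, s):
    return 1.0 / (s + (1.0 - s) / n_cores)

for s in (0.05, 0.25, 0.50):
    print(f"serial {s:.0%}: 2 cores -> {speedup(2, s):.2f}x, "
          f"4 cores -> {speedup(4, s):.2f}x")
# serial 5%:  2 cores -> 1.90x, 4 cores -> 3.48x
# serial 50%: 2 cores -> 1.33x, 4 cores -> 1.60x
```

So the busy period shrinks substantially only when the workload parallelises well; a mostly serial task keeps one core awake nearly as long as before.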
Basically, multi-core is a popular trend and a good way to make consumers pay for new toys. For me personally, the reasons to change devices have always been either the age of the device (when it literally began to fall apart) or real improvements in hardware (I upgraded from an Asus P525 to the Xperia X1 because, ever since my first PDA, I was frustrated by the tiny 32 or 64 MB of RAM and the awful screens with large pixels that really caused eye strain with long use).

Unfortunately, the situation now is the same as in the desktop world: software quality is getting worse even faster than hardware improves. Hence we see things like Java and other managed code on PDAs, and applications that need around 10 MB of RAM to perform simple functions (which took about 100 KB back in the WinMo days). I do admit that using more RAM can allow more efficient algorithms (reducing their computational complexity), and managed code allows for higher portability. But hey, we know commercial software is not developed with efficiency in mind: the only things corporations care about are writing the application as quickly as possible and hiding the source code.
lude219 said:
Yea, it's better if you don't, because I don't think you have any substantial knowledge on the matter to go against the research and knowledge of all the computer engineers out there. The reason it's not octo-cores yet is because it's a BUSINESS. But I won't waste the calories telling you why until you go and read up on "economies of scale."
That, and yields for 8-core Nehalems aren't so high. Bulldozer yields are working out okay so far, but then again, it's not a real 8-core CPU...
With all the advancements in processors, such as the Exynos and the new 32 nm chips, why aren't there any phones with 2 GB of RAM? You could easily tone the processor down to a 1.2 GHz dual-core and the RAM would take on a lot of the work. And the phone would be way fast.
On the same note, why not empower the GPU to do more work and make it a 1.2 GHz dual-core as well? That would really light a fire under graphics rendering and also make something like Ubuntu on Android, while in a dock, work so much better.
What does the community here think about this? Let the ideas flow.
jaed.43725 said:
With all the advancements in processors, such as the Exynos and the new 32 nm chips, why aren't there any phones with 2 GB of RAM? You could easily tone the processor down to a 1.2 GHz dual-core and the RAM would take on a lot of the work. And the phone would be way fast.
On the same note, why not empower the GPU to do more work and make it a 1.2 GHz dual-core as well? That would really light a fire under graphics rendering and also make something like Ubuntu on Android, while in a dock, work so much better.
What does the community here think about this? Let the ideas flow.
I completely agree. I think the average user is just starting to learn about CPUs and cores. Once average people hear the speed and the number of cores, they immediately jump to conclusions like "Wow, that's fast!" or "That's slow." If you throw in the other terms, I think the average person would be turned off by the phone. Basically like the Mac is for people who are not really good with computers: virtually maintenance-free.
If that were to happen we would get faster phones, but is the technology there to do what you said? Do we have small enough batteries to power these devices for at least a day? I think America is more stuck on the idea of higher data speeds than on the speed of the phones, because hey, America is addicted to Facebook!
Yes, it would seem that America has been dumbed down by iEverything. If people would just realize that a faster phone equates to a faster network, there would be more push.
The technology is there to both build it and power it. However, buyers will need to snap out of the trance that says the phone must be as thin as possible. The average user, and even the techie, won't notice the difference between 11 mm and 16 mm if it fits a 3000 mAh battery.
Sent from my SCH-I500 using XDA App
Thread moved to Q&A as it is a question. I would advise you to read the forum rules and post in the correct section.
Failure to comply with the forum rules will result in an infraction and/or ban, depending on the severity of the rule break.
I wasn't entirely sure where it should have gone, since it's an opinion question rather than a technical one. Thank you for the clarification.
I think there is a problem in the manufacturing technology.
Dedokrus said:
I think there is a problem in the manufacturing technology.
Dedokrus: Could you explain that?
As far as I can see, if there is a 32 nm quad-core CPU running at 1.5 GHz, then 2 GB of RAM and a beefier GPU are well within the realm of current manufacturing.
Does anyone have any insight into this, like someone who has been part of phone development?
I've been doing quite a bit of research on GPUs and CPUs in phones/tablets lately, and I have a few unanswered questions that I can't seem to find answers for.
1: What's the best chipset available for mobile phones and tablets right now? This link cleared quite a bit up for me; it does a fairly in-depth comparison of both GPU and CPU performance between the Qualcomm S4, Tegra 3, OMAP 4470, and Exynos 4212. And I don't want the "well, this is better because it has more jiggahertz." Shut up; that's not what I need. I need something more in-depth. If studies comparing individual GPUs can be provided, please drop a link. I'd like to know these things very well.
2: What individual GPU is currently the best? I realize the iPad 3 came out with a graphics chip that's supposedly superior to the Xbox's/PS3's. However, I take anything Apple says with a grain of salt; they're notorious for shooting flaming BS out of their rear. Based on the little bit of searching I've done, the Adreno GPUs seem to be ahead of their time. I previously thought the Mali-400 GPU in the Exynos chipset was one of the best, but apparently it's outdated. Again, links to tests/studies/comparisons would be appreciated.
3: What's the deal with the ARM chips? Are the A5s, A6s, A11s (and whatever other A-chips are out there) some standard CPU developed by ARM and licensed out to all manufacturers to use in their chipsets?
4: What alternatives are there to ARM CPUs? Most chipsets I research seem to be using a Cortex-A9.
5: What's the difference between the A5, A6, A9, etc.? From what I've seen, the higher numbers are the newer models, but that feels like a very shallow definition. If that's true, why does the newest iPad use only an A5X chip for its quad-core rather than an A9 or something of the sort?
6: Is the chipset in the iPad really the fastest out there? Personally, I can't stand Apple products, let alone the rabid fanboys and the obnoxious advertisements they put out. I can recognize that they very often gloat about their products and over-exaggerate, like how they said the dual-core in the iPhone 4S was the fastest out there, yet from what I've read the A5 is the worst-performing dual-core out there. Is the GPU in the tablet really superior to the Xbox's? And is the processor really able to outdo the Tegra 3?
If you're able to answer any one of these, even on its own, that would be appreciated. I just like knowledge.
MultiLockOn said:
I've been doing quite a bit of research on GPUs and CPUs in phones/tablets lately, and I have a few unanswered questions that I can't seem to find answers for.
1: What's the best chipset available for mobile phones and tablets right now? This link cleared quite a bit up for me; it does a fairly in-depth comparison of both GPU and CPU performance between the Qualcomm S4, Tegra 3, OMAP 4470, and Exynos 4212. And I don't want the "well, this is better because it has more jiggahertz." Shut up; that's not what I need. I need something more in-depth. If studies comparing individual GPUs can be provided, please drop a link. I'd like to know these things very well.
2: What individual GPU is currently the best? I realize the iPad 3 came out with a graphics chip that's supposedly superior to the Xbox's/PS3's. However, I take anything Apple says with a grain of salt; they're notorious for shooting flaming BS out of their rear. Based on the little bit of searching I've done, the Adreno GPUs seem to be ahead of their time. I previously thought the Mali-400 GPU in the Exynos chipset was one of the best, but apparently it's outdated. Again, links to tests/studies/comparisons would be appreciated.
3: What's the deal with the ARM chips? Are the A5s, A6s, A11s (and whatever other A-chips are out there) some standard CPU developed by ARM and licensed out to all manufacturers to use in their chipsets?
4: What alternatives are there to ARM CPUs? Most chipsets I research seem to be using a Cortex-A9.
5: What's the difference between the A5, A6, A9, etc.? From what I've seen, the higher numbers are the newer models, but that feels like a very shallow definition. If that's true, why does the newest iPad use only an A5X chip for its quad-core rather than an A9 or something of the sort?
6: Is the chipset in the iPad really the fastest out there? Personally, I can't stand Apple products, let alone the rabid fanboys and the obnoxious advertisements they put out. I can recognize that they very often gloat about their products and over-exaggerate, like how they said the dual-core in the iPhone 4S was the fastest out there, yet from what I've read the A5 is the worst-performing dual-core out there. Is the GPU in the tablet really superior to the Xbox's? And is the processor really able to outdo the Tegra 3?
If you're able to answer any one of these, even on its own, that would be appreciated. I just like knowledge.
1. Dunno right now; it's always changing. I hear the new Qualcomm processors with the new Adreno GPU are supposed to be the ****, but they're not out yet, so who knows. The iPad 3 hasn't had any real-world tests done yet; we need to wait for release. It's basically the same A5 chip as the iPad 2, but with the PS Vita's GPU thrown in.
2. *sigh* The iPad 3 is not more powerful than an Xbox 360. It's better in, I believe, one aspect (more memory), but that has very little impact on performance/graphics quality. This is Apple shooting wads of **** out its arse, or whoever made the claim. It's actually using the same GPU found in the PS Vita, which we all know is not as powerful as a PS3/Xbox 360. However, the PS Vita also uses a quad-core CPU, whereas the iPad 3 uses the same dual-core A5 as the iPad 2, so technically the PS Vita is superior. You also have to consider how many more pixels the GPU has to drive on the iPad 3's display; while high res is nice, it takes more power to render (quick pixel math at the end of this post).
3. ARM creates a base chip for companies to slap their own GPUs and names on. The naming structure is pretty self-explanatory.
4. All CPUs currently in tablets/cellphones are a variant of ARM; a Cortex-A9 is still an ARM chip. This will soon change when Intel releases its tablet/phone chips.
5. You're right, higher numbers do mean newer models. I don't know all the exact details, but with the newer ARM series you get higher and/or more efficient clocks, generally some battery savings, and in some series support for more cores. Apple's labeling of its chips has nothing to do with ARM's; it's their own naming scheme. The A5X is just what Apple calls its version of the ARM processor.
6. I believe atm the iPad 3 has the fastest chipset in a tablet... for now. It won't take long for it to be overtaken by other companies; there's so much in the works right now.
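On the "more pixels" point in answer 2, the arithmetic alone makes the case (the panel resolutions are the published specs; the snippet just multiplies them out):

```python
# Quick pixel math behind the "more pixels to render" point.
ipad2 = 1024 * 768   #   786,432 px
ipad3 = 2048 * 1536  # 3,145,728 px
print(f"iPad 3 pushes {ipad3 / ipad2:.1f}x the pixels of the iPad 2")  # 4.0x
```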
speedyink said:
1. Dunno right now; it's always changing. I hear the new Qualcomm processors with the new Adreno GPU are supposed to be the ****, but they're not out yet, so who knows. The iPad 3 hasn't had any real-world tests done yet; we need to wait for release. It's basically the same A5 chip as the iPad 2, but with the PS Vita's GPU thrown in.
2. *sigh* The iPad 3 GPU is not more powerful than an Xbox 360. It's better in, I believe, one aspect (more memory), but that has very little impact on performance/graphics quality. This is Apple shooting wads of **** out its arse, or whoever made the claim. It's actually using the same GPU found in the PS Vita, which we all know is not as powerful as a PS3/Xbox 360. However, the PS Vita also uses a quad-core CPU, whereas the iPad 3 uses the same dual-core A5 as the iPad 2, so technically the PS Vita is superior.
3. ARM creates a base chip for companies to slap their own GPUs and names on. The naming structure is pretty self-explanatory.
4. All CPUs currently in tablets/cellphones are a variant of ARM; a Cortex-A9 is still an ARM chip. This will soon change when Intel releases its tablet/phone chips.
5. You're right, higher numbers do mean newer models. I don't know all the exact details, but with the newer ARM series you get higher and/or more efficient clocks, generally some battery savings, and in some series support for more cores. Apple's labeling of its chips has nothing to do with ARM's; it's their own naming scheme. The A5X is just what Apple calls its version of the ARM processor.
6. I believe atm the iPad 3 has the fastest chipset in a tablet... for now. It won't take long for it to be overtaken by other companies; there's so much in the works right now.
Thanks for the reply. It seems weird to me that Apple would name a CPU something so similar to one that already exists, the A5X as opposed to the A5.
MultiLockOn said:
Thanks for the reply. It seems weird to me that Apple would name a CPU something so similar to one that already exists, the A5X as opposed to the A5.
Because Apple is the type of company to step on someone's feet like that, and then sue them later for copyright infringement. Damn the confusion: Apple starts with A, so their processors will too.
speedyink said:
Because Apple is the type of company to step on someone's feet like that, and then sue them later for copyright infringement. Damn the confusion: Apple starts with A, so their processors will too.
Yeah, Apple just buys technology, re-labels it, patents it, and trolls others, so for comparison Apple doesn't count. Also, these handheld chipsets can't be compared with consoles; consoles have more processing power, like more RAM bandwidth and polygon throughput.
Anyway, based on my experience, the Mali-400 Exynos has buttery smooth performance for both UI and 3D graphics. I've tried both the Gingerbread Galaxy Note and my SGS2.
On the other hand, Google did a great job with the TI OMAP in its Galaxy Nexus: purely hardware-accelerated 4.0.3, with very few glitches, and I believe those are software issues.
IMO, if you want to buy a fast and smooth device, follow the current Nexus spec (or at least something similar), like the Galaxy Nexus, Motorola RAZR, etc. I've seen the Tegra 3 4+1 Transformer Prime but never gone hands-on with it; from what I've seen, UI and 3D performance are stunning. The advantage of the one extra core is a low-power mode for light processing and standby. Today's hardware is fast enough; drivers and OS optimization are the really important things if you want everything to run smoothly.
CMIIW, and sorry for my bad English.
lesp4ul said:
Yeah, Apple just buys technology, re-labels it, patents it, and trolls others, so for comparison Apple doesn't count. Also, these handheld chipsets can't be compared with consoles; consoles have more processing power, like more RAM bandwidth and polygon throughput.
Anyway, based on my experience, the Mali-400 Exynos has buttery smooth performance for both UI and 3D graphics. I've tried both the Gingerbread Galaxy Note and my SGS2.
On the other hand, Google did a great job with the TI OMAP in its Galaxy Nexus: purely hardware-accelerated 4.0.3, with very few glitches, and I believe those are software issues.
IMO, if you want to buy a fast and smooth device, follow the current Nexus spec (or at least something similar), like the Galaxy Nexus, Motorola RAZR, etc. I've seen the Tegra 3 4+1 Transformer Prime but never gone hands-on with it; from what I've seen, UI and 3D performance are stunning. The advantage of the one extra core is a low-power mode for light processing and standby. Today's hardware is fast enough; drivers and OS optimization are the really important things if you want everything to run smoothly.
CMIIW, and sorry for my bad English.
I know what you mean. I'm extremely happy with my Galaxy S2; I can't recall it ever lagging on me in any way whatsoever. I'm not sure what makes the Droid RAZR and Galaxy Nexus comparable to the S2. From what I've read, OMAP processors tend to lag and consume battery, and the Mali-400 is better than what either of those phones has. I'd say it's ICS, but the RAZR still runs Gingerbread.
I was hoping for some more attention in here :/
I agree, OMAPs are battery-hungry beasts. Like my previous Optimus Black, man... I only got 12-14 hours on EDGE (1 GHz undervolted with SmartassV2, and LG's ****ty kernel, haha). My friend's Galaxy SL had the same issue. I dunno if newer SoCs behave better.
Sent from my Nokia 6510 using PaperPlane™
About the Exynos Octa and the future of Exynos and Exynos-based mobile devices.
Over the past few months there has been a lot of talk about the system-on-a-chip (SoC) choices Samsung's Mobile division has made for its recent and upcoming products; new information has now shed more light on how these choices came to be, the reasoning behind them, and what the repercussions are for both the company and its users.
Back in January, during the official CES 2013 announcement (and as early as several weeks before, when the internet rumour mill and Korean analysts predicted that the Galaxy S4 would come with the company's own Exynos Octa SoC), most people were not aware of what was going on behind the scenes at Samsung.
Samsung's System LSI Business, a business unit of the company's Semiconductor division, has had several SoC projects in its pipeline. We know for certain of the 5410 (Octa), and of the 5450 and 5440, both quad-core A15 chips that seem never to have been released. Another rumoured project was the 5210, supposedly a big.LITTLE chip in a 2+2 configuration.
Something happened last winter that panicked the mobile division into changing SoC provider for many variants of the S4. Mounting evidence of this can be found in the overlapping region-specific variants in the official source code of both the Qualcomm and Exynos platforms. JF-based variants, built on the Snapdragon chips, overlapped JA-based variants, built on the Exynos platform. Korean (jf_ktt, jf_lgt, jf_skt <> jaltektt, jaltelgt, jalteskt), European (jf_eur <> jalte), and Japanese (jf_dcm <> jaltedcm) variants were developed for both platforms. The Korean variants ended up being the only ones to actually hit the market on the Exynos platform, apart from the global 3G version, for which there is no evident Qualcomm counterpart.
Chaotic development seems to have been the norm for the whole phone: many parts changed supplier between prototype revisions, such as the Atmel touchscreen controller that gave way to a Synaptics counterpart, a MagnaChip AMOLED controller that is missing in action, and a Philips LED controller that was shelved for a Texas Instruments IC. All of these look like last-minute changes, simply because their drivers and firmware are still delivered in the shipping product even though they're not used. A well-planned product is certainly not something you would call the S4.
One of the early reported reasons for the processor change was the unexpected power consumption of the Exynos. While this remains partly true and undoubtedly had an impact, the other reasons are far more sinister.
As a reminder, the Exynos 5410, as a big.LITTLE chip by design, is supposed to support several kinds of operating modes, something that is defined and limited mostly by software:
- Cluster migration: only one of the two quad-core clusters is active at any one time.
- Core migration: both clusters can work in tandem with up to 4 physical CPUs online, but any mix of A7 and A15 cores can be achieved.
- Heterogeneous multiprocessing (HMP): all 8 cores are online at once.
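To make the first of those modes concrete, here is a hypothetical miniature of a cluster-migration policy in Python; the thresholds and interface are invented for illustration (real policies live in the kernel's cpufreq/scheduler code), but the shape of the logic is the same: the whole workload hops between clusters, only one cluster is ever powered, and no inter-cluster coherency is needed:

```python
# Hypothetical miniature of a cluster-migration policy (thresholds
# invented). The workload hops between the A7 and A15 cluster based
# on load, so only one cluster is active at a time and no cache-
# coherent interconnect between the clusters is required.
UP_THRESHOLD = 0.80    # jump to the A15s when load exceeds this
DOWN_THRESHOLD = 0.30  # fall back to the A7s when load drops below this

def pick_cluster(current, load):
    if current == "A7" and load > UP_THRESHOLD:
        return "A15"  # demand spike: the little cores can't keep up
    if current == "A15" and load < DOWN_THRESHOLD:
        return "A7"   # load eased: save power on the little cores
    return current    # hysteresis band: avoid ping-ponging

cluster = "A7"
for load in (0.10, 0.50, 0.90, 0.60, 0.20):
    cluster = pick_cluster(cluster, load)
    print(f"load {load:.0%} -> {cluster} cluster")
```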
The problem is that to achieve either of the latter two operating modes, a specific piece of hardware is needed to make them efficient and useful: the Cache Coherent Interconnect (CCI). As per ARM's own claims: "Hardware coherency with CoreLink CCI-400 is a fundamental part of ARM big.LITTLE processing."
While it has been obvious for several months that the person behind the SamsungExynos Twitter account is nothing but a clueless PR representative, the above claim is nothing short of a lie.
We have information from several sources that the Exynos's CCI is crippled in silicon: it is not functional, or even powered on, in the shipping product (i9500). In fact, this was such an issue that the chip was almost cancelled; it was reportedly salvaged only by running the cluster-migration policy and bypassing the CCI entirely. This also calls into question the validity of ARM's own videos demonstrating the Octa.
Internally at SLSI, as many as three projects were cancelled late last year. We don't know the reasons for their cancellation; however, the issues are said to be related, and unacceptable power consumption also plays a big role.
One can argue that ARM's Cortex-A15 is partly to blame here: the architecture is inherently too power-hungry to be implemented in a smartphone. big.LITTLE provides major breathing room, but only in scenarios where continuous load is not an issue; HD gaming is a major Achilles heel where power consumption can run rampant. Nvidia is having it much worse with its Tegra 4: with only a single tablet design win besides its own Shield gaming console, it's a chip that needs to be, and will be, quickly forgotten.
Plagued by delays, hardware bugs, and high power consumption, one could view the Exynos 5410 as nothing short of a failure. In fact, Samsung's Mobile division was so dismayed at the whole situation that its next major products will completely forego the company's own Exynos chips and go straight to Qualcomm's offerings.
Reports that the Note 3 will come with the S800 match this information, and are probably correct.
With confirmed designs such as the Galaxy Tab 3 coming with an Intel processor, and the rest of the new Galaxy line-up shipping with various variants of Qualcomm's Snapdragon S-series, the mobile division should be lauded for providing users with the best possible experience, even if that involves skipping the Exynos. They have proven they have no qualms about using a wide array of third-party suppliers (ST-Ericsson, Intel, Broadcom, Qualcomm) to base their products on, and this strategy has proven successful.
As for SLSI, things look very bleak for Samsung's in-house processors. The business is failing to deliver, not only in terms of support, like providing proper hardware documentation and source code to the public, but also in terms of hardware: the current line-up is in shambles.
Lackluster graphics performance and outdated GPUs have become something of a habit for the company. This is reportedly due to an unwillingness to spend money on IP licenses from third-party companies; the use of Mali GPUs in their SoCs comes from a free licensing agreement they receive from ARM as a lead partner. The surprise use of the SGX 544MP3 in the Exynos 5410 is due to panic caused by Mali's own T6xx GPUs, again an issue of extremely excessive power consumption. The first-generation Midgard line-up was quickly scrapped, leaving the Exynos 5250 and its T604 as something of an orphan. Products like the T658 never saw the light of day and are no longer even mentioned on ARM's website.
Meanwhile, while its shipping products are failing to compete properly, Samsung is spending a lot of money on developing its own GPU IP from scratch. Not much information is available on when we will see it in actual products, but it will eventually come, if not cancelled or delayed due to its unorthodox, FPGA-like re-programmable design, which might be hit or miss. Imagination's Rogue architecture and years of experience as a leading GPU IP provider will be tough competition.
CPU-wise, things look just as bleak. Qualcomm currently dominates the performance-per-watt scale at the high end with its newest Krait architectures. With no custom design in the works, as done by Apple or Qualcomm, no A57 or A53 architectural refreshes from ARM, and no new 20 nm manufacturing process coming until 2014, the Exynos A15 line-up looks incapable of competing in the near future.
(Disclaimer: SamMobile)
The most shocking thing about this article is that it comes from one of the most reliable sites for anything and everything related to Samsung's mobile division!
So you did not write this yourself (from the sound of the last line you wrote), but you don't provide a link to the original source or credit beyond a veiled reference to SamMobile?
Let me guess: so the Exynos Octa is a failed product? :laugh:
windozeanti said:
Let me guess: so the Exynos Octa is a failed product? :laugh:
Well, it's not exactly a failed product; as the article and research say, the final product didn't get as much homework as it deserved.
I'm pretty sure Samsung will fix this in the Note 3, and I highly doubt Samsung will introduce the Note 3 without an Exynos processor. It will have the Snapdragon 800 (I wish), but there will also be an Exynos version. Samsung would not abandon its own chip just like that!
windozeanti said:
Let me guess: so the Exynos Octa is a failed product? :laugh:
It powers the fastest mobile phone ever; it just never reached its full potential, you know: topping out around 2.3 GHz, proper core migration for efficiency and total smoothness, a stronger GPU. Then again, it's still the industry leader, but not without issues, so I guess Sammy will work on it, possibly bringing it over to 20 nm.
OMG!! After reading this I want to sell my i9500.
This device is 50-50... 50% updates, 50% legacy!
If this ends up being considered a failed project I will break it in two!!!! rawr ~~~ I'm so damn dead!
gdonanthony said:
OMG!! After reading this I want to sell my i9500.
This device is 50-50... 50% updates, 50% legacy!
If this ends up being considered a failed project I will break it in two!!!! rawr ~~~ I'm so damn dead!
Come on guys, don't be like this! You should be really proud of your S4! Samsung's blunder aside, why not take pride in your S4? Even being a bit less than it should have been, it is still the fastest smartphone on the market right now, and it will remain so until the Note 3 comes out!
After reading what you've posted, I feel my S4 i9500 sucks big time.
Wow,
I'm happy I got the I9505.
This post is in compliance with the national potato safety regulation.
[Galaxy S 4 LTE]
But still, I think andrie will not give up on this big.LITTLE architecture; I hope so XD
Why is this thread 60% whining and only 40% problem solving? I'll quote what I said in another thread.
Tears for Fears said:
Why are you guys whining? The i9500 is *THE* fastest smartphone in the business. This flaw only means less battery life, and the battery is still very sufficient: GSMArena rated it at 65 h. To put this in perspective, the Sony Xperia Z and HTC One got 48 h, the Galaxy S III got 50 h, the iPhone 5 got 51 h, the LG Optimus G Pro got 50 h, and the Nokia Lumia 920 got 44 h. Why are you whining when you get 65 h? That's A LOT more than the flagships of other major manufacturers! Yes, if big.LITTLE were implemented correctly you'd maybe get a 70-75 h rating, but in my opinion 65 h is HUGE compared to others' 45-50 h.
Really, stop complaining and enjoy having the most powerful smartphone on Earth with amazing battery life!
Now can we please stop whining? It does this thread no good.
Tears for Fears said:
Why is this thread 60% whining and only 40% problem solving? I'll quote what I said in another thread.
Now can we please stop whining? It does this thread no good.
I see your intentions.
But would you let these companies screw you out of your hard-earned money?
At least let corporations work hard at providing us a product worth every cent.
Samsung did a terrible job rushing this device. They sacrificed a huge margin of time trying to implement protocols they knew from the start might not be feasible for the entire market.
So now they pass their failures on to us by selling a partially finished product in hugely non-uniform variants under the same device name, making all these crazy impulsive changes, running away with our money, and then denying most of the issues that are clearly present.
They don't even document their own chip very well, either.
So we just got screwed by Samsung.
Better to rickroll ourselves, at most.
This post is in compliance with the national potato safety regulation.
[Galaxy S 4 LTE]
aami.aami said:
Something happened last winter that panicked the mobile division into changing SoC provider for many variants of the S4. Mounting evidence of this can be found in the overlapping region-specific variants in the official source code of both the Qualcomm and Exynos platforms.
Or, as is typical in the development stage of many products, the software coding got ahead of the hardware decisions, so they coded for both eventualities. Since none of us work for Samsung, any assumption of "panic" is a wild-ass guess. Neither the Octa nor the S-600 comes with a built-in baseband; the S-600 already has the necessary interfaces and drivers, while they would have had to be written for the Octa. Maybe that work couldn't be done in time, considering all the different LTE markets that would need to be coded for. That, or it was just simpler and easier to use the S-600 in LTE markets. They used Octa/LTE in Korea, their hometown market, and I highly doubt they would have done that if it weren't considered their "premium" offering. Last year for the SGS3, shipments were 1/3 LTE and 2/3 HSPA; this year it's the reverse. So favoring the S-600 over the Octa could be a purely market-driven decision based on the wider availability of LTE this year. I agree something could have gone wrong that caused a last-minute shift to the S-600 and away from the Octa, but some stray code found in the kernel isn't a smoking gun.
Chaotic development seems to have been the norm for the whole phone: many parts changed supplier between prototype revisions, such as the Atmel touchscreen controller that gave way to a Synaptics counterpart, a MagnaChip AMOLED controller that is missing in action, and a Philips LED controller that was shelved for a Texas Instruments IC. All of these look like last-minute changes, simply because their drivers and firmware are still delivered in the shipping product even though they're not used. A well-planned product is certainly not something you would call the S4.
Has anyone here run a large-scale product development group? Or worked on a hardware or software project with the scope of launching a major electronic device? It's always chaos, especially the last mile. The SGS4 could have been more chaotic than other devices Samsung has released, but since none of us were on the project teams, we're just guessing that last-minute decisions were made out of crisis and error.
The problem is that to achieve either of the latter two operating modes, a specific piece of hardware is needed to make them efficient and useful: the Cache Coherent Interconnect (CCI). As per ARM's own claims: "Hardware coherency with CoreLink CCI-400 is a fundamental part of ARM big.LITTLE processing."
We have information from several sources that the Exynos's CCI is crippled in silicon: it is not functional, or even powered on, in the shipping product (i9500). In fact, this was such an issue that the chip was almost cancelled; it was reportedly salvaged only by running the cluster-migration policy and bypassing the CCI entirely. This also calls into question the validity of ARM's own videos demonstrating the Octa.
This is the way big.LITTLE is supposed to work. I'd like to understand what's missing and what replaced it.
As a reminder, the Exynos 5410, as a big.LITTLE chip by design, is supposed to support several kinds of operating modes, something that is defined and limited mostly by software:
- Cluster migration: only one of the two quad-core clusters is active at any one time.
- Core migration: both clusters can work in tandem with up to 4 physical CPUs online, but any mix of A7 and A15 cores can be achieved.
- Heterogeneous multiprocessing (HMP): all 8 cores are online at once.
There are only two operating modes. Why is Samsung choosing to implement one over the other "screwing" anyone? The dev community not having a toy to play with doesn't mean the Octa is a commercial failure, because 99.9% of i9500 owners don't give a crap that it could be operating in two modes but is instead operating in only one.
Plagued by delays, hardware bugs, and high power consumption, one could view the Exynos 5410 as nothing short of a failure. In fact, Samsung's Mobile division was so dismayed at the whole situation that its next major products will completely forego the company's own Exynos chips and go straight to Qualcomm's offerings. Reports that the Note 3 will come with the S800 match this information, and are probably correct. With confirmed designs such as the Galaxy Tab 3 coming with an Intel processor, and the rest of the new Galaxy line-up shipping with various variants of Qualcomm's Snapdragon S-series, the mobile division should be lauded for providing users with the best possible experience, even if that involves skipping the Exynos. They have proven they have no qualms about using a wide array of third-party suppliers (ST-Ericsson, Intel, Broadcom, Qualcomm) to base their products on, and this strategy has proven successful.
Does anyone here know what the production goals are for the Octa? And over what time frame? Without knowing that, how can 1/3 of SGS4s shipping with Octa be considered a failure? Qualcomm licenses ARM technology as the basis for all its designs, so the chip after the S-800 will be big.LITTLE as well; ARM spent two years and several billion dollars developing it. The Exynos 5410 is the first and only commercial implementation of big.LITTLE. I highly doubt it's perfect, as v1 of anything rarely is. But all this speculation about failure and the use of words like "chaos" is based on a very narrow and short-sighted view of what is basically a long game.
Here are the white papers on big.LITTLE: three from ARM and one from Samsung. The Samsung paper was authored by their Principal Engineer, who has a PhD. When someone here with a PhD, who is paid $500K+ a year and responsible for the software running on hundreds of millions of devices, writes a similar paper with as much fact and detail showing why big.LITTLE is a failure, I'll be all ears. Until then, I'm hoping the Note 3 has the Octa, because (at least to me) the long-term benefit of a big.LITTLE device outweighs the short-term gains of going with Qualcomm's interpretation of a five-year-old ARM design. For a site that's supposed to be technical, you guys sure have a sour view of new and emerging technologies.
http://www.arm.com/files/downloads/big_LITTLE_Final_Final.pdf
http://www.arm.com/files/pdf/Advances_in_big.LITTLE_Technology_for_Power_and_Energy_Savings.pdf
http://www.arm.com/files/downloads/Software_Techniques_for_ARM_big.LITTLE_Systems.pdf
http://www.arm.com/files/downloads/Benefits_of_the_big.LITTLE_architecture.pdf
How would we feel, barry, with our 9500s without you, barry?
Seriously, as a consumer, all I want is to know that I have the best and fastest smartphone when I trade my hard-earned cash for it. As long as I know I can run my desired apps faster than any other device out there can, I know I've gotten my money's worth. I couldn't care less if Samsung hasn't done enough homework to utilize big.LITTLE technology to its fullest.
So please, let's stop the whining and complaining about the 9500, because it's not helping those who have bought it. All of this feels like bitterness to me. No pun intended. Just my thoughts. Thanks!
I knew something small would be wrong with the Octa, since it's the first release. I think all of these issues will surely be corrected by Samsung. I'll stick with my GT-I9100G (a beast of a smartphone, BTW, that still puts up a fight these days) until a new octa-core with those issues fixed comes out, maybe the Note 3.
Sent from my Galaxy S2 using xda app-developers app
So I need to sell my Octa one?
Sent from my GT-I9500 using Tapatalk 4 Beta
What's the big deal, guys? So big.LITTLE isn't working as expected; we're still using the A15 processor, which is a beast. The only benefit of big.LITTLE is enhanced battery life, and even that is on par with the other flagship devices.
crzr said:
I think all of these issues will surely be corrected by Samsung.
One of the points the OP made that is spot on is how competitive the mobile SoC business is. Unless Samsung is planning to exit the business, which nothing points to, they'll have to keep innovating and providing cost-effective, powerful, and energy-efficient solutions to power not only future Samsung devices but other manufacturers' devices as well. Designing, engineering, and producing an SoC costs several hundred million dollars. Committing to big.LITTLE wasn't a "throw it against the wall and see if it sticks" decision.
Samsung is supposed to sell 80MM SGS4s this year, which means 24MM (1/3) of them will be Octa. HTC is projecting 20MM Ones. That means more Octa devices will be produced than all of HTC's flagship S-600 devices, so I'd hardly call it a failure. And it's great that Samsung is experimenting with Intel: the 10.1" Galaxy Tab 3 that's supposed to have an Intel chip will probably sell 10MM units, if it's the only SoC used, which some speculate will only be the case for certain markets.
Here are some excerpts from Samsung's Q1 earnings release discussing their semiconductor business. It certainly doesn't sound like the division is in trouble or that Samsung is walking away from it any time soon.
As for this year's capital expenditure, Samsung Electronics executed a combined total of 3.9 trillion won for the quarter. The Semiconductor and Display Panel segments were each accountable for 1.5 trillion won ($1.3B USD) in capex spending. Samsung is poised to increase investment beginning from the second half of the fiscal year to preempt rising demand for differentiated products and to harness its competitiveness in the high-tech industry.
Samsung's Semiconductor businesses - including Memory and System LSI - posted consolidated 8.58 trillion won in revenue ($2.8B USD), an 11-percent drop from a quarter earlier. The Memory chip unit logged 5.12 trillion won in earnings but, compared with the previous period, quarter-on-quarter revenue retreated 4 percent. Profitability for the System LSI Business was hampered by seasonally slow demand in set products that use logic chips.

Considering the Octa is "v1", and that it out-benchmarks the S-600 and gets very close to the same battery life as shipped, I'd very much call it a success. The S-600 is nothing more than an evolution of previous Snapdragon chips. That Samsung/ARM's first use of big.LITTLE is so competitive against a years-old architecture that OEMs are comfortable implementing says a lot about what Samsung/ARM have done and what the upside potential is.
AnandTech talking about the S4: "At present, this is the same Krait CPU as what we've seen in MSM8960 in phones like the USA versions of the Galaxy S 3 and HTC One X. Later on, Krait v3 will emerge with higher IPC and shorter critical paths (and clocks up to 1.7 or 2 GHz) and a resulting 10-15% boost in performance. For now however we're looking at 1.5 GHz APQ8064 with a Krait v2 inside and Qualcomm's newest scalar GPU architecture with Adreno 320."

AnandTech talking about the S-600: "Also being announced today is the Snapdragon 600. This part integrates four Krait 300 cores running at up to 1.9GHz. Adreno 320 handles GPU duties, although with an increased clock speed. Compared to the current Snapdragon S4, the 600 is expected to improve performance by up to 40% if you combine IPC and frequency increases."
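As a sanity check on that "up to 40%" figure, the clock and IPC gains multiply out; the ~10% IPC number below is an assumption chosen to match the quote, not an AnandTech measurement:

```python
# Rough check of the quoted "up to 40%": clock gain times IPC gain.
clock_gain = 1.9 / 1.5  # ~1.27x from the frequency bump alone
ipc_gain = 1.10         # assumed Krait 300 IPC improvement (~10%)
print(f"combined: {clock_gain * ipc_gain:.2f}x")  # ~1.39x, i.e. about 40%
```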
I love this phone. (I wish they'd fix this M****R F*****G camera reboot issue.) But I bought it because of the lies they told me.
hoezay said:
I love this phone. (I wish they'd fix this M****R F*****G camera reboot issue.) But I bought it because of the lies they told me.
What lies? When did they say anything about how the CPU would work? Can you provide a link, please?