Isn't this cool? (You can make your phone a desktop) - Samsung Infuse 4G

https://www.youtube.com/watch?v=bB-icTl2J-c&feature=youtube_gdata_player Watch for yourself, it's cool. Maybe when we get ICS we can do this, or maybe we already can on Gingerbread. I can dream of having it on ICS, then switching to tablet mode. Gosh, that could be my home computer in college! Lol.

Seems cool for a few minutes. But the Infuse's 1.2 GHz CPU and 512 MB of RAM will never match up to an overclocked Core i5 with an SSD and 6 GB of RAM.
But all in all, it would work for simple document editing and entertainment.

elliot.newnham said:
Seems cool for a few minutes. But the Infuse's 1.2 GHz CPU and 512 MB of RAM will never match up to an overclocked Core i5 with an SSD and 6 GB of RAM.
But all in all, it would work for simple document editing and entertainment.
Oh yeah, no doubt, but people who don't need all that power could just use this. I would still keep my laptop, but with that setup I wouldn't use it as much. By then we'll all probably have better phones than the Infuse, with at least dual cores.

Related

Tegra 3 = Beast!

Wow, just reading this and watching the video got me really excited!
Quote: "...its benchmark puts Kal-El at a higher performance bracket than even Intel's Core 2 Duo full-on-PC processors."
Enjoy: http://pocketnow.com/android/nvidia-quad-core-kal-el-in-android-devices-this-summer
I guess my next phone will be somewhere on par with my [email protected]. Nah, not quite, but still impressive.
It's freakin' ridiculous, isn't it? I can't imagine how powerful Wayne, Logan, or even Stark will be.
By the way, those are the architectures coming after Kal-El as seen in the roadmap here
http://www.anandtech.com/show/4181/...-a9s-coming-to-smartphonestablets-this-year/1
Can't wait for my Q6600 to have a little brother as well.
dreadlord369 said:
It's freakin' ridiculous, isn't it? I can't imagine how powerful Wayne, Logan, or even Stark will be.
By the way, those are the architectures coming after Kal-El as seen in the roadmap here
http://www.anandtech.com/show/4181/...-a9s-coming-to-smartphonestablets-this-year/1
Wow, sick! I had a feeling the technology was gonna explode once dual cores started being implemented in phones, but this is just ridiculous. I wonder which C2D they're comparing it to, though. Can't wait to play some Crysis on my phone!!
omg it looks so cool!
It's a lie; ARM can't beat Intel's dual-core CPUs for the next three years.
It might be better than a dual-core Atom, though...
Sent from my LG-SU660 using XDA App
Uhh, three years is too long if they haven't already beaten some dual-core chips, at least that's what I think... especially since Kal-El, the OMAP 5 CPUs, and whatever Qualcomm has planned are gonna be freaking awesome!
OMG!!!! It's amazing.
Mobile phones are better than my first PC.
Well, since Nvidia is supposedly releasing quad cores in Q4 of this year, I'd say computers will probably die out eventually. Especially since smartphone sales beat computer sales this year... just a thought.
HTC HD2 w/ 2.3 : )
CTR01 said:
Well, since Nvidia is supposedly releasing quad cores in Q4 of this year, I'd say computers will probably die out eventually. Especially since smartphone sales beat computer sales this year... just a thought.
HTC HD2 w/ 2.3 : )
Funny you mention that; I was just at uni talking about networking (my major) and technology, and a classmate said the same thing. I'd say it could happen in maybe 20+ years.
I would like to see a Tegra 3 rendering a complex 3D scene or something like that, which would really show its performance.
Is this the Q6600 club or what? <3
Sent from my HTC Vision using Tapatalk
I have an Athlon X3 435 at 3.6 GHz. It can go up to 3.8 GHz as well, but that takes too much vcore.
Although they say these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it's available" situation for the app devs, negating any positive improvements in battery life?
icecold23 said:
Although they say these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it's available" situation for the app devs, negating any positive improvements in battery life?
Yeah, I don't think battery technology is on par with the processors, or evolving as fast. At this rate we'll have "stationary" tablets with the current battery technology.
icecold23 said:
Although they say these newer processors are supposed to be much more efficient, are these dual- and quad-core processors going to be a viable option with today's battery technology?
Or is it going to be more of a "use it if it's available" situation for the app devs, negating any positive improvements in battery life?
It just really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP; it's the vcore and the frequency the cores run at. But really, I can't see CPUs going any other way but multicore or multithreaded. It's more efficient for power and performance to have two cores running at 1 GHz each instead of one CPU at 2 GHz, which needs a higher TDP and vcore to stay stable. If the cores are a smaller die size, then it works out perfectly.
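A quick way to see why two slower cores can win on power is the usual dynamic-power approximation P ≈ C · V² · f. Here is a minimal sketch of that idea; the capacitance and voltage values are made-up numbers purely for illustration, not real chip specs.

```python
# Rough sketch of the point above using the dynamic-power
# approximation P ~ C * V^2 * f. Values are illustrative only.

def dynamic_power(c_eff, vcore, freq_ghz):
    """Approximate switching power in arbitrary units."""
    return c_eff * vcore**2 * freq_ghz

C_EFF = 1.0  # assumed effective switched capacitance per core (arbitrary units)

# One core at 2 GHz usually needs a higher vcore to stay stable...
single_core = dynamic_power(C_EFF, vcore=1.3, freq_ghz=2.0)

# ...while two cores at 1 GHz each can get by with a lower vcore.
dual_core = 2 * dynamic_power(C_EFF, vcore=1.0, freq_ghz=1.0)

print(f"1 x 2 GHz: {single_core:.2f}")  # ~3.38
print(f"2 x 1 GHz: {dual_core:.2f}")    # ~2.00 -> less total power for the same
                                        # nominal throughput, if the workload
                                        # actually keeps both cores busy
```

The voltage term being squared is why dropping vcore (which lower clocks allow) pays off so much.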
vbetts said:
It just really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP; it's the vcore and the frequency the cores run at. But really, I can't see CPUs going any other way but multicore or multithreaded. It's more efficient for power and performance to have two cores running at 1 GHz each instead of one CPU at 2 GHz, which needs a higher TDP and vcore to stay stable. If the cores are a smaller die size, then it works out perfectly.
Agreed... while I'm no CPU expert, I do know the basics, and I did a little reading that agrees with vbetts. They said the four-core Kal-El Nvidia CPU is supposed to get ~12 hours of playback for HD video... at least that's what someone posted on a thread of mine...
vbetts said:
It just really depends on how optimized these cores are for power. It's not adding cores that gives a higher TDP; it's the vcore and the frequency the cores run at. But really, I can't see CPUs going any other way but multicore or multithreaded. It's more efficient for power and performance to have two cores running at 1 GHz each instead of one CPU at 2 GHz, which needs a higher TDP and vcore to stay stable. If the cores are a smaller die size, then it works out perfectly.
Yeah, exactly. I personally thought the move to dual core would come sooner, with two 500 MHz cores or lower, since that would still be better than a single 1 GHz chip.
Battery is definitely an issue. With today's technology, I wonder how much these chips will consume at 100% load, or when playing a game that uses most of the device's grunt. On my DHD I can take it up to 1.9 GHz stable, and if I'm playing FPse at that frequency, the current widget shows consumption of around 425 mA, while at 1 GHz it's around 285 mA. That's quite a difference! So for these chips to be efficient, they shouldn't use much more battery than today's chips (see the rough runtime estimate after this post).
I love my Q6600. The max OC I could get was 4 GHz, but it required 1.6 V vcore, and on air that was HOT. Still, I made it into Windows and did some benching: 13.110 s on 1M SuperPI / 1.5 XS mod.
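For rough context on those current-draw figures, here is a back-of-the-envelope runtime sketch. The 1230 mAh capacity is an assumption (roughly a stock DHD pack), and it ignores the screen and radios, so treat the results as illustrative only.

```python
# Back-of-the-envelope runtime estimate from the current-draw figures
# quoted above. 1230 mAh is an assumed battery capacity; real runtime
# is lower because the screen, radios, etc. also draw current.

BATTERY_MAH = 1230  # assumed battery capacity

for label, draw_ma in [("1.0 GHz", 285), ("1.9 GHz", 425)]:
    hours = BATTERY_MAH / draw_ma
    print(f"{label}: ~{hours:.1f} h at a constant {draw_ma} mA draw")

# 1.0 GHz: ~4.3 h
# 1.9 GHz: ~2.9 h -> the overclock costs roughly a third of the runtime
```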
CTR01 said:
Agreed... while I'm no CPU expert, I do know the basics, and I did a little reading that agrees with vbetts. They said the four-core Kal-El Nvidia CPU is supposed to get ~12 hours of playback for HD video... at least that's what someone posted on a thread of mine...
Yeah I read that as well or heard it somewhere.
Yeah, exactly. I personally thought the move to dual core would come sooner, with two 500 MHz cores or lower, since that would still be better than a single 1 GHz chip.
If the app is multithread-capable, then yes, easily better. But two 500 MHz cores would probably be best for multitasking (see the Amdahl's-law sketch after this post).
Battery is definitely an issue. With today's technology, I wonder how much these chips will consume at 100% load, or when playing a game that uses most of the device's grunt. On my DHD I can take it up to 1.9 GHz stable, and if I'm playing FPse at that frequency, the current widget shows consumption of around 425 mA, while at 1 GHz it's around 285 mA. That's quite a difference! So for these chips to be efficient, they shouldn't use much more battery than today's chips.
Battery has always been an issue though; even on my old Moment the battery sucked. But that's what you get with these, I guess. But man, 1.9 GHz stable from 1 GHz! For a small platform, that's pretty damn impressive.
I love my Q6600. The max OC I could get was 4 GHz, but it required 1.6 V vcore, and on air that was HOT. Still, I made it into Windows and did some benching: 13.110 s on 1M SuperPI / 1.5 XS mod.
Ouch, how long did the chip last? I've got my 435 at 1.52 V vcore. I can go higher, but I need this chip to last me a year or so.
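Picking up the "two 500 MHz cores vs. one 1 GHz core" point above, here is a minimal Amdahl's-law sketch of why the answer depends on how multithread-capable the app is. The parallel fractions are illustrative assumptions, not measurements of any real app.

```python
# Amdahl's-law sketch: one 1 GHz core vs. two 500 MHz cores, for a single
# task whose parallelisable fraction varies.

def relative_speed(freq_ghz, cores, parallel_fraction):
    """Throughput relative to a 1 GHz single core, per Amdahl's law."""
    serial = 1.0 - parallel_fraction
    return freq_ghz / (serial + parallel_fraction / cores)

for p in (0.0, 0.5, 0.9):
    single = relative_speed(1.0, cores=1, parallel_fraction=p)  # 1 x 1 GHz
    dual = relative_speed(0.5, cores=2, parallel_fraction=p)    # 2 x 500 MHz
    print(f"parallel fraction {p:.1f}: 1x1GHz={single:.2f}  2x500MHz={dual:.2f}")

# Fully serial code (0.0): the dual setup is only half as fast (0.50 vs 1.00).
# Mostly parallel code (0.9): it nearly catches up (~0.91 vs 1.00) on one task,
# and it can also run two independent apps at once, which is the multitasking win.
```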
I can't believe they set the release time frame as early as they did. Hopefully they'll live up to it.

Nvidia quad-core CPU and 12-core GPU... WTF?!?!

Here it is, folks... this is what I read:
http://www.androidcentral.com/nvidi...ampaign=Feed:+androidcentral+(Android+Central)
Yeah, it's cool and good for all the innovations, but battery life? If decent battery life is to be had, I think the optimization and battery size are going to be some kind of amazing, lol.
CTR01 said:
Here it is, folks... this is what I read:
http://www.androidcentral.com/nvidi...ampaign=Feed:+androidcentral+(Android+Central)
Yeah, it's cool and good for all the innovations, but battery life? If decent battery life is to be had, I think the optimization and battery size are going to be some kind of amazing, lol.
They said 12 hours of HD video.
If it's going by their usual designs, then they're not actual cores like you would see in a CPU. Most likely it's some type of shader core like you see in their GPU setups; more or less a shader cluster from Fermi.
muyoso said:
They said 12 hours of HD video.
12 hours, huh? That's freaking awesome, lol!
vbetts said:
If it's going by their usual designs, then they're not actual cores like you would see in a CPU. Most likely it's some type of shader core like you see in their GPU setups; more or less a shader cluster from Fermi.
hmm well that does sound interesting indeed! we will see : )
Just to add, I'm not saying it will be a shader cluster from Fermi, just an example of what it could be. Either way, it's looking pretty great. ARM has come a long way, and I bet soon we'll see competition for x86-64 platforms.
vbetts said:
Just to add, I'm not saying it will be a shader cluster from Fermi, just an example of what it could be. Either way, it's looking pretty great. ARM has come a long way, and I bet soon we'll see competition for x86-64 platforms.
Yeah, that would be pretty cool... you could have some more in-depth or advanced phone-to-computer interaction or something, lol.
small related update...
http://phandroid.com/2011/02/16/nvidia-quad-core-tablets-in-august-phones-at-christmas/
benchmarks and possible release dates
lol?? fake?
Rumors, yes, but fake? Eh, I'd say plausible.
HTC HD2 w/ 2.3 : )
inubero said:
lol?? fake?
Those are benchmarks straight from Nvidia. That link even gives Nvidia as its source.
http://blogs.nvidia.com/2011/02/teg...-chip-worlds-first-quadcore-mobile-processor/
In all fairness to the benchmark, the T7200 (65 nm) is a five-year-old chip. So what you essentially have here are TWO extra cores, probably built on a much smaller ~30 nm process, to edge out a five-year-old dual-core desktop CPU.
So for a mobile chip to beat out something like a Q6600 (2.4 GHz quad-core, 45 nm), it would need eight cores (octa-core) packed tightly onto a 20 nm or so process to churn out an equivalent number. I don't think that's going to be available for another 5-7 years or so.
lude219 said:
In all fairness to the benchmark, the T7200 (65 nm) is a five-year-old chip. So what you essentially have here are TWO extra cores, probably built on a much smaller ~30 nm process, to edge out a five-year-old dual-core desktop CPU.
So for a mobile chip to beat out something like a Q6600 (2.4 GHz quad-core, 45 nm), it would need eight cores (octa-core) packed tightly onto a 20 nm or so process to churn out an equivalent number. I don't think that's going to be available for another 5-7 years or so.
Q6600 is 65 nm.
Not sure what the die size of the cores inside Tegra is, but it's definitely smaller and also uses a hell of a lot less power. If it were an x86 chip being compared to another x86 chip, then comparing it to a five-year-old CPU would be a very bad comparison. But it's ARM, and ARM has come a long way in a short time. Nvidia is great at pushing limits too; Fermi may have used a ton of power and put out a lot of heat, but it performs great, and a lot of the Fermi-based cards are smaller than the competing HD5 and HD6 cards. Nvidia Ion is also easily cooled and doesn't use much power or put out much heat.
How much horsepower does a mobile device really need? I voted for a combination of three, but I'd be inclined to lean towards battery life.
A2Aegis said:
How much horsepower does a mobile device really need? I voted for a combination of three, but I'd be inclined to lean towards battery life.
I'm sure you know, but after a certain point people buy things just because they can, and to brag about it, lol.
...Although if you have non-geeky/techy friends, then it really means nothing to them, so yeah, at some point there's just no use for it.
...Unless you get like a 3D hologram/projector thing going on... then we can talk four cores or more.
vbetts said:
Q6600 is 65 nm.
Not sure what the die size of the cores inside Tegra is, but it's definitely smaller and also uses a hell of a lot less power. If it were an x86 chip being compared to another x86 chip, then comparing it to a five-year-old CPU would be a very bad comparison. But it's ARM, and ARM has come a long way in a short time. Nvidia is great at pushing limits too; Fermi may have used a ton of power and put out a lot of heat, but it performs great, and a lot of the Fermi-based cards are smaller than the competing HD5 and HD6 cards. Nvidia Ion is also easily cooled and doesn't use much power or put out much heat.
Thank you for the correction! You'd think I would remember my own CPU's die, right?
My previous post didn't have any ill intention at all. There are a lot of people who think this will replace future desktops (as noted in the recent article on how smartphone/tablet sales beat out home office computers), so my somewhat horrible analogy was to paint a picture of these two different architectures and where they're at. I can still remember back in the day when people laughed at the idea of Nvidia branching out into making energy-efficient mobile processors. They won't be laughing now.
lude219 said:
Thank you for the correction! You'd think I would remember my own CPU's die, right?
My previous post didn't have any ill intention at all. There are a lot of people who think this will replace future desktops (as noted in the recent article on how smartphone/tablet sales beat out home office computers), so my somewhat horrible analogy was to paint a picture of these two different architectures and where they're at. I can still remember back in the day when people laughed at the idea of Nvidia branching out into making energy-efficient mobile processors. They won't be laughing now.
I honestly wouldn't be surprised if ARM replaced x86-64 in at least mobile solutions like laptops. Windows has ARM support coming soon. But the real question here is: can companies like Nvidia, or whoever makes the SoC, make the chip cheap enough for mass production and retail? AMD sells quad-core CPUs for less than $100, and Intel sells CPUs with a GPU on the same chip for under $150.
Hmm, that's true, and AMD is coming out with their integrated CPU/GPU stuff soon-ish. Since they usually sell cheaper than Intel, at that price point it would be kinda hard to beat, at least for a little while, I think.

How does 512 MB of dual-channel RAM compare to 1 GB of regular RAM?

I am wondering how 512 MB of dual-channel DDR2 memory (RAM) stacks up against 1 GB of regular DDR2 memory (RAM). Which is better performance-wise? This is aimed at the LG Thrill 4G (Optimus 3D).
Sent from my Samsung Galaxy S II using the xda premium app.
512 MB of dual-channel DDR2 is the better option.
More info please
Does it have a dual-core processor?
I am pretty sure that the new LG phone you're talking about has a dual-core processor as well, which allows it to process data more like a computer; basically, it can do two things at once instead of one.
The way I understand it (or try to), dual-channel memory works a lot like dual-core processors: it's not as simple as just doubling the speed (as if a 1 GHz dual core were simply a 2 GHz chip, or 512 MB of dual-channel memory were simply 1 GB).
It boils down to better efficiency in handling calls to the memory. Double the available roadway and traffic flows more smoothly: you get faster speed (although not quite double), and you do so using less battery.
I might be totally wrong, but I'm sure a Google search on the difference would yield tons of reading material.
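A rough sketch of the "double the roadway" idea: peak bandwidth scales with the number of channels, while capacity stays the same. The 32-bit bus width and 400 MT/s data rate below are illustrative LPDDR2-style assumptions, not the Thrill's actual memory specs.

```python
# Peak-bandwidth sketch for dual- vs. single-channel memory.
# Bandwidth scales with channel count; capacity does not.

def peak_bandwidth_mb_s(channels, bus_width_bits, data_rate_mt_s):
    """Theoretical peak bandwidth in MB/s: channels * bytes/transfer * transfers/s."""
    return channels * (bus_width_bits // 8) * data_rate_mt_s

single = peak_bandwidth_mb_s(channels=1, bus_width_bits=32, data_rate_mt_s=400)
dual = peak_bandwidth_mb_s(channels=2, bus_width_bits=32, data_rate_mt_s=400)

print(f"single channel: {single} MB/s")  # 1600 MB/s
print(f"dual channel:   {dual} MB/s")    # 3200 MB/s on paper; real-world gains
                                         # are smaller, and total capacity
                                         # (512 MB here) stays the same
```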
cromag.rickman said:
The way I understand it (or try to), dual-channel memory works a lot like dual-core processors: it's not as simple as just doubling the speed (as if a 1 GHz dual core were simply a 2 GHz chip, or 512 MB of dual-channel memory were simply 1 GB).
It boils down to better efficiency in handling calls to the memory. Double the available roadway and traffic flows more smoothly: you get faster speed (although not quite double), and you do so using less battery.
I might be totally wrong, but I'm sure a Google search on the difference would yield tons of reading material.
Yes, the RAM makes it more efficient at handling calls to the memory, and what calls the memory? The processor. A dual-core processor lets the device make two calls at once; it can process data faster and more efficiently because it can handle two inputs instead of one. Why do you think the core counts on processors keep going up? They're way higher on computers now. And the Optimus is the first phone to have a dual core, I believe, unless an Apple product does, but I don't have a clue about their phones.
So apparently dual-channel RAM is nothing but two identical banks of RAM. The advantage over single-channel RAM is that programs can access two pieces of data simultaneously, whereas with single-channel RAM a program can only access one piece, store it, and then read the next.
Obviously you still have the limitation of 512 MB total capacity, so you can only run so many apps simultaneously, but whatever you can run simultaneously should be able to access data roughly twice as fast.
I read somewhere that the dual-channel RAM isn't fully utilized in the Froyo build that shipped with the Thrill, but the Gingerbread update right around the corner will make better use of the dual-channel setup.
vinvam said:
So apparently dual-channel RAM is nothing but two identical banks of RAM. The advantage over single-channel RAM is that programs can access two pieces of data simultaneously, whereas with single-channel RAM a program can only access one piece, store it, and then read the next.
Obviously you still have the limitation of 512 MB total capacity, so you can only run so many apps simultaneously, but whatever you can run simultaneously should be able to access data roughly twice as fast.
I read somewhere that the dual-channel RAM isn't fully utilized in the Froyo build that shipped with the Thrill, but the Gingerbread update right around the corner will make better use of the dual-channel setup.
Ahaha, okay, thanks man, that cleared it up for me. Looking forward to our trade; make sure to get me your address by Friday!
Sent from my Samsung Galaxy S II using the xda premium app.

Question: CPU power

I posted this in the general forum but did not get an answer, so I'm posting here hoping for a reply. Sorry if this breaks forum rules; I don't think it does, but if it does, flame away and delete... Anyway, I am just curious: with the introduction of quad-core tablets, how do they match up to similarly specced CPUs in raw power? I understand that Android, iOS, and Windows (in the future) are mobile OSes, so directly comparing them to a laptop is of limited use. I did notice, however, that the new T33 clocked at 1.6 GHz is only 0.1 GHz slower than my laptop, which runs an AMD quad core at 1.7 GHz. So I'm just wondering: is it a direct comparison of processing power alone, or is the architecture so different between the tablet and the laptop that even at the same clock speed the laptop wins in the power category?
Totally different. Due to the ARM architecture, the CPU is a lot less powerful than comparably clocked CPUs using the x86 or x86_64 architecture.
jdeoxys said:
Totally different. Due to the ARM architecture, the CPU is a lot less powerful than comparably clocked CPUs using the x86 or x86_64 architecture.
Thank you for the reply. Do you have any idea of the scale? How fast would an ARM CPU have to be clocked to equal an x86 or x64 one?
fd4101 said:
Thank you for the reply. Do you have any idea of the scale? How fast would an ARM CPU have to be clocked to equal an x86 or x64 one?
There is literally no comparison. I have a crappy old AMD Athlon 64 X2 clocked at around 3 GHz with DDR2 RAM (lolwut, in 2012?). It gets 3x better SunSpider scores than my Infinity does. I don't know if that's the browser or what, but I think an ARM CPU would have to be clocked at least 5-6 times higher to get similar performance to an x86 CPU. For modern ones, maybe even up to 10-20x. Of course, I'm just talking out of my ass here; I don't really know the exact numbers.
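Turning that SunSpider observation into a very rough clock-for-clock figure: the clock speeds and the 3x ratio below are assumptions taken from the post above, and SunSpider is single-threaded and browser-dependent, so this is only a ballpark sketch.

```python
# Very rough per-clock comparison built from the figures quoted above:
# an Athlon 64 X2 at ~3.0 GHz scoring ~3x better (lower) on SunSpider
# than a Tegra 3 Infinity at ~1.6 GHz.

x86_ghz = 3.0      # assumed Athlon 64 X2 clock
arm_ghz = 1.6      # assumed Tegra 3 single-core clock
speed_ratio = 3.0  # "3x better sunspider scores"

per_clock_ratio = speed_ratio * arm_ghz / x86_ghz   # x86 work per clock vs ARM
arm_clock_needed = speed_ratio * arm_ghz            # GHz the ARM chip would need

print(f"x86 does ~{per_clock_ratio:.1f}x more work per clock in this test")
print(f"the ARM chip would need ~{arm_clock_needed:.1f} GHz to match that old CPU")
# -> roughly 1.6x per clock and ~4.8 GHz against this ageing desktop chip;
#    the gap against a modern i5/i7 would be considerably larger.
```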
Well, I guess my dreams of having a tablet that is truly as powerful as my laptop are still far off. But with the way tech is progressing, I'm sure we'll have it someday...
fd4101 said:
Well, I guess my dreams of having a tablet that is truly as powerful as my laptop are still far off. But with the way tech is progressing, I'm sure we'll have it someday...
Why? You didn't mean to play Diablo 3 on it, did you? Apps for tablets take this difference into account, so it is really a question of which apps suit your needs.
(BTW, my i3 laptop is only 4 times faster than Chrome on the Infinity running ICS, and it will probably be only 3 times faster when JB for the Infinity shows up; and we don't really need all the CPU power of an i3/i5/i7 for casual web browsing...)
fd4101 said:
Well, I guess my dreams of having a tablet that is truly as powerful as my laptop are still far off. But with the way tech is progressing, I'm sure we'll have it someday...
MS Surface. Core i5. Although it won't have quite the god-tier 16:10 resolution of the Infinity.
I got b& at /g/ for sh!tposting ;_;
The ARM Cortex-A9 (the same core as in Tegra 3) sits in between Intel's Atom and their desktop x86 CPUs.
A dual-core Cortex-A9 is considerably faster than an Intel Atom N270 in some operations. However, it is difficult to compare them properly, as few benchmarks are optimized for both ARM *and* x86.
The ARM architecture's primary focus is low power at low cost, so it will be slower than Intel's x86 by design.
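One crude but portable way to compare is to time the exact same script on both machines. This is only a sketch: it measures a single-threaded, interpreter-bound integer workload and exercises neither NEON nor SSE, so it says nothing definitive about either architecture.

```python
# A crude, portable micro-benchmark: run this identical script on the ARM
# tablet and on the x86 PC and compare the printed times. It is a ballpark
# indicator at best, not a proper cross-architecture benchmark.

import platform
import timeit

def workload():
    total = 0
    for i in range(200_000):
        total += (i * i) % 7
    return total

best = min(timeit.repeat(workload, number=10, repeat=5))
print(f"{platform.machine()}: best of 5 runs = {best:.3f} s")
```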
Although I realize that tablets have a long way to go before they are playing AAA games, I would like them to get to a point where they can run the same level of software (optimized for mobile, of course). Desktops will always be more powerful, but as it stands right now my laptop can play pretty much any game my desktop can, just on lower settings. I would like a tablet to replace that. The benefits, of course, are lower power requirements for battery life and better mobility. I thought that with quad-core tablets reaching clock speeds closer and closer to laptops we were getting close, but I did not know enough about x86 and x64 to realize the architecture made so much of a difference. I need to take a computer class.
I know the cloud can give the illusion of tablets having more power than they do, but the cloud has a long way to go until it can be fully realized; there are too many restrictions as it stands now. Even with tablets having a 4G connection, mobility is still limited by contracts, dead zones, and lag, and you end up paying multiple times to do what you want. Maybe in the future the cloud will make all this a wash and we'll all carry thin, low-power devices that only need to decode video and receive input, but I see that as a long way away.
fd4101 said:
Although I realize that tablets have a long way to go before they are playing AAA games, I would like them to get to a point where they can run the same level of software (optimized for mobile, of course). Desktops will always be more powerful, but as it stands right now my laptop can play pretty much any game my desktop can, just on lower settings. I would like a tablet to replace that. The benefits, of course, are lower power requirements for battery life and better mobility. I thought that with quad-core tablets reaching clock speeds closer and closer to laptops we were getting close, but I did not know enough about x86 and x64 to realize the architecture made so much of a difference. I need to take a computer class.
I know the cloud can give the illusion of tablets having more power than they do, but the cloud has a long way to go until it can be fully realized; there are too many restrictions as it stands now. Even with tablets having a 4G connection, mobility is still limited by contracts, dead zones, and lag, and you end up paying multiple times to do what you want. Maybe in the future the cloud will make all this a wash and we'll all carry thin, low-power devices that only need to decode video and receive input, but I see that as a long way away.
You better hope this never happens. The cloud = gigantic botnet. Google will take ALL your information and beam ideas directly to your head.
Lol, well, I'm not sure anything can stop it, but I'll start stocking up on tin foil. I'll make you a hat and ship it to you.
It's hard to say, but in terms of gaming we are seeing some quite interesting developments. For example, Max Payne and GTA 3 on a tablet are quite impressive if you think about what kind of PC you had to own when these games were released.
Sent from my Galaxy Nexus using xda premium
And there's Baldur's Gate for Android coming
d14b0ll0s said:
And there's Baldur's Gate for Android coming
That's been possible for years.
https://play.google.com/store/apps/details?id=net.sourceforge.gemrb
Sent from my Galaxy Nexus using xda premium
That is true; games have come a long way through optimization. Maybe games will just be better optimized and hardware won't be such a concern.
Nebucatnetzer said:
That's been possible for years.
https://play.google.com/store/apps/details?id=net.sourceforge.gemrb
Looks cool, but the comments say otherwise. I meant the version intended for Android tablets.
d14b0ll0s said:
Looks cool, but the comments say otherwise. I meant the version intended for Android tablets.
I tried it quite a while ago so I don't know if anything has changed.
Sent from my Galaxy Nexus using xda premium
Basically, they say it's slow even on some fast phones, gets FCs, and you can't go into some menus without doing some crazy tricks. But it's nice to see development like this. I'm still waiting for the official version, with some dread too, as it's pretty time-consuming...

Is it meaningful to have 2, 4, or 8 cores in a phone?

I still use a single-core phone, and I recently managed to buy a new one. Do you think it is meaningful to have 2, 4, or 8 cores in a phone?
doubleelec said:
I still use a single-core phone, and I recently managed to buy a new one. Do you think it is meaningful to have 2, 4, or 8 cores in a phone?
With the abundance of quad cores flooding the market nowadays, developers will surely catch on and optimize apps to take advantage of four cores. As for octa-cores, if you're seriously into multitasking or using your phone/tablet for number crunching, then eight cores would help.
...
I think today a quad core is a must, and even better if it comes with 2 GB of RAM.
Prices keep dropping, and if it fits the budget, all the better.
As for the octa-core, I think it's the same silly race as with camera megapixels: "Look! I have 30 MP!"
A strong quad core with 2 GB of RAM is enough, I think.
The bigger, the better.
I had a dual-core phone with half a gig of RAM until a few months ago, then I switched to a quad core and a gig of RAM. I really didn't notice much of a difference... until last week, when I switched back to my old phone (it has a beautiful AMOLED screen; I just love that) and realized it's quite slow and can't keep up with me.
So... there you go, the bigger the better.
The same thing happened in the PC industry.
Software, especially video games, will use every bit of your cores, no matter how many there are.
Xperia-Ray said:
As for the octa-core, I think it's the same silly race as with camera megapixels: "Look! I have 30 MP!"
A strong quad core with 2 GB of RAM is enough, I think.
totally agree with you! :good:
I think that a quad-core CPU is the best; it will work perfectly for years and years (that's my opinion).
But software optimization is even more important.
Sorry for my bad English.
