So, with the new Samsung Galaxy on its way (still waiting for some carriers to get a move on), there seems to me to be a real possibility of getting a PS2 emulator running quite well on the new specs:
1GHz Hummingbird (Cortex-A8)
PowerVR SGX540
"Samsung Galaxy S' "Hummingbird" A8 chip will be able to process around 90 million triangles per second. That is compared to the Moto Droid's 7 million tri/sec, the Nexus One's 22 million tri/sec, and the iPhone 3GS's 28 million tri/sec."
"In other words, the Samsung Galaxy S will have around 36% of the video processing power of a PS3. Hopefully it doesn't get as hot as a PS3."
With this in mind, I would think that it is quite possible to run a PS2 emulator on the new Samsung Galaxy S. Not to mention the rumored 1.5GHz dual-core Snapdragon coming to T-Mobile either this Christmas season or early next year.
One thing to remember is that although a PC with, say, a 3GHz dual core and 4GB of RAM runs a PS2 emulator like crap, the architecture of the PC's processor and graphics is different from that of consoles, which is why it takes so much to get smooth play out of it. Cell phones share a very similar structure (from my knowledge at least) with consoles. This tells me that newer Android phones should be quite capable of running a PS2 emu.
If you head over to the GLBenchmark website (.com) and look up the results database, you will see the Galaxy S at the top (minus a Compal NAZ-10, whatever that is), and if you compare the Galaxy S results with the Droid, Droid X, Droid 2, and iPhone 4, you will see that it demolishes every one of those phones by a huge margin. I am not sure of the PlayStation 2's specs, but I am more than sure the phone should be able to handle it!
PlayStation 2 specs can be found on Wikipedia (I won't copy and paste all that info).
To me it seems highly possible, and I would love to play my racing games on the phone (Tokyo Extreme Racer Drift 2, TXRD2).
Thoughts and opinions welcome, no bashing (I get this in other forums).
Even a 3GHz i7 isn't able to emulate a PS2 at full speed (depending on the emulated game; sure, there are many playable games. I know that because I'm interested in emulation and have tested many games, search YouTube for "frankyfife"). There is a lot of code for the emu to translate in order to produce native code for the platform it runs on. The PS2 has vector units, the Emotion Engine, the SPU, and the GS, which all need to be emulated. There's no way to do this on a 1GHz cellphone, even with similar specs or an identical main CPU architecture/function.
I really hate to be a nerf herder, but if a 1GHz Snapdragon Droid can play PlayStation 1 games, then the Galaxy S, with a 1GHz Hummingbird and a graphics chip that is way more powerful than the Droid's, should be able to handle it fine.
Take, for example, the facts that lead to my hypothesis:
Motorola Droid: TI OMAP3430 with PowerVR SGX530 = 7-14 million(?) triangles/sec
Samsung Galaxy S: S5PC110 with PowerVR SGX540 = 90 million triangles/sec
These results are based on SOME facts with SOME uncertainties, which lead to a hypothesis. If this is INDEED the case, the Galaxy S is ALMOST 7 times more powerful than the Droid (about 6.4x when 90 is divided by 14). And you're saying it can't handle it, without even trying? I've seen YouTube videos of phones playing PlayStation games smoothly, with little stuttering and medium-quality sound. Take into account the faster CPU and GPU in the Galaxy S and you use fewer resources to play the game, leaving more for sound processing, which in turn should make PS1 games run perfectly (theoretically) and possibly PS2, if not DECENT PS2.
EDIT: Not to mention the PS3 running at 250 million triangles/sec; that makes the Galaxy S around 36% of a PS3!
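Taking the rumored figures in this thread at face value (they're marketing peaks quoted in the posts above, not benchmark results), the arithmetic checks out like this:

```python
# Rumored/quoted peak triangle throughput, millions of triangles/sec.
# These are marketing figures from the posts above, not measurements.
tris_per_sec_m = {
    "Moto Droid (SGX530)": 14,   # upper end of the 7-14M estimate
    "Nexus One": 22,
    "iPhone 3GS": 28,
    "Galaxy S (SGX540)": 90,
    "PS3 (RSX)": 250,
}

galaxy = tris_per_sec_m["Galaxy S (SGX540)"]
print(f"vs Droid: {galaxy / tris_per_sec_m['Moto Droid (SGX530)']:.1f}x")  # 6.4x
print(f"vs PS3:   {galaxy / tris_per_sec_m['PS3 (RSX)']:.0%}")             # 36%
```

So the 6.4x and the 36% both follow from the quoted numbers; whether those paper peaks survive the overhead of emulation is the separate question the rest of the thread argues about.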
No, just no. It can't be done on cellphones, as @xdaywalkerx said. I have been able to play Guilty Gear and some visual novels on a PS2 emulator on my i5 @ 4.00GHz with 4GB of DDR3 RAM. Unless you find a way to efficiently emulate all the hardware in the PS2, it is impossible.
Quintasan said:
No, just no. It can't be done with cellphones as @xdaywalkerx said. I have been able to play Guilty Gear and some visual novels on PS2 emulator on my i5 @ 4.00GHz and with 4GB of DDR3 RAM. Unless you find a way to efficiently emulate all the hardware in PS2 it is impossible.
PC processors and GPUs work completely differently from consoles; that's why it takes so much power to even try to squeeze out performance. Phones have the same, if not extremely similar, processors and GPUs (at least in how they are made and how they work).
Running an emulator on a phone is different from a PC. If the Droid can run Final Fantasy and other PlayStation 1 games, then what is the Galaxy going to be able to do with over 6x more graphics processing power?
keep on dreaming
Just stop, it is impossible. It doesn't matter if the architecture is similar; you're still emulating, which takes way more resources than the native machine requires.
Sent from my PC36100 using XDA App
namcost said:
PC Processors and GPUs work completely different then consoles, that's why it takes so much power to even try to squeeze out performance. Phones have the same if not extremely similar processors and gpu's (at least how they are made and how they work).
Running a emulator on a phone is different then a PC. If the droid can run final fantasy and other games from playstation one, then what is the galaxy gonna be able to do with over 6x more graphics processing power?
You can't just take theoretical numbers like that and assume that because the Hummingbird can crunch out (throwing a random number right here) 15 million polygons/second natively, it can also crunch out 15 million polygons/second while emulating a PS2 title.
As xdaywalkerx said, the Emotion Engine is much more difficult to emulate than the PlayStation 1's MIPS R3051. PS2 emulation is not even well done on Windows computers; not necessarily because of a lack of CPU/GPU power, but because of the difficulty of emulating the hardware accurately.
Hell, the Droid can't even run every single PS1 title available, even when overclocked.
How about a PSP emu? Some PSP games look and feel like PS2 games.
Maybe possible with very dumbed-down graphics and super-low resolution... but then would it look like PS2? Probably not.
SNES StarFox and Stunt Race FX don't run full speed on my Galaxy S.
Burnout 3? Vice City? GOW? MGS2? No chance.
But a Sega Saturn emulator...well...
I've seen the Captivate run Crash Bandicoot 3 in a PSX emu at full speed with no problems, just a lack of control, since it's a touchscreen and the game requires quick reactions...
It's simply not possible.
I'd say... it won't work. The processor wouldn't even run it, and the GPU would fail.
However, a PSP emulator could potentially work.
The facts
You see, a standard PSP (not the PSP Go) automatically clocks up to 333MHz for SOME games; that 333MHz is the maximum, and most games run at 266MHz. To emulate something you need roughly 4 times the processing power, and for graphics you would also need a decent GPU.
So processing-wise, a PSP emulator for phones is actually very possible, and the graphics could possibly be pulled off.
But this would only work on high-end phones with a decent enough screen size, e.g. the Streak or the Droid (X), to name a few.
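The rough "4 times the processing power" rule above can be turned into a quick back-of-the-envelope check (the 4x factor is an old rule of thumb, not a measured constant, and raw clock speed ignores per-cycle differences between architectures):

```python
OVERHEAD = 4  # rule-of-thumb emulation cost multiplier, not a hard law

psp_clocks_mhz = {"typical": 222, "common max": 266, "absolute max": 333}

for mode, mhz in psp_clocks_mhz.items():
    # naive estimate: host clock needed to emulate the guest at full speed
    print(f"PSP at {mhz} MHz ({mode}) -> ~{mhz * OVERHEAD} MHz host")
# 222 -> 888, 266 -> 1064, 333 -> 1332
```

By this crude measure a 1GHz phone only clears the bar for 222MHz titles, which matches the "possible, but not at full speed for everything" conclusion.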
Edit:
Did some research.
The pixel fill rate of the PSP's GPU is 664 megapixels per second; on a high-end phone the GPU manages around 133 to 250 megapixels per second. The PSP does 33 million triangles a second, whereas we get possibly 7 to 22 million triangles per second. This shows that fully emulating a PSP at speed would be impossible. You COULD emulate it; it just would never run at full speed.
So if a PSP won't run perfectly, I'm afraid a PS2 emulator won't either.
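Plugging the GPU numbers from the edit above into the same kind of sanity check (all figures are the quoted peaks, not benchmarks):

```python
psp_fill_mpix = 664           # PSP pixel fill rate, Mpixels/s (quoted above)
phone_fill_mpix = (133, 250)  # quoted range for high-end phones of the era

psp_tris_m = 33               # PSP peak, millions of triangles/s
phone_tris_m = (7, 22)

print(f"Fill-rate gap: {psp_fill_mpix / phone_fill_mpix[1]:.1f}x "
      f"to {psp_fill_mpix / phone_fill_mpix[0]:.1f}x")   # 2.7x to 5.0x
print(f"Triangle gap:  {psp_tris_m / phone_tris_m[1]:.1f}x "
      f"to {psp_tris_m / phone_tris_m[0]:.1f}x")         # 1.5x to 4.7x
```

The phone GPU is 2.7x to 5x short on raw fill rate before any emulation overhead is even applied, which is the core of the argument here.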
Synyster_Zeikku said:
Pixel Fill Rate of the PSP's GPU is 664 Megapixels per second, on a high end phone the GPU is around 133 to 250 Megapixels per second. The PSP does 33 Million triangles a second.. Whereas, we'll get possibly 7 to 22 million triangles per second. This shows that even a emulating a PSP entirely would be impossible... However you COULD emulate it. It just never would be full speed..
So if a PSP, won't run perfectly, I'm afraid a PS2 emulator won't.
Samsung Galaxy S is rumored to be super powerful compared to the measly droid.
It is also rumored to have 90 million triangles per second.
http://www.androidpolice.com/2010/07/03/samsung-galaxy-s-is-a-beast-runs-quake-3-perfectly/
I hate to be an ass, but the PS3 does 250 million triangles per second from what I've seen around the web (the RSX chipset, or whatever it is), and the PSP is nowhere near that. I've also seen a comparison (which I don't really believe) saying the 360 does 500 million triangles per second.
"66 million vertices / triangles per second calculated by the Emotion Engine, and 75 million triangles per second can be drawn by the Graphics Synthesizer (obviously the EE can only feed 66M per second to the GS, thus as a result the EE can never overload the GS)"
"PSP can *calculate* 33 or 35 million vertices / triangles per second at the full 333 MHz clock frequency, which is currently restricted to 222 MHz, so that cuts the vertex / triangle rate down by 1/3. So, this 33~35 million per sec is currently at about 22-23 million per sec at 222 MHz. Remember, this is the amount that can be transformed / calculated, so you can think of this PSP triangle/sec number as you would the 66M per sec that the Emotion Engine in the PS2 does."
http://www.tomshardware.com/forum/33327-13-versus-triangles-second
I still think it's possible with newer phones, especially if the dual-core 1.5GHz Snapdragon comes out at Christmas like it's rumored to...
You're confusing two entirely different things.
Yes, high-end Android phones are able to run games that are similar in graphics to the PSP/PS2.
But emulation? Impossible. To emulate a system, you generally need to be at least 3 times as powerful, and that's probably way too little.
If it was this easy, you'd think the people that made the PS2 themselves would be able to emulate it on the PS3.
Lesiroth said:
You're confusing two entirely different things.
Yes, high-end Android phones are able to run games that are similar in graphics to the PSP/PS2.
But emulation? Impossible. To emulate a system, you generally need to be at least 3 times as powerful, and that's probably way too little.
If it was this easy, you'd think the people that made the PS2 themselves would be able to emulate it on the PS3.
They did emulate it on the PS3; they took it out of the newer models for god knows what reason. I have the original PS3 from launch and it plays all my PS2 games without a hiccup...
And where do you get this 3x-more-powerful figure? If that were the case, my dual-core 3.0GHz AMD with 4GB of RAM and a 5770 should run PS2 games just fine, and it doesn't; it's laggy.
Emulation on a PC is massively different from emulating on a phone. Phones share more architecture with consoles than actual PCs do, hence why phones are only now hitting the 1GHz and 1.5GHz level. There are already videos of the Galaxy S running Crash Bandicoot 3 with the Droid emulator set to 60fps max and it runs perfectly, and I mean PERFECTLY (except for the lack of controls). The Galaxy S also runs Quake 3 Arena perfectly (again minus controls, but I think that one can be solved with a simple Bluetooth mouse and keyboard?).
It's possible, people just like to write it off... whatever, I'm done with this website, too many haters with no facts.
namcost said:
They did emulate it on the PS3, they took it out on the newer models for god knows what reason. I have the original PS3 from launch and it plays all my PS2 games without a hickup.....
And where do you get this 3x more powerful, if that's the case, my dual core amd 3.0ghz with 4 gig of ram and a 5770 should run ps2 games just fine and it dont, its laggy.
Emulation on a PC is massively different then emulating on a phone. The phones shares more architecture with consoles then actual PC's do, hence why phones are just now hitting the 1ghz and 1.5ghz level. There are already videos of the galaxy s running crash bandicoot 3 with the droid emulator set to 60fps max and it runs perfectly, and I mean PERFECTLY. (except lack of controls). The Galaxy S also runs quake 3 arena perfectly (minus lack of controls, but that one i think can be solved with a simple bluetooth mouse and keyboard?).
Its possible, people just like to write it off..... w/e, I'm done with this website, too many haters with no facts.
Well, the emulation process is the same on all architectures: you create a virtual machine and "translate" the guest code into something the device's architecture understands. Of course it's not that simple, but I hope you get the idea. Even if somebody wrote a PS2 emulator, I doubt it would get over 5 fps.
Quake 3 runs smoothly because it's running natively (the engine was ported to ARM, and the GPU supports OpenGL, which Quake uses). Maybe PSX runs great on the Galaxy S, but even my very old PC with a Pentium III 400MHz and a GeForce 2 MX could run it at full speed.
Oh, and your PS3 runs PS2 games smoothly because the first consoles had the PS2's chips inside. They removed them later.
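The "translating" step described above is where the cost goes: every guest instruction becomes many host operations. A toy fetch-decode-execute loop for an invented three-instruction machine (nothing here is real MIPS or Emotion Engine code) shows the shape of that overhead:

```python
# Toy interpreter for a made-up guest ISA. Each single guest instruction
# costs a fetch, a decode (tuple unpack plus a branch chain), and register
# bookkeeping on the host; that per-instruction multiplier is the tax
# every emulator pays, and a dynarec only reduces it, never removes it.
def run(program, max_steps=10_000):
    regs = {"r0": 0, "r1": 0}
    pc = 0
    steps = 0
    while pc < len(program) and steps < max_steps:
        op, *args = program[pc]           # fetch + decode
        if op == "addi":                  # addi reg, imm: reg += imm
            reg, imm = args
            regs[reg] += imm
        elif op == "mov":                 # mov dst, src: dst = src
            dst, src = args
            regs[dst] = regs[src]
        elif op == "jnz":                 # jnz reg, target: branch if reg != 0
            reg, target = args
            if regs[reg] != 0:
                pc = target
                steps += 1
                continue
        pc += 1
        steps += 1
    return regs

# Count r0 down from 5 to 0 with a branch loop.
prog = [("addi", "r0", 5), ("addi", "r0", -1), ("jnz", "r0", 1)]
print(run(prog))  # {'r0': 0, 'r1': 0}
```

Now imagine this loop also modeling the EE's vector units, the GS, the SPU, DMA, and the timing between them, all in lockstep; that is why clock-for-clock comparisons undersell the cost.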
How about you get your facts straight first?
It was on the first batch of PS3s because Sony put some of the PS2s hardware in the PS3, as they couldn't possibly launch without backwards compatibility.
They took the PS2 hardware out later to reduce costs.
Emulation on phones is not "massively different" from PCs. Our phones use ARM-architecture CPUs, while the PS2 uses MIPS processors in its Emotion Engine.
make an emulator that works and we will buy it. shouldn't be hard since you seem to know a lot about it
Related
Well, we all know about the under-performing WM devices with Qualcomm CPUs, especially when it comes to graphics, for example the MSM7200(A) & MSM7201A devices such as: HTC Touch Pro, HTC Diamond, HTC HD, HTC TyTN II, etc.
Reading different docs from Qualcomm:
QUALCOMM provides a wide range of best-in-class integrated graphics solutions, with the MSM7200 comparable to the DS or PSP
APIs Accelerated: OpenGL ES 1.0 Common + some OpenGL ES 1.1, Direct 3D Mobile, SM2, JSR 184, BREW Render2D, Direct Draw, GDI
“Qualcomm Announces Highly Integrated Dual-CPU Single Chip Solutions for High-Performance Multimedia Wireless Devices”
“The 7xxx series addresses the growing consumer demand for higher-performance wireless devices delivering high-quality audio-visual and 2D/3D gaming”
“Very high performance 2D and 3D graphics, and video encode and decode support”
Peak performance: 3D: 4M tris/sec, 2D: 133M pixels/sec
Source:
http://www.qualcomm.com/news/releases/2003/press1217.html
http://sakajati.com/download/?nav=display&file=70
You can see how pathetically such a Qualcomm device performs here: http://www.youtube.com/sergiowmo . You can find many videos where I've tested different games. They say the graphics are comparable with the PSP... just look at the emulator comparison.
Where is the promised graphics performance?
Can this company be sued over these notorious lies?
I'm starting to hate this company (Qualcomm) more and more. I don't care if HTC has to pay for drivers, etc.; they should provide what they promise. And anyway, the TP and Diamond come with OpenGL HW drivers... and let's be real, the performance is extremely poor in comparison to other HW-enabled devices (older devices, devices with slower CPUs).
What irritates me most is the high graphics performance Qualcomm advertises while the chips perform so badly in reality. In other words, I hate it when someone misleads and lies this way... just think of the PSP (PlayStation Portable) comparison.
I learned my lesson with the HTC Touch Dual: the brand's models are beautiful, but slow and dumb, like that crazy blonde you will always stay away from in the future.
It's not Qualcomm's fault; the devices might be capable of this performance, but it's HTC's fault for refusing to pay for drivers.
But the latest HTC devices DO have drivers for 3D graphics. They have OpenGL hardware libraries, and I'm really not satisfied with the 3D performance.
I've already seen this excuse so many times... but ALL the WM devices using Qualcomm CPUs are worse. What about other vendors, such as LG and Toshiba? Have none of those companies paid for the drivers? There's something else going on...
http://www.glbenchmark.com/compare....tege&D2=HTC P4550 TyTN II (Kaiser)&D3=LG KS20
What's with this crappy Qualcomm policy? I haven't heard anything like this about Marvell, promising one thing and delivering another.
The 2D graphics aren't hardware accelerated either, indeed... at least that's how it looks.
@twolf Samsung Omnia looks good and it's fast and smart too, thanks to the Marvell CPU. Unfortunately there's no OpenGL support .
Excellent thread. Just a couple of things to bear in mind though:
- There are developers working on graphics here.
- The Mobinnova ICE and LG Incite are both said to have the same processor as the SE X1 Xperia, Touch Diamond, Touch Pro and the HD, i.e. the Qualcomm MSM7201A 528Mhz processor. Hence we should wait for reports from owners of these devices to see if any drivers have come to fruition. The reason I say this is because in the past, it seems some drivers were taken from the LG KS20 (same MSM7200 400Mhz processor as TytN 2, Touch Cruise, Touch Dual and Sprint CDMA Touch) in the past for the aforementioned.
However, this is an important thread, so please keep it going, unless it becomes clear that Qualcomm can't be brought to account here.
DSF said:
@twolf Samsung Omnia looks good and it's fast and smart too, thanks to the Marvell CPU. Unfortunately there's no OpenGL support .
I was playing around with an Omnia at the local Verizon Wireless store [only US carrier to offer it], and I was really f**kin impressed! The camera was as responsive as a good digital, and the screen/mouse combo is beautiful!
I am not an expert, but I totally agree about the disappointing Qualcomm performance...
Let's just hope that HTC has noticed this and keeps it in mind for the next generation of '09 models just about to launch!
Great thread!
@nuke1 I've read that topic, but no resolution yet. Also, many people are benchmarking using different D3D drivers, which is wrong, because that benchmark tool uses OpenGL, not D3D (which is only a wrapper for OpenGL, D3D<=>OpenGL).
So... we have 3D hardware-accelerated drivers (unlike previous HTC devices, such as the TyTN II), but we get poor performance. I would really like to believe that porting the drivers from another device (e.g. the LG Incite) will improve the (3D/2D) graphics experience on our phones.
@orb3000, actually they will continue the partnership with Qualcomm; HTC is very proud of the collaboration. CEO Peter Chou said something like:
"Qualcomm is one of our top, very important ??? partnerships... and I believe that this partnership will continue for the next 10-20 years"
See http://www.qualcomm.com/who_we_are/success/index.htm#/HTC-video/ .
So HTC seems pretty happy with the Qualcomm solution... yeah, I know that Qualcomm provides a chipset (SoC) implementing various functions, such as and not limited to: CPU, GPU, GPS, wireless...
Thanks guys for your support. Now let's spread this to a global scale! I've had enough of Qualcomm's lies or whatever.
@NotATreoFan I really hope that Samsung will focus on more powerful WM devices and get rid of the damn proprietary connectors! Samsung really has potential.
I totally support this thread.
I bought my Diamond with a 528MHz CPU. That means this phone should deliver the power of a 528MHz-CPU-equipped phone, and I don't care how that is done, or whether it only happens in later models. Tbh, I would feel screwed if they only used the CPU properly in upcoming phones.
Also, I wonder why we have 3D acceleration but no 2D? Anyway, I feel there is some kind of 2D slowdown. My previous smartphone was a Siemens SX1 (S60, Symbian 6.1, 120MHz OMAP CPU, released in 2003 or 2004, 176x208 resolution). On it I could play Sega Master System games fluently with sound (SMS Plus S60, free and excellent), Picodrive nearly perfectly (no sound on S60, some frameskip, but nowhere close to the Touch Pro videos on YouTube), and GBA playably (vBagX trial, commercial). Here, Picodrive is not tested, Sega Master System via MorphGear (at least the trial) is less fluent, and GBA via PocketGBA has too much frameskip. And please don't tell me it is all because the screen is X times bigger; the CPU architecture should be better. Come on, we also have a CPU to emulate, plus sound and input... And I am pretty sure the SX1's OMAP had no graphics chip inside.
Just wanted to jump in and make a quick small comment:
I have a vogue, it is terrible (speed, graphic performance etc..)
I also have Android on that Vogue, and when using Android things are SSOOOO much smoother. It blows my mind, and makes me realize how much of it is truly the fault of another crap MS product.
In android I can flick that screen to scroll in the browser and it is so smooth with never a single hiccup or anything of the sort...
My 2 cents,
Jim.
That's one of the reasons why Sony broke up with HTC.
I guess Sony will continue on the WinMo platform, maybe in partnership with another company (ASUS?).
HTC should be worried now that Sony is in town.
I bet we will start to see better devices like the second-generation Xperia soon.
And after having the TyTN, Diamond, etc., I can confirm the Xperia's performance blows them all away. OMG, no comparison here: it's a step further.
Here is what the iPhone can do on an ARM-based CPU running at 400MHz with hardware-accelerated graphics. Notice the framerate (smoothness).
http://www.techeblog.com/index.php/tech-gadget/need-for-speed-iphone
It disgusts me that the 528MHz Qualcomm chip, which has 3D acceleration (but no working drivers), relies on the CPU core to do all the "desktop" Windows drawing and even 3D (unless you use hacked drivers).
Two things!
1. CPU or GPU overload?
Today I did a small test. I figured that if the CPU were overloaded, any extra load would slow the emulator down. So I fired up WMP with an MP3 alongside the emu, and felt no slowdown at all.
Don't forget: we don't need drivers for the CPU. We need them for the GPU.
2. HTC/Qualcomm vs math
The MSM7201A can render 133,000,000 pixels a second.
Fluent gameplay is 60 fps.
Dividing 133M by 60 gives 2,216,666 pixels per 1/60th of a second.
A VGA screen is 640*480 = 307,200 pixels.
So how many screens can we fill in 1/60th of a second? Let's see: 2,216,666 / 307,200 = about 7.2 screens.
So where do we lose the other 6 screens?
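Redoing that arithmetic as a check (with the units straightened out: 133M is pixels per second, so dividing by 60 gives pixels per frame, not frames):

```python
fill_rate = 133_000_000   # MSM7201A quoted peak fill rate, pixels/sec
fps = 60                  # target for fluent gameplay
vga = 640 * 480           # 307,200 pixels per full-screen redraw

pixels_per_frame = fill_rate / fps
screens_per_frame = pixels_per_frame / vga
print(f"{pixels_per_frame:,.0f} pixels per frame")       # 2,216,667
print(f"{screens_per_frame:.1f} VGA screens per frame")  # 7.2
```

So on paper there is about 7x of overdraw headroom at 60fps; the "missing screens" are the gap between the datasheet peak and what the unaccelerated driver path actually delivers.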
p3ngwin said:
here is what the Iphone can do on an ARM based CPU running at 400Mhz with hardware accelerated graphics. notice the framerate (smoothness).
That's the industry-favorite 3D, powered by PowerVR from Imagination Technologies. If you look at most of the great-performing 3D out there right now (N95, etc.), it usually has PowerVR components in it, or Nvidia, NOT ATI.
It also helps that they optimized the drivers end-to-end across the system (instead of optimizing for TouchFlo), and it's not using HTC or Qualcomm parts.
NuShrike said:
That's the industry favorite 3D powered by PowerVR from Imagination Technologies. If you look at most of the great performing 3D out there right now (N95, etc), it's usually has PowerVR components in it or Nvidia, NOT ATI.
It also helps they optimized the drivers in the system end-to-end (instead of optimizing for TouchFlo), and it's not using HTC nor Qualcomm.
They have different graphics hardware, yet we should be getting an experience that is at LEAST recognizably in the same graphics ballpark, as opposed to the pathetic, sorry state we have now.
p3ngwin said:
here is what the Iphone can do on an ARM based CPU running at 400Mhz with hardware accelerated graphics. notice the framerate (smoothness).
http://www.techeblog.com/index.php/tech-gadget/need-for-speed-iphone
Ah yes... Need for Speed mobile... I need a fruit phone now!!
Oh yeah, how great it is. I understand the graphics are really great, but the controls are tragic. Playing this with the accelerometer must be a pain. I played Asphalt GT racing on an iPhone and turned it off after 2 minutes (I'm surprised I lasted that long).
If Apple wants to make the iPhone good for mobile gaming, give it some physical buttons, or help the guys working on iControlPad and release it ASAP.
For now the iPhone has nice graphics but falls down on controls.
Unfortunately, all phone manufacturers (including HTC) seem to be getting rid of d-pads.
What you get on the Diamond is barely usable: up/down is good, left/right is usable with one hand only if you want to be fast (left hand for right, and right hand for left), and the center button is all too easy to press.
back to topic: thanks to Qualcomm even if we had awesome joysticks we wouldn't be able to play better games with them.
If you know how to use dpad on TD it gets quite nice.
Besides, a poor d-pad is better than no d-pad; at least you don't have to twist your arms like a madman trying to turn in games like Need for Speed or Asphalt racing.
Wishmaster89 said:
If you know how to use dpad on TD it gets quite nice.
Besides poor dpad is better than no dpad, at least you don't have to twist your arms like madman trying to turn when playing games like need for speed or asphalt racing
Poor dpad is inexcusable on a $800 device. You also must be a horrible Wipeout player.
I've just emailed the folks at Rubicon Development to complain that their Great Little War Game (which is a great game, btw) is laggy on my ASUS Eee Pad Transformer (Tegra 2 SoC) when you set the detail to high or ultra. Setting it to medium plays smoothly. Someone posted a review in the Android Market praising how smoothly the game runs even on his Incredible S, which uses the Adreno 205 GPU. Running the game on high/ultra adds an excellent water effect, which is what causes the slowdown on the Transformer.
I highlighted this to the Rubicon developers, and one of them said that while the Tegra 2 CPU is powerful, the GPU isn't as powerful as the Adreno 205 in the Incredible S. I find that really, really hard to believe, because the benchmark figures here show Tegra 2 even beating the PowerVR GPU in the Nexus S, which is more powerful than the Adreno 205:
http://androidandme.com/2011/03/news/tegra-2-benchmarks-motorola-atrix-4g-vs-lg-optimus-2x/
So I emailed them to see if they can optimize it for Tegra 2, because as I understand it, you have to optimize your game for Tegra 2 chipsets to make full use of the GPU, much like games such as Riptide GP, which has excellent graphics that I don't believe the Adreno 205 could render that brilliantly.
Could someone shed some light on this?
Update:
Here's what the developers of "Great little war game" have to say about tegra 2 tablets:
We've done just that with v1.0.4 - have a play in the new settings screen.
We've kinda done all we can do now, tbh. Of the 4 different GPUs in Android phones, Tegra 2 comes bottom, and it's in most of the devices with the biggest screens.
---
Can you believe that? It sounds to me like they are getting lazy rather than trying to optimize for Tegra 2. Right now, with the new update, they have two modes in the game: fast graphics and best graphics. Fast graphics mode appears to set the game resolution to a horrible 640x480 to speed things up, which makes things very, very ugly on tablets, with heavy pixelation; best graphics uses the native resolution, so everything is sharp, but it slows down on certain larger levels of the game.
I suggested they change fast graphics to 800x600, which I believe will reduce the pixelation effect while still maintaining decent graphics for the player. What do you guys think?
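For reference, here is how those resolution options compare, assuming the Transformer's 1280x800 panel and treating the linear upscale factor as a rough proxy for how blurry the result looks (the game's exact letterboxing may differ):

```python
native = (1280, 800)   # Transformer panel
modes = {"fast (current, 640x480)": (640, 480),
         "fast (suggested, 800x600)": (800, 600)}

for name, (w, h) in modes.items():
    pixel_load = (w * h) / (native[0] * native[1])  # GPU work vs native res
    upscale = native[1] / h                         # vertical stretch factor
    print(f"{name}: {pixel_load:.0%} of native pixels, ~{upscale:.2f}x upscale")
# fast (current, 640x480): 30% of native pixels, ~1.67x upscale
# fast (suggested, 800x600): 47% of native pixels, ~1.33x upscale
```

800x600 renders about 56% more pixels than 640x480 but cuts the upscale from 1.67x to 1.33x, which is where the reduced pixelation would come from.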
The Nexus S should be about the same hardware as the Galaxy Tab, and I run Riptide GP at full quality equally smoothly on that thing. Tegra 2 is a kind of all-around lame chipset, but you really notice the single core when an email comes in: the TF keeps you splashing around while the Galaxy stutters badly for a moment... Also, there is no choice of chipset at the moment if you want a tablet.
Sent from my Transformer TF101 using Tapatalk
AlexTheStampede said:
Nexus S should be right about the same hardware as the Galaxy Tab, and I do run RipTide GP at full quality equally smooth on that thing. Tegra 2 is a kinda all around lame chipset, but you really notice the single core when a mail comes in, and the TF keeps you splashing around while the Galaxy stutters badly for a moment... Also no choice on the chipset at the moment if you want a tablet.
Sent from my Transformer TF101 using Tapatalk
The Nexus S is powered by a PowerVR GPU; it's not the same as the Galaxy Tab at all (I'm referring to the Galaxy Tab 10.1).
And I'm talking about the one they sold one year ago, the 7" Froyo little monster using the same Hummingbird coupled with an SGX 540 that the Nexus S has
Sent from my Transformer TF101 using Tapatalk
AlexTheStampede said:
And I'm talking about the one they sold one year ago, the 7" Froyo little monster using the same Hummingbird coupled with an SGX 540 that the Nexus S has
Sent from my Transformer TF101 using Tapatalk
How did you manage to run Riptide GP, which is a Tegra-only game? I suppose you used something like Chainfire 3D? (I haven't tried it, though.)
Exactly, Chainfire 3D and the Tegra plugin. No settings changed it just works, exactly like the pinball game. I didn't try any other game.
It requires a lot more power to render a game at 1280x720 than 800x480. As for which SoC is more powerful, I think benchmarks prove that Tegra 2 > Snapdragon S2.
AlexTheStampede said:
Exactly, Chainfire 3D and the Tegra plugin. No settings changed it just works, exactly like the pinball game. I didn't try any other game.
I see, cool. But it still doesn't make sense why Rubicon claimed that the Adreno 205 is more powerful than the Tegra 2.
Killer Bee said:
It requires a lot more power to render a game at 1280x720 than 800x480.
This part I totally agree with in terms of resolution, but to say the Adreno 205 GPU is more powerful than the Tegra 2 GPU is wrong in every sense. Just because a game plays well at 800x480 doesn't make that GPU more powerful than the Tegra 2, which plays the game at 1280x800 on tablets. At the same resolution as the Incredible S, I'm pretty sure the game would be just as sluggish or worse.
mlbl said:
Could someone shed some light on this?
Assuming the game runs at the Transformer's native resolution, the ULP GeForce would be required to push nearly three times as many pixels as the Adreno 205 in the Incredible S. Consider then that the ULP GeForce is maybe 50% faster in best-case scenarios.
As for Tegra optimizations, all that really amounts to is a proprietary texture format that only Tegra chips can use. It provides a few benefits... that can already be implemented in OpenGL ES 2.x anyway.
Edit:
Man it obviously took me a long time to type that.. got ninja'd three times over..
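The pixel arithmetic behind that "nearly three times" figure, with the post's assumed 50% best-case GPU advantage folded in:

```python
transformer = 1280 * 800    # 1,024,000 pixels
incredible_s = 800 * 480    # 384,000 pixels

pixel_ratio = transformer / incredible_s
print(f"Pixel ratio: {pixel_ratio:.2f}x")                     # 2.67x

gpu_speedup = 1.5   # assumed best-case ULP GeForce advantage from the post
per_pixel = gpu_speedup / pixel_ratio
print(f"Per-pixel headroom vs Adreno 205: {per_pixel:.2f}x")  # 0.56x
```

So even granting the Tegra GPU a 50% edge, it has roughly half the per-pixel headroom, which is enough to explain a scene that runs fine on the Incredible S dropping frames on the Transformer.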
mlbl said:
I see. cool. but it still doesnt make sense why Rubicon mentioned that the Adreno 205 is more powerful than the Tegra 2.
Just because they are developers, doesn't mean they aren't ignorant of certain hardware.
Sjael said:
Assuming the game runs at the Transformer's native resolution, the ULP GeForce would be required to push nearly three times as many pixels as the Adreno 205 in the Incredible S. Consider then that the ULP GeForce is maybe 50% faster in best-case scenarios.
As for Tegra optimizations, all that really is is a proprietary texture format that only Tegra chips can use. It provides a few benefits.. that can already be implemented in OpenGL ES 2.X anyway.
Edit:
Man it obviously took me a long time to type that.. got ninja'd three times over..
Just because they are developers, doesn't mean they aren't ignorant of certain hardware.
Your explanation makes a lot of sense, and yep, just because they are developers doesn't mean they're always right. It's possible they made a mistake there and just made some assumptions of their own instead of basing it on facts.
To begin with, "Tegra Zone" means quality exactly like, say, the Amazon Appstore does. Example? Galaxy on Fire 2 runs smooth-ish (I'd say 20 fps or so) on an iPhone 3G. Not a 3GS. The mighty 600 MHz (is it still underclocked to 400?) money printer Apple sold in 2008 with 128 MB of RAM. But hey, I'm sure the resolution of the screen is low enough to counterbalance the amazing graphics possible only thanks to Nvidia!
Anyway the Galaxy Tab runs at 1024x600, which is closer to the effective 1280x720 of the TF (the system bar takes the bottom 80 pixels, and no game uses those, obviously).
Here's what the developers of "Great little war game" have to say about tegra 2 tablets:
We've done just that with v1.0.4 - have a play in the new settings screen.
We've kind done all we can do now tbh, of the 4 different GPU's in Android phones, Tegra 2 comes bottom and its in most of the devices with the biggest screens.
---
Can you believe that? Sounds to me like they are getting lazy rather than trying to optimize for Tegra 2. Right now, with the new update, they have two modes in the game: a fast graphics and a best graphics mode. Fast graphics mode appears to set the game resolution to a horrible 640x480 to speed things up, but it makes things very, very ugly, with heavy pixelation on tablets; best graphics uses the native resolution, so everything is sharp, but it slows down on certain larger levels of the game.
I suggested they change this to 800x600 when toggling to fast graphics, which I believe would reduce the pixelation while still maintaining decent graphics for the player. What do you guys think?
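A quick sketch of how much each render target gets stretched on a 1280x800 panel (Transformer-class resolution assumed) shows why 800x600 should look much less blocky than 640x480:

```python
# How much each "fast graphics" render target would be stretched on a
# 1280x800 tablet screen (Transformer-class resolution assumed).

def upscale_factor(render, screen=(1280, 800)):
    """Per-axis stretch factor when the render target is scaled to the screen."""
    return (screen[0] / render[0], screen[1] / render[1])

print(upscale_factor((640, 480)))  # ~(2.0, 1.67): each pixel doubled horizontally
print(upscale_factor((800, 600)))  # ~(1.6, 1.33): a noticeably milder stretch
```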
Sjael said:
Just because they are developers, doesn't mean they aren't ignorant of certain hardware.
I gotta bump that one up. I'm in a development shop, and the majority of the developers wouldn't know what CPU they had if there wasn't a sticker on their laptop.
Understanding software and understanding hardware are completely different skill sets.
Well, it could just be GPU optimization. Adreno uses a VLIW5 design, similar to AMD GPUs, with an extra scalar unit, which apparently is hard to develop for but can yield great results on shader-heavy programs that are designed for it (or that just happen to favor it). The Adreno 205 does outpace the [email protected] in vertex-shader-heavy benchmarks... and most implementations of Snapdragon only use one of its two memory channels! (Because the first is PoP, but the second must be off-package.)
Sent from my Transformer TF101 using Tapatalk
while the Tegra 2 CPU is powerful, the GPU isn't as powerful as the Adreno 205 GPU on the Incredible S.
That's the most retarded thing I've ever heard. Rubicon mainly develops for the iOS platform; I'd reckon they are just lazy at optimizing for several different SOCs.
grainysand said:
That's the most retarded thing I've ever heard. Rubicon mainly develops for the iOS platform; I'd reckon they are just lazy at optimizing for several different SOCs.
There is another point I didn't address. The Adreno 205 has unified shaders. So does the SGX540. Tegra 2, however, has 4 pixel and 4 vertex shaders. So there are definitely situations, even entire genres, that could be slower on Tegra 2 than a benchmark would suggest.
Sent from my Transformer TF101 using Tapatalk
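A toy model makes the unified-vs-split point concrete (the core counts follow the post above; the workload numbers are purely illustrative):

```python
# Toy model: frame time under unified vs fixed shader partitioning.
# Workloads are (vertex_work, pixel_work) in arbitrary shader-cycles.

def frame_time_unified(vertex, pixel, cores=8):
    # Unified shaders: all cores share the combined workload.
    return (vertex + pixel) / cores

def frame_time_split(vertex, pixel, v_cores=4, p_cores=4):
    # Fixed 4+4 split: the more loaded pool dominates the frame.
    return max(vertex / v_cores, pixel / p_cores)

balanced = (40, 40)
vertex_heavy = (70, 10)   # e.g. a geometry-dense strategy game

print(frame_time_unified(*balanced), frame_time_split(*balanced))          # 10.0 10.0
print(frame_time_unified(*vertex_heavy), frame_time_split(*vertex_heavy))  # 10.0 17.5
```

On a balanced scene the two designs tie; skew the workload and the fixed split leaves half the cores idle while the other half becomes the bottleneck.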
I bought Great Little War Game 2 weeks ago, but after a few minutes of playing I cancelled the purchase and got a refund. Why? Because of the poor optimization for the Asus.
I also have this game on the iPad 2. It runs much more smoothly there.
Orion66 said:
I had bought Great Little War Game 2 weeks ago. But after few minutes of playing I canceled the purchase and refunded. Why? Because of poor optimization for Asus.
I have this game also on Ipad2. It runs smoothly and much better.
Generally it's not just the Asus; all the Tegra 2 tablets show poor performance in this game.
So is the issue that the Tegra 2 is in some respects slower, or that developers need to write with the way the Tegra 2 GPU works in mind?
Sent from my tf101 using xda premium 1.59Ghz
Who else has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel mailing list. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch, and he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the two board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore', with results posted on 3rd January 2013. The poster had already deleted them, but thanks to Google Cache I found the results.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand-new GPU they would have shouted about it at CES. The results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, although the evidence strongly suggests it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali T604 in an almost equal second place. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
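Taking the posted scores at face value, the gap works out as follows (the +10% driver headroom is just the Nexus 10 vs Arndale gap mentioned above, applied speculatively):

```python
# Relative GPU throughput from the Egypt HD offscreen scores above.
dalmore_fps = 32.6   # suspected Tegra 4 dev board
ipad4_fps = 49.6     # A6X (SGX 554MP4)

print(f"iPad 4 vs Dalmore: {ipad4_fps / dalmore_fps:.2f}x")            # ~1.52x
# If mature drivers recover ~10% (the Nexus 10 vs Arndale gap above):
print(f"With +10% drivers: {ipad4_fps / (dalmore_fps * 1.1):.2f}x")    # ~1.38x
```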
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I've posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy.
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320, Mali T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been revealed too; the only thing left is the Mali T658 coming with the Exynos 5450 (doubtful when it will release, and I'm not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any Gpu that could outperform iPad4 before iPad5 comes out? adreno 320, t Mali 604 now tegra 4 aren't near it. Qualcomm won't release anything till q4 I guess, and tegra 4 has released too only thing that is left is I guess is t Mali 658 coming with exynos 5450 (doubtfully when it would release, not sure it will be better )
Looks like apple will hold the crown in future too .
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher; even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns. It looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad:crying:
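The throttling behaviour described in that Anandtech summary can be sketched roughly like this (the 4W per-block budget is from the article as summarised above; the linear power-vs-clock assumption and the exact clock numbers are mine, purely illustrative):

```python
# Sketch of per-block power budgeting: the SoC tries to keep each block
# at or under a 4W budget. Illustrative numbers, not measurements.

def throttled_cpu_mhz(gpu_watts, cpu_max_mhz=1700, cpu_min_mhz=800,
                      soc_budget=8.0, cpu_budget=4.0):
    """CPU clock after the power manager reclaims the GPU's overdraw."""
    cpu_allow = min(cpu_budget, soc_budget - gpu_watts)
    # Crude assumption: CPU power scales linearly with clock.
    scale = max(0.0, cpu_allow / cpu_budget)
    return max(cpu_min_mhz, min(cpu_max_mhz, cpu_max_mhz * scale))

print(throttled_cpu_mhz(gpu_watts=1.0))  # light GPU load: full 1700 MHz
print(throttled_cpu_mhz(gpu_watts=4.0))  # GPU at its 4W cap: still 1700 MHz
print(throttled_cpu_mhz(gpu_watts=6.0))  # GPU over budget: CPU gets throttled
```

The point is structural: a chip that looks great on paper can lose a big chunk of CPU clock the moment the GPU eats into the shared budget.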
32 fps is a no-go... let's hope it's not final.
hamdir said:
32fps is no go...lets hope it's not final
It needs to improve, but it would be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5 GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true. Check GLBenchmark: in the offscreen test the iPhone scored 91, the Nexus 4 scored 88... that isn't wiping any floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
Its interesting how even though nvidia chips arent the best we still get the best game graphics because of superior optimization through tegra zone. Not even the a6x is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire 3D; I use it, and Tegra games work with effects and without lag even though I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of Nvidia SoCs has always been disappointing, sadly for the dominant GPU provider in the PC world.
The first, Tegra 2: its GPU was a little better than the SGX540 of the Galaxy S in benchmarks, but the chip lacked NEON support.
The second, Tegra 3: its GPU is nearly the same as the old Mali-400MP4 in the Galaxy S2/original Note.
And now it's better, but still nothing special, and it will soon be outperformed (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 as well; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimization porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices; equally, Chainfire is now outdated and no longer updated.
Now about PowerVR: they are only better in a real multicore configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multiple cores, each with its own subcores/shaders. If Tegra were used in a real multicore configuration it would destroy all.
Finally, this is really funny: all this doom and gloom because of a benchmark from an early, discarded development board. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any final device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe Tegra 4 will score only 2x the T3 with 600% more shaders?!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! This board seems full of it nowadays, and so little reasoning...
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true: Samsung has licensed PowerVR as well, it isn't exclusive to Apple, it's just that Samsung prefers ARM's GPU solution. Another thing I dislike is how everyone compares a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 which will. If you check the OP's link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra because Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 being only 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory versus single-channel on Tegra 2/3.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage?, well basically money, for a start iSheep will pay more for their devices, so they afford to design a big SoC and big batteries that may not be profitable to other companies. Tegra 4 is listed as a 80mm2 chip, iPhone 5 is 96mm2 and A6X is 123mm2, Apple can pack more transistors and reap the GPU performance lead, also they chosen graphics supplier Imagination Technologies have excellent products, Power VR Rogue will only increase Apple's GPU lead. They now have their own chip design team, the benefit for them has been their Swift core is almost as powerful as ARM A15, but seems less power hungry, anyway Apple seems to be happy running slower CPUs compared to Android. Until an Android or WP8 or somebody can achieve Apple's margins they will be able to 'buy' their way to GPU domination, as an Android fan it makes me sad:crying:
Well said mate!
I can understand how you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2 with a 1.2 GHz Exynos 4 and a Mali GPU overclocked to 400 MHz.
In my experience most modern games like MC4 and NFS:MW aren't running at 60 FPS at all; that's because the GPU is always at 100% workload while the CPU is relaxing, putting out 50-70% of its total workload.
I know some games aren't optimized for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T-604 with some tweaks instead of the mighty T-658, which would still be slower than the iPad 4.
Turbotab said:
There was a great article on Anandtech that tested the power consumption of the Nexus 10's Exynos 5250 SoC, it showed that both the CPU and GPU had a TDP of 4W, making a theoretical SoC TDP of 8W. However when the GPU was being stressed by running a game, they ran a CPU benchmark in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 Mhz as the system tried to keep everything at 4W or below, this explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450 which should beat the A6X, trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher, even on a more advanced 28nm process, which will lower power consumption I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in smartphone is going to suffer even more.
So why does Apple have an advantage?, well basically money, for a start iSheep will pay more for their devices, so they afford to design a big SoC and big batteries that may not be profitable to other companies. Tegra 4 is listed as a 80mm2 chip, iPhone 5 is 96mm2 and A6X is 123mm2, Apple can pack more transistors and reap the GPU performance lead, also they chosen graphics supplier Imagination Technologies have excellent products, Power VR Rogue will only increase Apple's GPU lead. They now have their own chip design team, the benefit for them has been their Swift core is almost as powerful as ARM A15, but seems less power hungry, anyway Apple seems to be happy running slower CPUs compared to Android. Until an Android or WP8 or somebody can achieve Apple's margins they will be able to 'buy' their way to GPU domination, as an Android fan it makes me sad:crying:
Typical "iSheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and PowerVR GPU together efficiently. Nvidia should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 ppi, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware, anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple, just that Samsung prefers using ARMs GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone compared a Tegra 4 which will arrive in a phone. If you check OP link the benchmark was posted on the 3rd of January with different results (18fps then 33fps), so there is a chance it'll rival the iPad 4. I love Tegra as Nvidia is pushing developers to make more better games for Android compared to the 'geeks' *cough* who prefers benchmark results, whats the point of having a powerful GPU if the OEM isn't pushing developers to create enhance effect games for there chip.
Hamdir is correct about the GPUs, if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine it only being 2x faster than Tegra 3. Plus its a 28nm (at around 80mm2 just a bit bigger than Tegra 3, smaller than A6 90mm2) along with the dual memory than single on Tegra 2/3.
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks? On XDA is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations for a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), OMG Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle & physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6-times increase in shaders does not automatically mean that games/benchmarks will scale in linear fashion, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's technical marketing manager, when interviewed at CES, said that the overall improvement in games/benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
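The non-linear-scaling point can be illustrated with a crude weakest-link model (all throughput numbers are made up for illustration; only the structure matters):

```python
# Why 6x the shader cores need not mean 6x the frame rate: the frame rate
# is limited by the slowest pipeline stage. Illustrative numbers only.

def fps(shader_rate, tmu_rate, rop_rate):
    """Frame rate capped by the weakest stage (arbitrary units)."""
    return min(shader_rate, tmu_rate, rop_rate)

tegra3 = fps(shader_rate=30, tmu_rate=60, rop_rate=60)       # shader-bound: 30
tegra4 = fps(shader_rate=30 * 6, tmu_rate=100, rop_rate=90)  # now ROP-bound: 90

print(tegra3, tegra4, tegra4 / tegra3)  # 30 90 3.0
```

With 6x the shader throughput but only modest TMU/ROP gains, the overall speedup lands around 3x, consistent with Nvidia's own "3 to 4 times T3" figure quoted above.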
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
hamdir said:
tegra optimization porting no longer works using chainfire, this is now a myth
did u manage to try shadowgun thd, zombie driver or horn? the answer is no, games that use t3 sdk for physx and other cpu graphics works can not be forced to work on other devices, equally chainfire is now outdated and no longer updated
Looks like they haven't updated Chainfire 3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I may have been unfortunate too, but some Gameloft games lagged on the Tegra device I had, though root solved it to an extent.
I'm not saying one thing is superior to another, just sharing my personal experience; I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department; many of its users, less so.
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral between Google's and Apple's ecosystems, and I'm not wiping scratches off any of my devices.
Just to get it out there, I'm not a complete idiot; obviously this is debatable due to architectural differences and what have you, but I think we're at the stage where modern phones can hold their own against the average PC.
For example, my laptop (AMD E-450 1.65 GHz dual-core/Radeon 6850/4GB RAM) plays 1080p video in a way that leaves a lot to be desired. YouTube 1080p plays at low fps, and local 1080p is just straight-up lag.
A lot of our devices today can breeze through such tasks without breaking a sweat and we take it for granted!
What do you think? Is your smartphone 'faster' than your PC?
I was thinking the exact same thing the other day. While I agree about the different architectures between phone and PC, my phones blow mine away as far as YouTube is concerned. Same as you really: sluggish lag on the PC when handling 1080p, but on my smaller devices it's a breeze. And my PC is neither low-end nor high-end by any means.
Sometimes my smartphone is even faster than my PC, which has an AMD Phenom II X4 955 BE processor and 4GB of RAM. But of course I can't complain too much, because on my PC I am one of the heavy users, and on my smartphone I am not. And in the future perhaps I will upgrade my PC (especially the RAM) :silly:
When comparing raw performance, x86 will almost always beat out ARM (the type of processor in your phone). When comparing performance per watt, ARM beats x86 -- that's why your phone usually outlasts your laptop.
However, there are a few other different reasons for the gap in performance.
1 - Hardware decoders. Most smartphones, in order to save CPU power and wattage, include video decoders in the hardware. That way, whenever you want to watch a YouTube video, the dedicated chip takes care of the processing and CPU usage remains minimal. On the PC side, hardware video decoders usually only appear in some types of video cards. Chances are, if you have a cheap, $349 notebook, you've got minimal processing power to start with, and the CPU gets stuck with everything.
2 - Software. If you took the same computer you have today and put a Linux distribution on it, you'd probably get better raw performance than Windows with the usual Windows overhead + PC maker crapware + spyware infections + whatever other applications are in the background combination I usually see. I say "probably better" because there are always PCs with hardware that isn't fully supported yet, which causes performance issues particularly when it comes to video hardware.
I have been thinking about this for some time now. My computer is way faster, with an Intel i7 2700K and a Radeon HD 7870, but compared to some laptops and low-end desktops our phones can handle just about the same things. Our phones can play some nice-looking games like Real Racing 3; I'm sure that game would make some low-end computers lag.
Not fast, but I can play Amazing Spider-Man and The Dark Knight on my phone,
but I can't play them on my PC :/
I have no idea, as Android still can't run Windows games. (What I wouldn't give to be able to run Skyrim on my tab... Splashtop is useless without WiFi.) But the blame for that lies with game developers, not Google or Android itself.
I need to replace the GPU in my laptop as it's starting to become a bottleneck (a GT 130M), but other than that, my laptop is probably faster.
Also, the screen resolution on my phone is WAY lower than my computer's, so it's a seriously unfair comparison; lower resolution always means a higher framerate, as there is less to render.
On a 1280x800 screen, it only has to render about half the pixels of a Full HD video, as opposed to the full 100% (and more) on a 1920x1080 screen.
Send From My Samsung Galaxy S3 Using Tapatalk 2
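For what it's worth, the pixel arithmetic is easy to check (taking Full HD as 1920x1080):

```python
# Pixel counts: rendering/decoding work scales with output resolution.
phone = 1280 * 800      # WXGA phone/tablet panel
full_hd = 1920 * 1080   # 1080p desktop monitor

print(f"The phone renders {phone / full_hd:.0%} of a Full HD frame's pixels")  # ~49%
```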
Yeah, my PC is like a slow runner (need to find a new one!!!!), while my new Xperia is like an ALIENWARE!! Thanks SONY Corporation.
I think the PC is faster. My Alienware beats the s**t out of my Note 2 any day.
My PC is faster.
I can certainly see that this is not the case for everyone. My friend's laptop can't run a DS emulator very well; my phone can at about 2/3 speed, and my PC can at full speed.
My PC can run both an HD Game of Thrones rip (we own it on Blu Ray) and run the Dolphin Game Cube emulator playing Twilight Princess at full speed - at the same time.
And it's not even that great a PC - it's just a mid-range gaming set-up that I built because I wanted to play GW2 and Skyrim (not the most demanding games) at maximum shininess.
Did some googling, and it appears today's phone CPUs would be roughly equal to a Pentium D computationally.
You need a few more generations to really close the gap. Any perception of them being faster comes from the overall architecture of the phones.
Sent from my SCH-I605 using xda premium
The PC is faster; my Galaxy Mini is slow, my Galaxy W is normal.
Sent from my GT-I8150 using xda app-developers app
Additions from a stranger
crayz9000 said:
When comparing raw performance, x86 will almost always beat out ARM (the type of processor in your phone). When comparing performance per watt, ARM beats x86 -- that's why your phone usually outlasts your laptop.
However, there are a few other different reasons for the gap in performance.
1 - Hardware decoders. Most smartphones, in order to save CPU power and wattage, include video decoders in the hardware. That way, whenever you want to watch a YouTube video, the dedicated chip takes care of the processing and CPU usage remains minimal. On the PC side, hardware video decoders usually only appear in some types of video cards. Chances are, if you have a cheap, $349 notebook, you've got minimal processing power to start with, and the CPU gets stuck with everything.
2 - Software. If you took the same computer you have today and put a Linux distribution on it, you'd probably get better raw performance than Windows with the usual Windows overhead + PC maker crapware + spyware infections + whatever other applications are in the background combination I usually see. I say "probably better" because there are always PCs with hardware that isn't fully supported yet, which causes performance issues particularly when it comes to video hardware.
to add to what this poster has already stated:
3 - Processor channels. They are kind of like interstate highways for instructions: data the hardware recognizes as supported passes through without a whole lot of extra emulation or superfluous handling. Apparently ARM has many more channels than IBM or AMD; google it and be surprised.
4 - OpenStack. Why compare differences and get all caught up in what can and can't be done on an individual hardware set-up? Instead, let's join them all together into one virtual machine and never worry about speed or RAM again. This is what I'm working on, and there is promise that one day we will all be able to run any program or operating system on any collection of old hardware.
My HOX, now with ViperX 3.6, is equal in speed to my laptop (a Lenovo P4).
My PC is still faster.
Sent from a public toilet.
Yeah I agree my HTC is faster than my old P4 Windows XP.
Well, yes. Nexus One (1GHz, 512MB RAM) vs Athlon (850MHz, 364MB RAM).
My htc desire plays 720p 10x faster than my P4 desktop. It's a real shame, my mother paid 3 grand for that HP back in 2004. Look at the progression of technology...
It's a shame that my phone is faster than my PC, but thanks to my PC I can finish my assignments faster. Can I use my phone to do that? LoL :what:
Sent from my E15i using xda premium
PC: 4GB Dragon RAM, ATI Radeon HD 4870, AMD Phenom quad-core, 1TB Samsung disk with Linux Mint 14 Nadia (Xfce environment) vs. Galaxy S3 i9300
Sent from my GT-I9300 using Tapatalk 2
Can someone post the best and fastest version of PPSSPP?
And can it run heavy games? I downloaded God of War: Chains of Olympus but it is very slow; the FPS mostly stays at 5-10 and doesn't exceed 10,
which makes the game unplayable.
Or is this a hardware problem?
My phone is the LG Optimus 3D P920:
1 GHz dual-core processor
512MB dual-channel RAM
PowerVR SGX540 (GPU)
I saw a video on YouTube showing PES 2013 on a Galaxy S2 running at 60 fps.
I think the PSP processor is 333MHz, and that's lower than a 1 GHz dual core.
The problem is the RAM. Also, MIPS is one ugly RISC processor, hence a PS2 emulator has to run on a personal supercomputer (a four-core 64-bit x86 CPU - either Phenom II or Core i7 - a GeForce 9800 / Radeon HD 4670, and 2GB RAM being the minimum), which is one reason why it doesn't end pretty. And you have to continuously tweak the plugins to achieve the best frame rate (I had to do so for a week to get 50+ fps out of the PS2 emulator on my PC).
EDIT: The 333MHz MIPS CPU clock is correct, and there is 32MB DDR II memory in it - I am going to shoot for a 533 MHz clock (266 MHz base FSB clock). Yet it doesn't help one bit - I suspect the PSP CPU is a superscalar in-order MIPS-6000 (?) processor, quite similar to the 299 MHz Emotion Engine (a custom superscalar MIPS CPU) inside the PS2, so emulating it requires four threads out of superscalar pipelines, plus a superscalar vector FPU.
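To see why a 333 MHz guest CPU is still out of reach for a 1 GHz phone, here is a back-of-envelope sketch. The cycles-per-instruction figures are made-up illustrative assumptions, not measurements; the point is only that an interpreting emulator burns many host cycles per guest instruction, so raw clock speed alone is misleading.

```python
# Rough upper bound on the guest clock an interpreting emulator can sustain.
# host_cycles_per_guest_instr is an assumed overhead factor, not a measurement.

def max_emulated_mhz(host_mhz, host_cycles_per_guest_instr):
    """Best-case guest clock (MHz) the host can interpret at full speed."""
    return host_mhz / host_cycles_per_guest_instr

# Assume a plain interpreter spends ~20-50 host cycles per guest instruction:
print(max_emulated_mhz(1000, 20))   # 1 GHz phone, optimistic -> 50.0 MHz
print(max_emulated_mhz(1000, 50))   # 1 GHz phone, pessimistic -> 20.0 MHz
print(max_emulated_mhz(3000, 20))   # 3 GHz desktop, optimistic -> 150.0 MHz
```

Even the optimistic phone figure falls far short of the PSP's 333 MHz, which is why dynamic recompilation (translating guest code to native code, as mentioned earlier in the thread) is essential.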
Sent from my LG-P920 using xda premium
I don't understand.
What is MIPS, and what is a RISC processor?
With overclocking my phone gets 20-25 fps; it doesn't reach 60 or even 30.
Although after overclocking my RAM score in AnTuTu Benchmark is 1260, nearly the same as the Galaxy S2, the Galaxy S2 gets 60 fps in PPSSPP!!!!
Here is the video: http://www.youtube.com/watch?v=gYx6pi24wdw
I don't think it is a RAM problem.
mktns said:
I don't understand.
What is MIPS, and what is a RISC processor?
With overclocking my phone gets 20-25 fps; it doesn't reach 60 or even 30.
Although after overclocking my RAM score in AnTuTu Benchmark is 1260, nearly the same as the Galaxy S2, the Galaxy S2 gets 60 fps in PPSSPP!!!!
Here is the video: http://www.youtube.com/watch?v=gYx6pi24wdw
I don't think it is a RAM problem.
Google it.
You don't know how Playstation systems work.
The hardware there is a little bit different.
MIPS is the ISA for that specific processor, and RISC = Reduced Instruction Set Computing - basically a simple processor, defined mainly by the registers inside the CPU and its on-die cache RAM. Your phone has a RISC processor too - ARM (Advanced RISC Machine).
The reason I said MIPS is ugly is that it has a few oddball registers which usually break the floating-point expectations of other processors (Phenom II is based on a PowerPC-like RISC engine, but still), so emulators have to work significantly harder just to get it right. Also, RAM is still a problem - I know of Sony's rather strange habit of using surprisingly little RAM: they used a modified version of compcache, basically compressing the game instructions into small RAM slices.
And why is the RAM a problem? You guessed it: the MIPS virtual machine eats a lot of memory - the PS2 emulator eats 400 MB just to run a game at a decent framerate on my PC. Now granted, the PSP is basically a portable version of the PS2. And the Galaxy S II has 1 GB RAM - I know because I have a Captivate Glide, basically an S II with a built-in keyboard (need to get a new screen).
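The compcache idea mentioned above can be sketched in a few lines: compress rarely touched memory pages so more of the guest fits in limited RAM. This is only an illustration using zlib on made-up page contents, not how Sony's implementation actually works.

```python
import zlib

# Pretend this is a 4 KB-ish guest memory page, mostly zero-filled,
# as game data pages often are. The contents are invented for illustration.
page = b"\x00" * 3000 + b"game data" * 100

packed = zlib.compress(page)           # store the compressed form instead
print(len(page), len(packed))          # the compressed page is far smaller

restored = zlib.decompress(packed)     # decompress on access
assert restored == page                # lossless round trip
```

The trade-off is classic: you spend CPU cycles on compress/decompress to make a small RAM budget behave like a larger one.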
Sent from my LG-P920 using xda premium
Dr. Mario said:
MIPS is the ISA for that specific processor, and RISC = Reduced Instruction Set Computing - basically a simple processor, defined mainly by the registers inside the CPU and its on-die cache RAM. Your phone has a RISC processor too - ARM (Advanced RISC Machine).
The reason I said MIPS is ugly is that it has a few oddball registers which usually break the floating-point expectations of other processors (Phenom II is based on a PowerPC-like RISC engine, but still), so emulators have to work significantly harder just to get it right. Also, RAM is still a problem - I know of Sony's rather strange habit of using surprisingly little RAM: they used a modified version of compcache, basically compressing the game instructions into small RAM slices.
And why is the RAM a problem? You guessed it: the MIPS virtual machine eats a lot of memory - the PS2 emulator eats 400 MB just to run a game at a decent framerate on my PC. Now granted, the PSP is basically a portable version of the PS2. And the Galaxy S II has 1 GB RAM - I know because I have a Captivate Glide, basically an S II with a built-in keyboard (need to get a new screen).
Sent from my LG-P920 using xda premium
In a few words, game developers build their games to exploit the best of the PlayStation hardware, shaping the whole way the game runs so it performs perfectly on PlayStation systems. This leads to incompatibility with our normal hardware, and the actual emulation requires lots of RAM. Remember, you are emulating the whole PlayStation system; you don't have the hardware parts that run the game the way it runs on a real PlayStation.
I think I am right; if not, Dr. Mario, please correct me.
Yes, that's correct. Not to mention you have to remember that the PS2 and PSP both have in-order processors, basically marching in step with the time-slice kernel in the BIOS and the entire firmware (unless replaced with what's on the disc). Time-keeping is just tricky on out-of-order processors; thankfully they have special counter registers inside them, which the emulator uses.
EDIT: The use of a strange but rather special RTOS complicates matters in the emulated virtual machine a bit - what if you time the entire symphony of processor threads wrong? A lot of strange things can happen: 1. It will simply do nothing (most likely). 2. You won't be able to play games. 3. You get weird graphics glitches (ditto for #1). 4. You fry your TV or the PS2 (a lot less likely) / your phone (possible - if it freezes, better turn it off quickly). To get the correct framerate, you have to time the vector threads against the out-of-order processor's operating frequency, divided down to the real clock of the MIPS VM - 3 GHz / 10 = 300 MHz for a Phenom II, or 1 GHz / 3 = 333 MHz for an ARM Cortex-A9 - and keep it in order (the vector FPU seems to be the special case - why didn't Sony think of that? The VUs can execute out of order; they just have to crunch numbers independent of the CPU threads).
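The clock-ratio timing described above can be sketched directly: pick an integer divider of the host clock and run the guest that many cycles per slice. The divider values below are the ones from this post; the helper names are my own.

```python
# Derive the emulated guest clock from the host clock and an integer divider,
# as described above: 3 GHz / 10 -> 300 MHz, 1 GHz / 3 -> ~333 MHz.

def guest_hz(host_hz, divider):
    """Guest clock (Hz) obtained by dividing the host clock down."""
    return host_hz // divider

def guest_cycles_per_frame(host_hz, divider, fps=60):
    """How many guest cycles the emulator must execute per video frame."""
    return guest_hz(host_hz, divider) // fps

print(guest_hz(3_000_000_000, 10))          # Phenom II example: 300000000
print(guest_hz(1_000_000_000, 3))           # Cortex-A9 example: 333333333
print(guest_cycles_per_frame(3_000_000_000, 10))   # cycles per 60 Hz frame
```

Keeping the guest's per-frame cycle budget locked to the host clock like this is one simple way to hold the emulated framerate steady.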
Sent from my LG-P920 using xda premium
Thanks anyway, guys.
Although I don't understand most of Dr. Mario's words,
it seems you have to be a programmer to understand these things.
Yeah, and a hardware designer (electrical engineer, to be precise) too, so I have a good concept of how this works.
Sent from my LG-P920 using xda premium