[Q] help with PPSSPP?? - LG Optimus 3D

Can someone post the best and fastest version of PPSSPP?
And can it run heavy games? I downloaded God of War: Chains of Olympus, but it is very slow; the FPS never exceeds 10 and mostly sits at 5-10,
which makes the game unplayable.
Or is this a hardware problem?
My phone is the LG Optimus 3D (P920):
1 GHz dual-core processor
512 MB dual-channel RAM
PowerVR SGX540 (GPU)
I saw a video on YouTube showing PES 2013 running at 60 FPS on a Galaxy S2.
I think the PSP's processor is 333 MHz, and that's lower than a dual-core 1 GHz.

The problem is the RAM. Also, MIPS is one ugly RISC architecture to emulate, which is why a PS2 emulator has to run on what amounts to a personal supercomputer (a four-core 64-bit x86 CPU, either a Phenom II or a Core i7, a GeForce 9800 / Radeon HD 4670, and 2 GB of RAM being the minimum), and that's one reason it doesn't end pretty. You also have to continuously tweak the plugins to achieve the best frame rate (I had to do so for a week to get 50+ fps out of the PS2 emulator on my PC).
EDIT: The 333 MHz MIPS CPU clock is correct, and there is 32 MB of DDR II memory in it; I am going to shoot for a 533 MHz clock (266 MHz base FSB clock). Yet it doesn't help one bit. I suspect the PSP CPU is a superscalar in-order MIPS-6000 (?) processor, quite similar to the 299 MHz Emotion Engine (a custom superscalar MIPS CPU) inside the PS2, so emulating it requires four threads to cover the superscalar pipelines, plus a superscalar vector FPU.
Sent from my LG-P920 using xda premium

I don't understand.
What is MIPS, and what is a RISC processor?
With my phone overclocked I get 20-25 fps; it doesn't reach 60 or even 30.
Although after overclocking, my RAM score in the AnTuTu benchmark is 1260, nearly the same as the Galaxy S2, yet the Galaxy S2 gets 60 fps in PPSSPP!
Here is the video: http://www.youtube.com/watch?v=gYx6pi24wdw
I don't think it is a RAM problem.

mktns said:
I don't understand.
What is MIPS, and what is a RISC processor?
With my phone overclocked I get 20-25 fps; it doesn't reach 60 or even 30.
Although after overclocking, my RAM score in the AnTuTu benchmark is 1260, nearly the same as the Galaxy S2, yet the Galaxy S2 gets 60 fps in PPSSPP!
Here is the video: http://www.youtube.com/watch?v=gYx6pi24wdw
I don't think it is a RAM problem.
Google it.
You don't know how PlayStation systems work.
The hardware there is a little bit different.

MIPS is the ISA of that specific processor, and RISC = Reduced Instruction Set Computing, basically a simpler processor design built around a small set of registers and load/store instructions. Your phone has a RISC processor too: ARM (Advanced RISC Machine).
The reason I said MIPS is ugly is that it has a few oddball registers which break the floating-point expectations of other processors (the Phenom II is internally a RISC-like engine, but still), so emulators have to do significantly more work just to get it right. Also, RAM is still a problem. Sony has a rather strange habit of using surprisingly little RAM: they used a modified version of compcache, basically compressing game data into small slices in RAM.
And why is the RAM a problem? You guessed it: a MIPS virtual machine eats a lot of memory; the PS2 emulator eats 400 MB just to run a game at a decent frame rate on my PC. And granted, the PSP is basically a portable version of the PS2. The Galaxy S II has 1 GB of RAM; I know because I have a Captivate Glide, basically an S II with a built-in keyboard (it needs a new screen).

Dr. Mario said:
MIPS is the ISA of that specific processor, and RISC = Reduced Instruction Set Computing, basically a simpler processor design built around a small set of registers and load/store instructions. Your phone has a RISC processor too: ARM (Advanced RISC Machine).
The reason I said MIPS is ugly is that it has a few oddball registers which break the floating-point expectations of other processors (the Phenom II is internally a RISC-like engine, but still), so emulators have to do significantly more work just to get it right. Also, RAM is still a problem. Sony has a rather strange habit of using surprisingly little RAM: they used a modified version of compcache, basically compressing game data into small slices in RAM.
And why is the RAM a problem? You guessed it: a MIPS virtual machine eats a lot of memory; the PS2 emulator eats 400 MB just to run a game at a decent frame rate on my PC. And granted, the PSP is basically a portable version of the PS2. The Galaxy S II has 1 GB of RAM; I know because I have a Captivate Glide, basically an S II with a built-in keyboard (it needs a new screen).
In a few words, game developers build their games to make the best use of the PlayStation hardware, shaping the whole way the game runs so it will run perfectly on PlayStation systems. That leads to incompatibility with our normal hardware, and the actual emulation requires a lot of RAM. Remember, you are emulating the whole PlayStation system; you don't have the hardware parts that run the game the way it runs on a real PlayStation.
I think I am right; if not, Dr. Mario, please correct me.

Yes, that's correct. Not to mention that the PS2 and PSP both have in-order processors, basically marching along with the time-slice kernel in the BIOS and the firmware, unless replaced by what's on the disc. Time-keeping is tricky on out-of-order processors; thankfully they have special counter registers inside them, which the emulator uses.
EDIT: The use of a strange but rather special RTOS complicates matters in the emulated virtual machine a bit. What if you time the entire symphony of processor threads wrong? A few strange things can happen: 1. it simply does nothing (the most likely); 2. you can't play games at all; 3. you get weird graphics glitches (ditto for #1); 4. you fry your TV or the PS2 (a lot less likely) or your phone (possible; if it freezes, better turn it off quickly). To get the correct frame rate, you have to time the vector threads against the out-of-order processor's operating frequency, divided down to the real clock of the MIPS VM: 3 GHz / 10 = 300 MHz for a Phenom II, or 1 GHz / 3 = 333 MHz for an ARM Cortex-A9, and keep it in order. (The vector FPU seems to be the special case; why didn't Sony think of that? The VUs can execute out of order, they just have to crunch numbers independently of the CPU threads.)
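The clock-division idea described above can be sketched roughly like this. This is a minimal illustration of the general technique (fixed cycle budgets per video frame, slept off against real time), not how PPSSPP or any real emulator actually schedules work; all names here are made up.

```python
import time

# Run the guest (PSP) CPU in fixed cycle budgets, dividing its 333 MHz
# clock into one-frame slices so the emulated clock tracks real time.
GUEST_HZ = 333_000_000                          # emulated MIPS core clock
FRAMES_PER_SEC = 60
CYCLES_PER_FRAME = GUEST_HZ // FRAMES_PER_SEC   # 5,550,000 guest cycles

def run_guest_cycles(n):
    """Placeholder for the interpreter/JIT executing n guest cycles."""
    pass

def frame_loop(frames):
    slice_s = 1 / FRAMES_PER_SEC
    for _ in range(frames):
        start = time.perf_counter()
        run_guest_cycles(CYCLES_PER_FRAME)
        # Sleep off whatever host time remains in the slice so the
        # guest never runs faster than real time.
        remaining = slice_s - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)

frame_loop(frames=2)
```

The point of the sketch: the host's raw clock matters less than whether it can finish each frame's cycle budget inside the real-time slice; if it can't, the emulator must skip frames or run slow, which is exactly what the OP is seeing.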

Thanks anyway, guys.
Although I don't understand most of Dr. Mario's words;
it seems you have to be a programmer to understand these things.

Yeah, and a hardware designer (electrical engineer, to be precise) too, so I have a good idea of how it all works.

Related

Tegra or Snapdragon

Hi everybody, I just have some questions.
I plan to replace my HTC Hermes next year, but I don't know which platform will be best:
Snapdragon or Tegra.
Tegra seems to have 8 cores of execution for great graphics, but not a high frequency (600-800 MHz). Snapdragon has the GHz and is supposed to reach 1.3 GHz in 2010. There is also a dual-core 2x1.5 GHz Snapdragon supposed to be available this year, but will it be for smartphones?
These are my questions, because a PDA is a lot of money for me and I want to choose the right device.
Thanks
Well, Snapdragon is a multi-core SoC just like Tegra, but what Nvidia is so proud of is the power islands. It means they can shut off unneeded modules (e.g., turn off everything except the modem when in standby). Tegra uses an ARM11 CPU, whereas Snapdragon is based on an improved Cortex-A8 and is clocked at 1 GHz, so Tegra can't win that one. The GPU is better on Tegra, and video performance is probably better too, but when it comes to brute force, Snapdragon wins hands down.
I think that is all you need to know about Tegra and Snapdragon. As for that 2x1.5 GHz Snapdragon, it is designed to be used in smartbooks. It would be overkill for a smartphone, at least for now.
Thanks, that's all I wanted to know.
Also, a MHz is not just a MHz:
first of all, a Qualcomm MHz can mean more or less of a performance boost than an OMAP MHz,
not to mention it doesn't really matter if the CPU is super fast when the RAM, storage, and other I/O of a device can't keep up.
joplayer said:
Tegra seems to have 8 cores of execution for great graphics, but not a high frequency (600-800 MHz). Snapdragon has the GHz and is supposed to reach 1.3 GHz in 2010.
Tegra, just like the Snapdragon, is a SoC. If we use the same logic Nvidia used, then the Snapdragon is also a multi-core SoC (CPU, GPU, DSP, ...). But it's just marketing, to make people think they are getting an 8-CPU system.
As Wishmaster89 pointed out, there is a major difference between the CPUs used in both systems.
The 600 MHz ARM11 (ARMv6) in the Tegra is capable of executing about a third of what the Snapdragon's 1 GHz ARMv7 CPU can do.
The GPU, on the other hand, is more powerful in the Tegra. There is a little list being used to compare the overall (theoretical) strengths of each platform's GPU:
Nintendo DS: 120,000 triangles/s, 30 M pixels/s
PowerVR MBX-Lite (iPhone 3G): 1 M triangles/s, 100 M pixels/s
Samsung S3C6410 (Omnia II): 4 M triangles/s, 125.6 M pixels/s
ATI Imageon (Qualcomm MSM72xx): 4 M triangles/s, 133 M pixels/s
PowerVR SGX 530 (Palm Pre): 14 M triangles/s, ___ M pixels/s
ATI Imageon Z430 (Toshiba TG01): 22 M triangles/s, 133 M pixels/s
PowerVR SGX 535 (iPhone 3GS): 28 M triangles/s, 400 M pixels/s
Sony PSP: 33 M triangles/s, 664 M pixels/s
PowerVR SGX 540 (TI OMAP4): 35 M triangles/s, 1000 M pixels/s
Nvidia Tegra APX2500 (Zune HD): 40 M triangles/s, 600 M pixels/s
ATI Imageon _ (Qualcomm QSD8672): 80 M triangles/s, >500 M pixels/s
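For what it's worth, the "about twice as powerful" comparison below reduces to simple arithmetic over that list. A quick sketch (the figures are the forum's theoretical peaks, not verified benchmarks, and only a few entries are reproduced here):

```python
# A few entries from the triangle-rate list above, as data, to make the
# Tegra-vs-Snapdragon GPU comparison explicit. Millions of triangles/s.
triangles_m = {
    "ATI Imageon Z430 (Snapdragon)": 22,
    "Sony PSP": 33,
    "PowerVR SGX 540 (TI OMAP4)": 35,
    "Nvidia Tegra APX2500 (Zune HD)": 40,
}

tegra = triangles_m["Nvidia Tegra APX2500 (Zune HD)"]
z430 = triangles_m["ATI Imageon Z430 (Snapdragon)"]
print(f"Tegra vs Snapdragon GPU: {tegra / z430:.1f}x")  # 40/22, about 1.8x
```

So "twice as powerful" is really closer to 1.8x on these (theoretical) numbers, which is consistent with the hedging in the post below.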
So the Tegra's GPU is about twice as powerful as the Snapdragon's ATI Z430 (looking at triangles). The reason I use the term "theoretically" is that a lot of factors can make or break a GPU (many more than on a CPU): bad drivers, bandwidth limitations, too little memory, a bad mix of texture units, vertex units, etc.
The problem with Nvidia is that they have always had the habit of exaggerating things (a lesson learned more than a few times in the past).
Another problem: are the GPUs actually being used on PDAs/smartphones? It's a lesson I learned from the x50v, with its own dedicated, powerful (for its time) 2700G (800,000 triangles back then). The reality is that most applications rely mostly on the CPU.
At best, if you have dedicated games written for the PDA/smartphone market, very few will tap into all the power the Tegra has to offer.
Even the PSX emulators (which run great, at full speed, 50/60 fps for PAL/NTSC games) do so on the Snapdragon. Forget about running a lot of PSX games on an ARM11 without tweaking (and frame skipping), because emulation relies mostly on brute-force CPU power (and this is where the Snapdragon shines).
So, what is there besides games? Video playback? Sure. The Tegra can supposedly do 1080p, while the TI OMAP and Snapdragon only do 720p. But from what I have read, it's mostly the DSP that does the work. The Snapdragon's DSP runs at 600 MHz; I can't find any information about the Tegra's DSP. Does it even have one? Anybody with more info on how they handle things?
When it comes down to PDAs/smartphones, take it from me: the most important thing is first the CPU, then the amount of memory (and memory speed), then the GPU.
Let's just say I'd like to see a fair comparison between both systems, to see their real power (and not some fake Nvidia PR that a lot of people still fall for).
Like I said, I don't exactly trust Nvidia's numbers when their PR posts crap like this:
(attached image: Nvidia's performance-comparison chart)
Those numbers are what you could call a pure lie. People from the OpenPandora project (which uses a TI OMAP3630 @ 600 MHz, with a slower GPU) are able to run Quake 3 at 35+ fps, yet Nvidia claims 5 fps for the Snapdragon, which is actually more powerful than the OMAP3630. I love those little [*] next to the text, with the small print below: "* NVIDIA estimates". In other words, how much trust can anybody place in the specs of a company that pulls stunts like that?
Also, Snapdragon is used in the following smartphones that I know of: Toshiba TG01, Asus F1 (S200), HTC HD2 (Leo), and a few more that are on the way. Where is the Tegra? The MS Zune. That's it.
You'd think HTC, Toshiba, and Asus all looked at the different available SoC providers (TI, Qualcomm, Samsung, Nvidia, etc.). Yet who do they pick for their new top-of-the-line products?
I hope this helps.
OP, there isn't much to add after all that expert info, but I can make it easy for you: SD = raw power, Tegra = fancy graphics. I prefer power, because of the better overall performance.
As I see it, the Tegra chip has 2 600 MHz cores plus 6 other cores to handle video, audio, etc.
So a 1 GHz Snapdragon would have to split its MHz to deal with any audio, video, etc., while the Tegra chip has separate cores dealing with that stuff, leaving its 2 600 MHz cores free.
That would make Tegra a lot faster than Snapdragon.
One thing that would be interesting is battery life
in various situations
(excluding the Atom, as it's not really a phone CPU).
One thing of note is that every Snapdragon phone, although it seems fast, still has the standard WM lag at times (probably more WM than the CPU),
while the Zune HD looks super smooth and very fast.
We will have to wait for the first Tegra WM phone to see if it has the WM lag, as it's hard to tell by comparing an MP3/4 player (which has an OS probably built from the ground up for the chip) to a phone.
Ganondolf said:
As I see it, the Tegra chip has 2 600 MHz cores plus 6 other cores to handle video, audio, etc.
So a 1 GHz Snapdragon would have to split its MHz to deal with any audio, video, etc., while the Tegra chip has separate cores dealing with that stuff, leaving its 2 600 MHz cores free.
That would make Tegra a lot faster than Snapdragon.
You're completely wrong! As I said, both are multi-core SoCs. Both Snapdragon and Tegra have separate cores for video and audio! The only difference is that Tegra can shut off unneeded modules where Snapdragon can't. Besides, they know their CPU is slow, so they had to give people something to make them forget about the CPU, and they decided that talking about 8 cores on something as small as their SoC would be a good choice.
As I said before, the raw CPU power of the Snapdragon is at least 3x greater than the Tegra's, and the Zune HD is smoother because all the work is done on the GPU (besides, the whole Zune OS 4.0 was probably designed around Tegra, so don't expect it to lag), whereas WM is CPU-driven only. Wait for the HTC Leo to see an almost lag-free device (show me a device that never lags).
For the last time: for now, Tegra has a slow CPU, while Snapdragon has a beast of a CPU. Things should change with Tegra 2 and Snapdragon 2.
Ganondolf said:
As I see it, the Tegra chip has 2 600 MHz cores plus 6 other cores to handle video, audio, etc.
So a 1 GHz Snapdragon would have to split its MHz to deal with any audio, video, etc., while the Tegra chip has separate cores dealing with that stuff, leaving its 2 600 MHz cores free.
That would make Tegra a lot faster than Snapdragon.
Ouch. So much misinformation... I may not be an expert, but you just claimed that the Snapdragon needs to split its MHz to do... video? Did you even read the Snapdragon's specs? Dedicated... GPU. GPU = video!
Another wrong point: both cores are not at 600 MHz. One core is at 600 MHz and one is at 400 MHz. The 600 MHz core is an ARM11 core, and the 400 MHz one is an ARM7 core (not to be confused with the ARMv7, aka Cortex-A8).
The basic idea is that when the phone is in standby, the 400 MHz ARM7 core does the basic staying-alive stuff, while the 600 MHz ARM11 core is only used for the big stuff. The basic idea is good.
But the Snapdragon's 1 GHz ARMv7 CPU is able to downscale and reduce its power footprint too. Which solution is the better one? We will need to see.
To put things in perspective:
Tegra:
* ARM 11
* ARM 7
* GPU
* 2D Engine
* HD Video Encoder
* HD Video Decoder
* Audio
* Imaging
Snapdragon:
* ARM v7 ( Cortex A8 )
* GPU
* DSP
* HD Video Decoder
* ...
Now, you will say: hey, look at all those extra cores the Tegra has, it must be a powerhouse. No. It does not work like that.
The Snapdragon's 600 MHz DSP has several capabilities, including dedicated image processing. The question is, how fast is the image processor in the Tegra? If it's a separate core, it has its own frequency. That alone makes a big difference, because the slower that core, the longer it takes to do the job (and the more power it drains).
The 600 MHz Tegra we are comparing here has only 720p output capability, just like the Snapdragon. As far as I can tell, the Tegra 600 is what's used in the Zune; something tells me the Tegra 650 is more for notebooks.
HD encoding / decoding: by any definition, that is part of the GPU, just like the ATI Z430 has its own dedicated HD capabilities. And any GPU these days has the ability to disable parts of itself to save power, so we can assume the same capability is in the mobile variant. The Z430 is based on the GPU found in the Xbox 360; it has its own HD, audio, media, etc. processing capabilities (or, if you like to use Nvidia's terms: HD, audio, and media "cores").
So, from a technical point of view, the Snapdragon also has 8 cores. Hell, we can trump that, because the DSP is capable of more than just image processing; how many extra "cores" can be gained from that?
To be honest, there is so much misinformation that people jump on, it's actually kind of incredible (and frightening). Though I have to admit, looking at the Google results, Nvidia did a good job of spreading the FUBAR information; most sites took the information over without questioning it one little bit.
Lag?
And Ganondolf, regarding the lag you report: to be honest, I have shown several movies to a friend on WM6.5 + TouchFLO backported to older HTC devices (devices with the same slow CPUs as the Tegra uses). Guess what: beyond a bit of lag in the image viewer, they had no lag.
Take a look at the videos of the HTC HD2 (Snapdragon) and find the lag there, please.
I have seen a few people like you on other forums, going around all high and mighty about the Tegra. At first I was impressed by its general specs, until you start to look deeper and discover that the CPU is slow as hell (and the second one is even worse) compared to the Snapdragon / Cortex-A8 / ARMv7 design; that the "extra" cores are just functionality provided by the GPU; and that its 1080p claim does not apply to the version shipping now.
In fact, Snapdragon also has 1080p capability: see the QSD8672. But you will not find that in smartphones just yet, just like the Tegra 650 with its 1080p. Has anybody even seen a Tegra 650 on the market? I don't think so (for good reason). Looks like another paper launch from Nvidia.
Simply put:
As of July, 2009 or Oct 2009 for that matter:
Snapdragon mobile phones = shipping.
Tegra mobile phones = vapourware. (not even any firm rumours)
Benjiro said:
Lag?
And Ganondolf, regarding the lag you report: to be honest, I have shown several movies to a friend on WM6.5 + TouchFLO backported to older HTC devices (devices with the same slow CPUs as the Tegra uses). Guess what: beyond a bit of lag in the image viewer, they had no lag.
Take a look at the videos of the HTC HD2 (Snapdragon) and find the lag there, please.
The lag I was talking about was on the Toshiba TG01, which I have played with. There is no point saying "look at videos of the HTC HD2"; I saw videos of the TG01 that made it look lag-free too. Until the HD2 comes out and I have a play with it, we won't be able to tell whether it's lag-free or not. As far as I can see, you are making your argument about lag on a phone that has not been released, which I think is a rubbish argument; someone could just as well say a Tegra phone could teleport you across the world (there is no proof).
Also, I'm not on the Tegra bandwagon; I like Snapdragon just as much. I was going by what I had heard on the net. Maybe, like you said, the information has been spun to make the Tegra chip look super powerful compared to all the other phone CPUs, which is not true, but until I see a phone with a Tegra chip in it, how would we know?
agitprop said:
Simply put:
As of July, 2009 or Oct 2009 for that matter:
Snapdragon mobile phones = shipping.
Tegra mobile phones = vapourware. (not even any firm rumours)
By far the most important point.
Far more important than the MHz number, which may or may not indicate greater or lesser performance or battery life than a competitor with an entirely different architecture.
There is one piece of info I haven't been able to find: which of the two has better battery efficiency?
Anyone?
Tegra is right on the ball.
Yes, the ARM11 CPU is theoretically 1/3 the speed of the Cortex, but don't forget there's an ARM7 offloading network traffic, 2D acceleration separate from the CPU and GPU, dedicated HD encoding hardware (decoding is common to both), and sound acceleration. Many of the processing bottlenecks in a mobile device are successfully offloaded in the Tegra, ultimately giving the ARM11 fewer tasks to cope with in the first place, and no need for thread balancing, which, fingers crossed, leads to more stable OS performance. Another thing to note is that Nvidia's official specs say ARM11 MPCore, which means various Tegra chips could have anywhere from 1 to 4 ARM11 cores (the Tegra chipset used in the Microsoft Zune player was a dual-core ARM11).
The main point, though, I think is power. You don't need a massive CPU in a mobile device; what you need is battery life, and although we haven't seen final figures, the Tegra is looking far more impressive than anything else on the market. If my iPhone 3GS is anything to go by, even 2x the battery life would be welcome; this thing dies in no time at all, be it browsing the web or playing video or music, and reviews show Snapdragon phones to be even worse. The Nvidia battery specs in earlier posts are mostly accurate but based on a netbook battery. The Zune HD running the Tegra gets 33 hours of audio and 8.5 hours of video from only a 660 mAh battery; that's half the size of the battery in the iPhone 3GS and HTC HD2, for example.
The Tegra GPU is a powerful design and should allow GPGPU acceleration of the only major computationally intensive task phones are likely to do in the future: image processing for augmented reality.
They've provided on-chip support for most modern input/output devices.
Nvidia has covered all the bases; I'm seriously looking forward to Tegra phones.
Yes, but as I've learned (the hard way) from my Touch Pro, all the features in the world mean nothing if they're not used. The Touch Pro was supposed to have video acceleration and double the speed of my old TyTN. Where are those? Nowhere. Why? Some say there aren't any drivers for the GPU; others say that the TP's processor may be 500 MHz, but its design is worse than the one in my older TyTN.
I don't care. As a customer, user, and buyer, I know my older phone was faster than my new one. If in the near future we have a 1 GHz Snapdragon phone that does everything on its CPU and a Tegra phone that balances CPU-GPU-physics-whatever across different parts of its design, history says the Snapdragon will be the better choice. You see, WM Solitaire, Word Mobile, RSS readers, Twitter clients, and all existing software, at least for WM, is written to run on a single processor. I've yet to see a good program/game that actually takes advantage of any device's GPU, and that won't happen while the market is split, since a developer would need to target a specific device (meaning less profit) or simply forgo any acceleration and create something "that runs anywhere". We can thank Microsoft for going the Linux way and letting device makers do whatever they want, however they want, without a standard way of using different hardware parts (like, say, DirectX on Windows).
Very interesting information.
Battery life is really important; at the moment that's the only advantage of the Tegra over the Snapdragon.
I am really keen to know whether Manila also runs fast with the lower CPU power of the Tegra chip compared to the Leo.
There must be some driver or software problem, I would say, because there's no PDA out with the Tegra.
Also no announcement... otherwise it could be a strategy from HTC, so that they don't create a problem selling the Leo and the upcoming Android device.
So we must wait...
I think you guys should see PGR on the Zune HD.
Stunning graphics.
For me, processor speed comes second to functionality. I have recently started to use remote desktop on my HD, but wish it had a TV-out like my Touch Pro.
I was thinking about upgrading to a Leo, but that has no TV-out either.
Discussing advanced graphics on a Snapdragon is not very helpful if you are restricted to 4 inches.
Hopefully HTC will put HDMI, or at least video out, on all future devices. The resolution of the devices is up to it, so why not?

Playstation2 for Android (samsung galaxy)

So, with the new Samsung Galaxy on its way (waiting for some carriers to get a move on), there seems (to me) to be a possibility of getting a PS2 emulator running quite well on the new specs:
1 GHz Hummingbird (Cortex-A8)
PowerVR SGX540
---{"Samsung Galaxy S' 'Hummingbird' A8 chip will be able to process around 90 million triangles per second. That is compared to the Moto DROID's 7 mill tri/sec, the Nexus One's 22 million tri/sec, and the iPhone 3G S' 28 million tri/sec."}---
---{"In other words, the Samsung Galaxy S will have around 36% the video processing power of a PS3. Hopefully it doesn't get as hot as a PS3."}---
With this in mind, I would think it is quite possible to run a PS2 emulator on the new Samsung Galaxy S. Not to mention the rumored 1.5 GHz dual-core Snapdragon coming to T-Mobile either this Christmas season or early next year.
One thing to remember is that although a PC with, say, a 3 GHz dual core and 4 GB of RAM runs a PS2 emulator like crap, the architecture of a PC's processor and graphics is different from that of consoles, which is why it takes so much to get smooth play out of it. Cell phones share a very similar structure (to my knowledge at least) with consoles. This tells me that newer Android phones should be quite capable of running a PS2 emu.
If you head over to the GLBenchmark website (.com) and look up the results database, you will see the Galaxy S at the top (minus a "comal naz-10", whatever that is), and if you compare the Galaxy S results with the Droid, Droid X, Droid 2, and iPhone 4, you will see that it beats each phone by a huge margin. I am not sure of the PlayStation 2's specs, but I am more than sure the phone should be able to handle it!
PlayStation 2 specs can be found on Wikipedia (I will not copy and paste all that info).
To me it seems highly possible, and I would love to play my racing games on the phone (Tokyo Xtreme Racer Drift 2, TXRD2).
Thoughts and opinions welcome; no bashing (I get that in other forums).
Even a 3 GHz i7 isn't able to emulate a PS2 at full speed (depending on the emulated game; sure, there are many playable games. I know this because I'm interested in emulation and have tested many games; search YouTube for "frankyfife"). There is a lot of code for the emulator to translate in order to produce native code for the platform it runs on. The PS2 has vector units, the Emotion Engine, the SPU, and the GS, all of which need to be emulated. There's no way to do this on a 1 GHz cell phone, even with similar specs or an identical main CPU architecture.
I really hate to be a nerf herder, but if a 1 GHz Snapdragon Droid can play PlayStation 1 games, then the Galaxy S, with a 1 GHz Hummingbird and a graphics chip that is way more powerful than the Droid's, should be able to handle it fine.
Take, for example, facts that lead to a hypothesis of power:
Motorola Droid: TI OMAP3430 with PowerVR SGX530 = 7-14 million(?) triangles/sec
Samsung Galaxy S: S5PC110 with PowerVR SGX540 = 90 million triangles/sec
These results are based on SOME facts with SOME uncertainties, which leads to a hypothesis. If this is INDEED the case, the Galaxy S is ALMOST 7 times more powerful than the Droid (about 6.4x when 90 is divided by 14). And you're saying it can't handle it without trying? I've seen YouTube videos of phones playing PlayStation games smoothly with little jitter and medium-quality sound. Take into account the faster processor and GPU in the Galaxy S, and you use fewer resources to play the game, leaving more for sound processing, which in turn should make PS1 games run perfectly (theoretically), and possibly PS2, or at least DECENT PS2.
EDIT: Not to mention the PS3 runs at 250 million triangles/sec, which makes the Galaxy S about 36% of a PS3!
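The ratios quoted above are easy to double-check. A quick sketch, using the forum's figures (theoretical peaks only; as the replies point out, they say nothing about whether emulation is actually feasible):

```python
# Back-of-envelope check of the ratios quoted above, in millions of
# triangles per second. All figures are the forum's, not verified.
droid_mtri = 14        # Motorola Droid, upper estimate from the post
galaxy_s_mtri = 90     # Samsung Galaxy S (Hummingbird / SGX540)
ps3_mtri = 250         # PS3 figure quoted in the post

droid_ratio = galaxy_s_mtri / droid_mtri   # about 6.4x the Droid
ps3_share = galaxy_s_mtri / ps3_mtri       # 0.36, i.e. 36% of a PS3

print(f"Galaxy S vs Droid: {droid_ratio:.1f}x")
print(f"Galaxy S vs PS3:   {ps3_share:.0%}")
```

Note the PS3 share works out to 36%, matching the quote earlier in the post.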
No, just no. It can't be done on cell phones, as @xdaywalkerx said. I have been able to play Guilty Gear and some visual novels on a PS2 emulator on my i5 @ 4.00 GHz with 4 GB of DDR3 RAM. Unless you find a way to efficiently emulate all the hardware in the PS2, it is impossible.
Quintasan said:
No, just no. It can't be done on cell phones, as @xdaywalkerx said. I have been able to play Guilty Gear and some visual novels on a PS2 emulator on my i5 @ 4.00 GHz with 4 GB of DDR3 RAM. Unless you find a way to efficiently emulate all the hardware in the PS2, it is impossible.
PC processors and GPUs work completely differently than consoles; that's why it takes so much power to even try to squeeze out performance. Phones have the same, if not extremely similar, processors and GPUs (at least in how they are made and how they work).
Running an emulator on a phone is different than on a PC. If the Droid can run Final Fantasy and other PlayStation 1 games, then what is the Galaxy going to be able to do with over 6x more graphics processing power?
Keep on dreaming.
Just stop, it is impossible. It doesn't matter if the architecture is similar; you're still emulating, which takes way more resources than the native machine requires.
Sent from my PC36100 using XDA App
namcost said:
PC processors and GPUs work completely differently than consoles; that's why it takes so much power to even try to squeeze out performance. Phones have the same, if not extremely similar, processors and GPUs (at least in how they are made and how they work).
Running an emulator on a phone is different than on a PC. If the Droid can run Final Fantasy and other PlayStation 1 games, then what is the Galaxy going to be able to do with over 6x more graphics processing power?
You can't just take theoretical numbers like that and simply assume things. Just because the Hummingbird can crunch out (throwing a random number here) 15 million polygons/second natively, it doesn't mean it can crunch out 15 million polygons/second while emulating a PS2 title.
As xdaywalkerx said, the Emotion Engine is much more difficult to emulate than the PlayStation 1's MIPS R3051. PS2 emulation is not even done well on Windows computers, not necessarily because of a lack of CPU/GPU power, but because of the difficulty of emulating the titles correctly.
Hell, the Droid can't even run every single PS1 title available, even when overclocked.
How about a PSP emu? Some PSP games look and feel like PS2 games.
Maybe possible with very dumbed-down graphics and super-low resolution... but then would it look like PS2? Probably not.
SNES StarFox and Stunt Race FX don't run full speed on my Galaxy S.
Burnout 3? Vice City? GOW? MGS2? No chance.
But a Sega Saturn emulator...well...
I've seen the Captivate run Crash Bandicoot 3 on a PSX emu @ full speed with no problems, just a lack of control since it's a touch screen and the game requires quick reactions...
It's simply not possible.
I'd say... it won't work. The processor wouldn't even run it...
The GPU would fail.
However,
A PSP emulator could potentially work.
The facts
You see, a standard PSP (not the PSP Go) is automatically overclocked to 333MHz for SOME games... This 333MHz is the maximum; most games run at a lower clock. To emulate something you need roughly 4 times the processing power, and for graphics you also need a decent GPU.
So processing-wise, a PSP emulator for phones is actually very possible, and the graphics could possibly be pulled off.
But this would only work on high-end phones with a decent enough screen size, e.g. the Streak or Droid (X), to name a few.
Edit:
Did some research.
The pixel fill rate of the PSP's GPU is 664 megapixels per second; on a high-end phone the GPU manages around 133 to 250 megapixels per second. The PSP does 33 million triangles a second, whereas we'd get possibly 7 to 22 million triangles per second. This shows that emulating a PSP at full speed would be impossible. You COULD emulate it, it just would never run full speed.
So if a PSP won't run perfectly, I'm afraid a PS2 emulator won't either.
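A quick back-of-envelope calculation makes the gap concrete. The figures below are the ones quoted in this post (rough era-specific estimates, not authoritative specs), and Python is used purely for illustration:

```python
# Figures quoted in the post above; treat them as rough estimates.
psp_fill_rate_mpix = 664      # PSP GPU pixel fill rate, megapixels/s
phone_fill_rate_mpix = 250    # optimistic high-end phone of the era
psp_triangles_m = 33          # PSP, millions of triangles/s
phone_triangles_m = 22        # optimistic phone figure

# Even before any emulation overhead, the host hardware falls short of
# the guest's raw throughput by these factors:
fill_gap = psp_fill_rate_mpix / phone_fill_rate_mpix
tri_gap = psp_triangles_m / phone_triangles_m

print(f"fill-rate gap: {fill_gap:.2f}x")  # about 2.66x short
print(f"triangle gap:  {tri_gap:.2f}x")   # 1.50x short
```

Since an emulator also pays translation overhead on top of this, the numbers support the post's conclusion that full-speed PSP emulation was out of reach on that generation of phones.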
Synyster_Zeikku said:
Pixel Fill Rate of the PSP's GPU is 664 Megapixels per second, on a high end phone the GPU is around 133 to 250 Megapixels per second. The PSP does 33 Million triangles a second.. Whereas, we'll get possibly 7 to 22 million triangles per second. This shows that even a emulating a PSP entirely would be impossible... However you COULD emulate it. It just never would be full speed..
So if a PSP, won't run perfectly, I'm afraid a PS2 emulator won't.
Samsung Galaxy S is rumored to be super powerful compared to the measly droid.
It is also rumored to have 90 million triangles per second.
http://www.androidpolice.com/2010/07/03/samsung-galaxy-s-is-a-beast-runs-quake-3-perfectly/
I hate to be an ass, but from what I've seen around the web the PS3's RSX chip does 250 million triangles per second, and the PSP is nowhere near that. I've also seen a comparison (though I don't really believe it) claiming the 360 does 500 million triangles per second.
"66 million vertices / triangles per second calculated by the Emotion
Engine, and 75 million triangles per second can be drawn by the
Graphics Synthesizer (obviously the EE can only feed 66M per second to
the GS, thus as a result the EE can never overload the GS "
"PSP can *calculate* 33 or 35 million vertices / triangles per second
at the full 333 MHz clock frequency, which currently restricted to 222
MHz, so that cuts vertex / triangle rate down by 1/3. so, this
33~35 million per sec is currently at about 22-23 million per sec. at
222 MHz. Remember, this is the amount that can be transformed /
calculated, so you can think of this PSP triangle/sec number as you
would the 66M per sec that Emotion Engine in PS2 does. "
http://www.tomshardware.com/forum/33327-13-versus-triangles-second
I still think it's possible with newer phones, especially if the dual-core 1.5GHz Snapdragon comes out @ Christmas like it's rumored...
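The clock-scaling arithmetic in the quote above checks out if you assume triangle throughput scales linearly with clock. A sketch using only the quoted figures (the linear-scaling assumption is the quote's, not a measured fact):

```python
# Quoted figures: ~33M triangles/s at the full 333 MHz clock,
# with the clock currently restricted to 222 MHz.
full_clock_mhz = 333
restricted_clock_mhz = 222
triangles_full_m = 33.0

# Linear-scaling assumption: throughput is proportional to clock.
triangles_restricted_m = triangles_full_m * restricted_clock_mhz / full_clock_mhz
print(f"{triangles_restricted_m:.0f} million triangles/s at 222 MHz")  # -> 22
```

That matches the "22-23 million per sec" figure in the quote: dropping the clock by one third cuts the triangle rate by the same third.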
You're confusing two entirely different things.
Yes, high-end Android phones are able to run games that are similar in graphics to the PSP/PS2.
But emulation? Impossible. To emulate a system, you generally need to be at least 3 times as powerful, and that's probably way too little.
If it was this easy, you'd think the people that made the PS2 themselves would be able to emulate it on the PS3.
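Applying that rule of thumb to the PSP numbers from earlier in the thread (the 3x multiplier is this post's rough heuristic, not a hard law; real emulation overhead varies enormously with architecture and emulator design):

```python
guest_clock_mhz = 333   # PSP CPU at its maximum clock
overhead_factor = 3     # the post's rule-of-thumb minimum

needed_mhz = guest_clock_mhz * overhead_factor
print(f"~{needed_mhz} MHz of comparable single-core performance")  # ~999 MHz

# Note: emulating the guest CPU core is largely a single-threaded job,
# so a "1 GHz dual-core" phone only has ~1000 MHz usable for this part,
# which is exactly borderline by this estimate.
```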
Lesiroth said:
You're confusing two entirely different things.
Yes, high-end Android phones are able to run games that are similar in graphics to the PSP/PS2.
But emulation? Impossible. To emulate a system, you generally need to be at least 3 times as powerful, and that's probably way too little.
If it was this easy, you'd think the people that made the PS2 themselves would be able to emulate it on the PS3.
They did emulate it on the PS3; they took it out of the newer models for god knows what reason. I have the original PS3 from launch and it plays all my PS2 games without a hiccup...
And where do you get this 3x more powerful? If that's the case, my dual-core AMD 3.0GHz with 4 gigs of RAM and a 5770 should run PS2 games just fine, and it doesn't; it's laggy.
Emulation on a PC is massively different than emulating on a phone. Phones share more architecture with consoles than actual PCs do, hence why phones are only now hitting the 1GHz and 1.5GHz level. There are already videos of the Galaxy S running Crash Bandicoot 3 with the Droid emulator set to 60fps max and it runs perfectly, and I mean PERFECTLY (except for the lack of controls). The Galaxy S also runs Quake 3 Arena perfectly (minus the lack of controls, but I think that one can be solved with a simple Bluetooth mouse and keyboard?).
It's possible, people just like to write it off... w/e, I'm done with this website, too many haters with no facts.
namcost said:
They did emulate it on the PS3, they took it out on the newer models for god knows what reason. I have the original PS3 from launch and it plays all my PS2 games without a hickup.....
And where do you get this 3x more powerful, if that's the case, my dual core amd 3.0ghz with 4 gig of ram and a 5770 should run ps2 games just fine and it dont, its laggy.
Emulation on a PC is massively different then emulating on a phone. The phones shares more architecture with consoles then actual PC's do, hence why phones are just now hitting the 1ghz and 1.5ghz level. There are already videos of the galaxy s running crash bandicoot 3 with the droid emulator set to 60fps max and it runs perfectly, and I mean PERFECTLY. (except lack of controls). The Galaxy S also runs quake 3 arena perfectly (minus lack of controls, but that one i think can be solved with a simple bluetooth mouse and keyboard?).
Its possible, people just like to write it off..... w/e, I'm done with this website, too many haters with no facts.
Well, the emulation process is the same on all architectures: creating a virtual machine and "translating" the guest's code into something the device's architecture understands. Of course it's not that simple, but I hope you get the idea. Even if somebody wrote a PS2 emulator, I doubt it would get over 5 fps.
Quake 3 runs smoothly because it's running natively (the engine was ported to ARM, and the GPU supports OpenGL, which Quake uses). Maybe PSX runs great on the Galaxy S, but even my very old PC with a Pentium III 400MHz and a GeForce 2 MX could run it at full speed.
Oh, and your PS3 runs PS2 games smoothly because the first consoles had the PS2's chips inside. They removed them later.
How about you get your facts straight first?
It was on the first batch of PS3s because Sony put some of the PS2s hardware in the PS3, as they couldn't possibly launch without backwards compatibility.
They took the PS2 hardware out later to reduce costs.
Emulation on phones is not "massively different" than on PCs; our phones use ARM-architecture CPUs, while the PS2 uses MIPS processors for its Emotion Engine.
Make an emulator that works and we will buy it. Shouldn't be hard, since you seem to know a lot about it.

[INFO] Mali-400MP GPU vs Adreno 220 GPU

Mali-400 MP is a GPU (Graphics Processing Unit) developed by ARM in 2008. Mali-400 MP supports a wide range of uses, from mobile user interfaces to smartbooks, HDTV and mobile gaming. Adreno 220 is a GPU developed by Qualcomm in 2011; it is a component of the MSM8260 / MSM8660 SoC (System-on-Chip) powering the upcoming HTC EVO 3D, HTC Pyramid and the HP/Palm TouchPad tablet.
Mali™-400 MP
Mali™-400 MP is the world's first OpenGL ES 2.0 conformant multi-core GPU. It supports vector graphics through OpenVG 1.1 and 3D graphics through OpenGL ES 1.1 and 2.0, thus providing a complete graphics acceleration platform based on open standards. Mali-400 MP is scalable from 1 to 4 cores. It also implements the industry-standard AMBA® AXI interface, which makes integrating Mali-400 MP into SoC designs straightforward and provides a well-defined interface for connecting it to other bus architectures. Further, Mali-400 MP has a fully programmable architecture that offers high-performance support for both shader-based and fixed-function graphics APIs, and a single driver stack for all multi-core configurations, which simplifies application porting, system integration and maintenance. Its features include advanced tile-based deferred rendering and local buffering of intermediate pixel states (which reduce memory bandwidth overhead and power consumption), efficient hardware alpha blending of multiple layers, and Full Scene Anti-Aliasing (FSAA) using rotated-grid multisampling, which improves graphics quality and performance.
Adreno 220
In 2011 Qualcomm introduced the Adreno 220 GPU as a component of their MSM8260 / MSM8660 SoC. Adreno 220 supports console-quality 3D graphics and high-end effects such as vertex skinning, full-screen post-processing shader effects, dynamic lighting with full-screen alpha blending, real-time cloth simulation, advanced shader effects (dynamic shadows, god rays, bump mapping, reflections, etc.) and 3D animated textures. Qualcomm claims the Adreno 220 can process 88 million triangles per second, offering twice the processing power of its predecessor, the Adreno 205, and boosting performance to a level competitive with console gaming systems. It is also claimed to run games, the UI, navigation apps and the web browser on the largest display sizes at the lowest power levels.
Difference Between The Two
Difference between Mali-400MP GPU and Adreno 220 GPU
Based on research done by Qualcomm using an average of industry benchmarks (Neocore, GLBenchmark, 3DMM and NenaMark), they claim that the Adreno 220 in Qualcomm's dual-core Snapdragon MSM8660 offers twice the performance of the GPU in other leading dual-core ARM-based chips. The site AnandTech has also run several tests on the Adreno 220. One of them was GLBenchmark 2.0, which records the performance of OpenGL ES 2.0 compatible devices (such as the Mali™-400 MP) using two long suites that combine different effects: direct lighting; bump, environment and radiance mapping; soft shadows; vertex-shader-based texturing; deferred multi-pass rendering; texture noise; etc. That test showed the Adreno 220 to be 2.2 times faster than existing devices using GPUs such as the Mali-400 MP.
What do you guys think of this?... I've been with HTC since they started doing Android, and I have to say Android has come a long way, and so has the hardware...
thanks
Thanks, I've been looking for feedback regarding the Adreno 220 vs Mali 400 GPUs. Do you have any link or source to back this up?
I am looking forward to buying the Sensation, and my only concern is the Adreno 220 GPU and whether it is better, equal, or worse than the Mali 400.
If it is better then I'm definitely buying the Sensation; if not, I might consider the Galaxy S II.
Thanks
Interesting read... the SGS2 fans will debate this and say that the benchmarks were done with an overclocked processor... so as of now it's all just a good read.
I don't know if this video helps, but the GPU upscaling and running on a larger screen at higher resolution without falling behind is kind of amazing.
http://www.youtube.com/watch?v=RBBMVc9-fuk
boostedb16b said:
i don't know if this video helps but the gpu upscaling and running on a larger screen with higher resolution and not falling back is kind of amazing
http://www.youtube.com/watch?v=RBBMVc9-fuk
Nice post! I'm completely impressed. Amazing that that was being upconverted from a friggin' cell phone to a large HDTV!
I'm sold!
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
boostedb16b said:
another thing is that sony went with the adreno 220 in the xperia play so i guess it is as good as they say it is
Actually they went with the Adreno 205 in the Xperia Play, which is the same GPU as in the Desire HD.
Sent from my HTC Desire HD using XDA Premium App
boostedb16b said:
another thing is that sony went with the adreno 220 in the xperia play so i guess it is as good as they say it is
Isn't it Adreno 205 ?
http://www.gsmarena.com/sony_ericsson_xperia_play-3608.php
Adreno 220 is faster than Mali-400MP any day, and compared to Tegra 2 it's better in some cases and worse in others. I didn't get a Galaxy S2 due to the fact that the Mali-400MP is so antiquated...
According to reports on the SGS 2 forum, the Mali 400 doesn't support textures? This seems a big blow to the SGS 2's gaming capabilities to me. What would be an interesting comparison is the Adreno 220 vs the (LG Optimus 3D's) OMAP4 PowerVR SGX540 GPU, which is clocked at 300MHz instead of the 200MHz found in the Samsung Hummingbird processor...
Ian
Beaker491 said:
According to reports on the SGS 2 forum, the Mali 400 doesn't support textures? This seems a big blow on the SGS 2's gaming capabilities to me. What would be an interesting comparison would be the Adreno 220 vs the (LG Optimus 3D's) OMAP4 PowerVR SGX400 GPU, which is clocked at 300MHz instead of the 200MHz found in the samsung hummingbird processor....
Ian
The Mali 400 does support textures; it's texture compression that it doesn't support.
Elchemist said:
Mali 400 Does support textures its texture compression which it does not support.
It does support texture compression, just not the proprietary formats of other GPU vendors. Developers that code only for proprietary formats get locked in and lose out when something better comes along.
I was surely waiting for an idiot response like yours, i9100. Just face it: Samsung made a mistake with the Mali. And to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation. The rush has shown what I expected: lots of unknown problems in the phone, and then returns of phones with faulty screens and so on. So don't be a troll; state facts rather than just talking because you have the device.
The Mali 400 is just an outdated chip; why someone wouldn't include texture compression in a MOBILE GPU is beyond me. What the hell were they thinking?! The PS3 only gets away with skipping it because it has Blu-ray-sized storage media! Ridiculous.
As stated, it DOES support texture compression. There are just a few formats, and it only supports one of those. It won't prove a problem, as it seems the SGS2 is going to be extremely popular and all game devs will support it eventually (most do already).
boostedb16b said:
i surely was waiting for an idiot response like yours i9100 just face it that samsung made a mistake with the mali... and to be quite honest with you the sgs2 was rushed out to take the lead on the market b4 the sensation.. and the cpu was overclocked to be more competitive with the sensation... and the rush has shown what i expected lots of unknown problems in the phone and then returns for phones with faulty screens and so on so dont be a troll and state facts rather than just talk because you have the device
Wow, that was a slight overreaction! Do you work for HTC or something? Because that was a very defensive reply.
i9100 actually speaks some truth. There is a very informative topic over on the SGS2 forum about this, which I suggest you guys read before making any more incorrect statements.
Still, looking forward to seeing this phone released to see if it's as impressive as the S2 is. I'm hoping it will be.
boostedb16b said:
i surely was waiting for an idiot response like yours i9100 just face it that samsung made a mistake with the mali... and to be quite honest with you the sgs2 was rushed out to take the lead on the market b4 the sensation.. and the cpu was overclocked to be more competitive with the sensation... and the rush has shown what i expected lots of unknown problems in the phone and then returns for phones with faulty screens and so on so dont be a troll and state facts rather than just talk because you have the device
I wouldn't be surprised if that was the case. To Samsung, that's business as usual.
I'm biased and this is not the Samsung home turf so I'll just keep to correcting facts. A little flaming is fine and not unexpected, but I won't go there. I'm enjoying my phone, and you should enjoy yours ;-)
In OpenGL ES 2.0 there is only one standard texture compression format - ETC. It's the only one you can rely on in all conformant GPUs. Others like ATITC, PVRTC and S3TC/DXTC are proprietary formats, not suitable if you want your app to run on every device.
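To make that concrete, here is a sketch of how an app might pick a texture format at runtime from the driver's GL_EXTENSIONS string. The extension identifiers are the standard OpenGL ES ones, but the fallback order, the sample driver string and the helper function are illustrative assumptions (and written in Python for brevity rather than real GL binding code):

```python
# Preference list: vendor formats first where available, then the
# ETC1 extension that essentially every ES 2.0 era GPU advertises.
PREFERRED = [
    ("GL_IMG_texture_compression_pvrtc", "pvrtc"),    # PowerVR
    ("GL_AMD_compressed_ATC_texture", "atitc"),       # Adreno
    ("GL_EXT_texture_compression_s3tc", "dxtc"),      # Tegra / desktop
    ("GL_OES_compressed_ETC1_RGB8_texture", "etc1"),  # near-universal
]

def pick_format(gl_extensions: str) -> str:
    """Choose a texture format given the space-separated extension string."""
    supported = set(gl_extensions.split())
    for ext, fmt in PREFERRED:
        if ext in supported:
            return fmt
    return "uncompressed"  # last resort: raw RGBA textures

# Hypothetical Mali-400-like driver report: only the ETC1 extension.
mali_exts = "GL_OES_compressed_ETC1_RGB8_texture GL_OES_depth24"
print(pick_format(mali_exts))  # -> etc1
```

This is exactly the lock-in point made above: a game shipping only PVRTC or DXTC assets falls through to "uncompressed" on a Mali device, while ETC1 assets work everywhere.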
At the moment, since my HD2 has been retired, I opted to buy a Samsung Galaxy S 4G, the worst mistake I have made in a long time. I bought a Sidekick 4G for my gf, and that phone was another example of how poor Samsung's quality control is, or how they rush to get a head start in the market. The phone is riddled with problems.
Currykiev said:
Wow, that was a slight over reaction! Do you work for HTC or something because that was a very defensive reply.
i9100 actually speaks some truth. There is a very informed topic over on the SGS2 forum about this which i suggest you guys read before making any more statements which are incorrect.
Still, looking forward to seeing this phone released to see if it's as impressive as the S2 is. I'm hoping it will be.
Lol, it's not as bad as what I have read in the SGS2 forum, where people are reporting their problems and trying to get legitimate help, and are being bashed and called trolls even though their phone honestly has a problem.

GPUs

I'm planning to buy a new Android phone and my budget is 200 to 250 EUR.
The component that's bugging me a lot is the GPU. I am seeing old Adreno 200 GPUs on new phones like the Desire V.
#1 - So is it really a factor that affects the overall performance of the phone?
#2 - And which is the best? I have seen phones equipped with the Mali 400MP, Adreno 200, 205, 220 and 225, SGX 540... and those Tegra chips from the LG Optimus series. Which one is the best?
#3 - And the phone on my mind is the Desire X (to be released soon), and many pages say it comes with an Adreno 203 chip. Now what's an Adreno 203, and how's its performance?
Guys...
Sent from my GT-S5670 using xda app-developers app
yzak58 said:
I'm planning to buy a new android phone and my budget is 200 to 250 EUR.
The component thats bugging me a lot is the GPU. I am seeing old Adreno 200 GPUs on new phones like the Desire V.
#1-So is it really a factor that affects the overall performance of the phone?
#2-And which is the best?
I have seen phones equipped with Mali 400MP,Adreno 200,205,220 and 225,SGX 540...and those Tegra chips from LG Optimus Series.
Which one is the best?
#3-And the phone on my mind is Desire X(will be released soon),and many pages say that it comes with an Adreno 203 chip.Now whats Adreno 203?
And hows its performance?
GPUs are not the biggest factor, no; as long as the CPU and RAM are sufficient, overall performance will not be affected much by the GPU.
Games that are very 3D-intensive would benefit from a more powerful GPU, yes, and for some games the Tegra 3 chip allows better shading, water effects, etc.
thanks zac
The GPU has separate RAM allocated for gaming.
The better the GPU, the better the gaming performance.
That means the Mali 400 is better than the Adreno 200.
Another thing: the GPU doesn't affect overall performance, but it does affect the clarity of graphics and display quality.
So for 250 EUR,
I think the Galaxy S2 is a good choice:
Good processor
Good GPU
Good screen resolution
We should all be polite enough to press Thanks for anyone who helped us.
I think RAM comes first.
More RAM can make your phone work smoother (except in games).
thanks
ok guys :good:
rainbow9 said:
i think ram comes first.
larger ram can make your phone work smoother(except games).
Actually it's both. RAM also has a major impact on games. The better the GPU, the lower the impact on the RAM, since the device won't be put under as much "strain" to process the graphics (a good CPU is also required).
The GPU IS IMPORTANT FOR SMOOTH OS PERFORMANCE. Current OS versions, e.g. ICS and JB, use GPU acceleration to smooth things out. Many ROMs also enable GPU rendering to increase performance throughout the OS. If you have a Snapdragon, the GPU borrows RAM from the phone, whereas Tegra has its own dedicated RAM for its GPU.
AJ88 said:
GPU are saparated ram allocated for gaming..
More the gpu better the gaming performance...
It means 400mali is better than 200 adreno..
Other thing gpu does not effects over all performance but it effects clarity of graphics and display visualiTy...
So in 250 eur.
I Think galaxy S2 is good choice..
Good processor
Good gpu
Good screen resolution..
we all should be polite enough to press thanks for anyone who helped US.
The number in the chip's name does not reflect the RAM it has; it reflects the model number. And of course the Mali-400 MP is better than the Adreno 200; it performs better than the Tegra 2 as well.
Here's the performance order of previous-generation chips:
Mali-400 MP > PowerVR SGX 540 > Adreno 205 >> Tegra 2.
Maybe the Adreno 205 isn't THAT much better than the Tegra 2, but the Tegra 2 is highly overrated, and the Mali-400 MP pulls cleanly ahead of it.
RoboWarriorSr said:
GPU IS IMPORTANT FOR SMOOTH OS PERFORMANCE. The current OS uses GPU acceleration to smooth things out ig. ICS and JB. Many ROMS also enable GPU to increase performance throughout the OS. If you have a snapdragon, then it uses RAM from the phone for RAM on the GPU where as Tegra has it's own dedicated RAM for its GPU.
Oh... thanks for the info.
So you are saying that Tegra chips come with their own built-in RAM?
So how many MB of RAM (or RAM equivalent, or whatever) is in a Tegra chip?
yzak58 said:
oh..thanks for the info...
So you are saying that Tegra chips come with its own inbuilt RAM?
So...much mbs of RAM(or RAM equivalent or whatever) is in a Tegra chip?
I think it's 64MB or something like that for the Tegra 2. Doesn't really matter though; if you get anything better than the Adreno 200, it's good.
Does the Samsung Galaxy Mini have a GPU?
Go for Tegra 3, mate
beakolang said:
do Samsung galaxy mini has GPU?
Yes, it does have a GPU. It has an Adreno 200.
You know, I have been using for some time a (pretty old by now) LG Optimus One. It has an Adreno 200 GPU and an ARMv6 600 MHz CPU.
Even if I overclock it to 800 MHz and maximize the ROM performance in every way possible, GTA 3 for example runs pretty much unplayably (very low FPS).
The Optimus One uses a Qualcomm MSM7227 SoC (2009). But in 2011 Qualcomm released the MSM7227A (used for example in the Galaxy Mini 2), which also has an Adreno 200 for the GPU but uses a much better ARMv7 800 MHz Cortex-A5 CPU. The same GPU coupled with this much more capable CPU handles GTA 3 really well, playable without problems.
That's really interesting to me, to say the least. It's like having a good video card in your PC that is bottlenecked by the CPU. And the Adreno 200 is quite old.
-
nundoo said:
You know, I have been using for some time a(pretty old by now) LG Optimus One. It has an Adreno 200 GPU and an ARMv6 600 Mhz CPU.
Even if I overclock it to 800Mhz and maximize the ROM performance in every way possible, GTA3 for example runs pretty much non-playable(very low FPS).
The Optimus One uses a Qualcomm MSM7227 SoC(2009). But in 2011 Qualcomm released the MSM7227A(used for example in Galaxy Mini 2) which also has an Adreno 200 for GPU, but it uses a much better ARMv7 800Mhz Cortex-A5 CPU. The GPU coupled with this much more capable CPU handles GTA 3 really good, playable without problems.
That's really interesting to me, to say the least. It's like you would have a good video card in your PC, but it was bottlenecked by the CPU. And Adreno 200 is quite old.
-
It has an enhanced Adreno 200; that's how it gets a better graphics score in AnTuTu. And I'm surprised you can't run GTA 3. I can play Dead Space with no lag on my Wildfire S, even at stock, and that looks just as intensive as GTA 3.
Although I do agree the CPU might be a bottleneck, it shouldn't affect 3D gaming. The UI becomes really smooth @ 825 MHz, which surprises me as it lags in comparison at even 806 MHz.
Dead Space also runs very well on the Optimus One; GTA 3 is much more demanding.
It has to do with the fact that GTA is an open-world game, which requires more background processing than the current-scene processing that the majority of Android games use. I believe the CPU does that background processing, which is why it lags. This also explains why the Galaxy Mini can play GTA while having a similarly clocked CPU: the architecture.
soo
So, is the Desire X better than the Tegra 2? Or in more detail: is the HTC Desire X better than my LG Optimus 2X? The HTC has more RAM, but I don't like that it has the Adreno 203. Is it ****?
Help plz

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM A15 MP; it must be Tegra 4. I checked the background of the person who posted the kernel patch: he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the 2 board names: Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. The OP had already deleted the results, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a min frequency of 51 MHz and a max of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, though the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554MP4 in the A6X is still the faster GPU, with Tegra 4 and Mali-T604 almost equal in 2nd. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail: in CPU benchmarks I can see that being true, but not for graphics.
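For reference, the two offscreen numbers above are directly comparable (same 1080p workload rendered off-screen), so the gap works out as:

```python
# Scores quoted above, GLBenchmark 2.5 Egypt HD offscreen (1080p).
dalmore_fps = 32.6   # suspected Tegra 4 dev board
ipad4_fps = 49.6     # iPad 4 (A6X), same test

ratio = ipad4_fps / dalmore_fps
print(f"iPad 4 is {ratio:.2f}x faster in this run")  # about 1.52x
```

Even allowing the ~10% driver-maturity headroom mentioned above, that roughly 50% gap is too large to close on drivers alone.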
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to GLBenchmark pages below, if they disappear from cache, I've saved a copy of the webpages, which I can upload, Enjoy
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4's before the iPad 5 comes out? The Adreno 320, Mali-T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4, I guess, and Tegra 4 has been announced too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when that would release; not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any Gpu that could outperform iPad4 before iPad5 comes out? adreno 320, t Mali 604 now tegra 4 aren't near it. Qualcomm won't release anything till q4 I guess, and tegra 4 has released too only thing that is left is I guess is t Mali 658 coming with exynos 5450 (doubtfully when it would release, not sure it will be better )
Looks like apple will hold the crown in future too .
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by a game and they ran a CPU benchmark in the background, the SoC quickly went up toward 8W, and the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below. This explained why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X: trouble is, it has double the CPU and GPU cores of the 5250 and is clocked higher. Even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns. So it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design big SoCs and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip; the iPhone 5's SoC is 96mm2 and the A6X is 123mm2. Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products, and PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit has been that their Swift core is almost as powerful as an ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 vendors can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad:crying:
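The throttling behaviour described in the first paragraph can be sketched as a toy power-budget model. The 4W budget and 1.7 GHz maximum are the figures from the post; the linear power-to-clock scaling is a deliberate simplification, not how a real governor works:

```python
# Toy model of a shared SoC power budget forcing CPU throttling.
SOC_BUDGET_W = 4.0   # budget the governor tries to stay at or below
CPU_MAX_W = 4.0      # CPU draw at full clock (post's TDP figure)
GPU_MAX_W = 4.0      # GPU draw at full load (post's TDP figure)
CPU_MAX_MHZ = 1700

def throttled_cpu_clock(gpu_load: float) -> int:
    """gpu_load in [0, 1]; returns the CPU clock the shared budget allows."""
    gpu_draw_w = GPU_MAX_W * gpu_load
    cpu_allowance_w = max(SOC_BUDGET_W - gpu_draw_w, 0.0)
    # naive linear model: achievable clock scales with allowed wattage
    return round(CPU_MAX_MHZ * min(cpu_allowance_w / CPU_MAX_W, 1.0))

print(throttled_cpu_clock(0.0))  # GPU idle: full 1700 MHz
print(throttled_cpu_clock(0.5))  # GPU at half load: 850 MHz
print(throttled_cpu_clock(1.0))  # GPU flat out: CPU starved entirely in this toy model
```

With the GPU somewhere above half load, this model lands in the same ballpark as the 1.7 GHz to 800 MHz throttle AnandTech observed, which is all it is meant to illustrate.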
32 fps is a no-go... let's hope it's not final
hamdir said:
32 fps is a no-go... let's hope it's not final
It needs to improve, but it would be OK for a new Nexus 7.
Still fast enough for me, I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5's GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5's GPU and the Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and Nexus 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember, the iPhone 5 and the new iPad wiped the floor with the Nexus 4 and Nexus 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping my floors.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how, even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire3D; I use it, and Tegra games work without lag, with effects, even though I don't have a Tegra device.
With a Tegra device I would be restricted to mostly optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant GPU vendor.
In the first one, Tegra 2, the GPU was a little better in benchmarks than the SGX540 of the Galaxy S, but it lacked NEON support.
In the second one, Tegra 3, the GPU is nearly the same as the old Mali-400 MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 as well; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices; equally, Chainfire is now outdated and no longer updated.
Now, about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multiple cores each with its own sub-cores/shaders. If Tegra were used in a real multi-core configuration it would destroy them all.
Finally, this is really funny: all this doom and gloom because of an early, discarded development-board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any final-device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe only 2x the T3 score with 6x the shaders!!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much worse than expected until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple; it's just that Samsung prefers ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 that will arrive in a phone. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 only being 2x faster than Tegra 3. Plus it's on 28 nm (at around 80 mm², just a bit bigger than Tegra 3 and smaller than the A6's 90 mm²), along with dual-channel memory versus single-channel on Tegra 2/3.
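The scaling argument in the last two posts can be put into rough numbers. The shader counts (12 for Tegra 3, 72 for Tegra 4) are the figures used in this thread; the clocks below are placeholder guesses, and real games never scale this linearly, which is exactly what the leaked benchmark dispute is about.

```python
# Naive upper-bound estimate for the Tegra 3 -> Tegra 4 GPU jump.
# Shader counts (12 -> 72) are from the thread; the clocks are placeholder
# guesses, and the linear throughput model is deliberately naive.

def naive_gpu_speedup(shaders_old, clock_old_mhz, shaders_new, clock_new_mhz):
    """Upper bound assuming throughput = shaders x clock and nothing else."""
    return (shaders_new * clock_new_mhz) / (shaders_old * clock_old_mhz)

# 6x the shaders at a similar clock gives ~6x in a purely shader-bound
# workload, far above the ~2x the leaked benchmark shows.
print(naive_gpu_speedup(12, 520, 72, 520))  # -> 6.0
```

That gap between the naive estimate and the leaked number is why people in this thread suspect either early drivers or a non-shader bottleneck rather than final performance.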
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is that it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28 nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm² chip, the iPhone 5's A6 is 96 mm² and the A6X is 123 mm², so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip-design team, and the benefit for them has been that their Swift core is almost as powerful as ARM's A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Well said mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU to its full potential (at least when gaming).
I have a Galaxy S2 with the Exynos 4 at 1.2 GHz and the Mali GPU overclocked to 400 MHz.
In my analysis, most modern games like MC4 and NFS: Most Wanted aren't running at 60 FPS at all. That's because the GPU always has a 100% workload while the CPU is relaxing, outputting only 50-70% of its total workload.
I know some games aren't optimised for all Android devices, as opposed to Apple devices, but still, even high-end Android devices have slower GPUs (than the iPad 4, at least).
AFAIK, the Galaxy S IV is likely to pack a T-604 with some tweaks instead of the mighty T-658, which is itself still slower than the iPad 4's GPU.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4 W, making a theoretical SoC TDP of 8 W. However, when the GPU was being stressed by running a game and they ran a CPU benchmark in the background, the SoC quickly went up to 8 W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4 W or below. This explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is that it has double the CPU & GPU cores of the 5250 and is clocked higher. Even on a more advanced 28 nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80 mm² chip, the iPhone 5's A6 is 96 mm² and the A6X is 123 mm², so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip-design team, and the benefit for them has been that their Swift core is almost as powerful as ARM's A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until Android or WP8 or somebody can achieve Apple's margins, Apple will be able to 'buy' their way to GPU domination. As an Android fan it makes me sad :crying:
Typical "iSheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and the PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 PPI, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
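The point about high-PPI tablets leaning on the GPU is easy to quantify: just multiply out the pixels a panel demands per second at a target frame rate. The resolutions below are the real iPad 4 and Nexus 10 panel specs; the overdraw factor is a rough rule of thumb, not a measured figure.

```python
# Fill-rate demand at tablet resolutions: pixels that must be shaded per
# second just to sustain a target frame rate. Overdraw (real games redraw
# each pixel roughly 2-3x) is left at 1.0 here, so these are floors.

def pixels_per_second(width, height, fps, overdraw=1.0):
    return width * height * fps * overdraw

for name, (w, h) in {"iPad 4": (2048, 1536),
                     "Nexus 10": (2560, 1600)}.items():
    mpix = pixels_per_second(w, h, 60) / 1e6
    print(f"{name}: {mpix:.0f} Mpix/s for 60 fps")
```

Hundreds of megapixels per second before any overdraw is why the GPU, not the CPU, is the part these panels stress first.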
MrPhilo said:
For goodness' sake, this isn't final hardware; anything could change. Hung2900 knows nothing; what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple; it's just that Samsung prefers ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone to a Tegra 4 that will arrive in a phone. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps, then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make more and better games for Android, compared to the 'geeks' *cough* who prefer benchmark results. What's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effects games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 only being 2x faster than Tegra 3. Plus it's on 28 nm (at around 80 mm², just a bit bigger than Tegra 3 and smaller than the A6's 90 mm²), along with dual-channel memory versus single-channel on Tegra 2/3.
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks? On XDA, is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations of a new product on mainstream sites, e.g. "Nvidia will use Kepler tech" (which was false), "omg Kepler is like the GTX 680", "Tegra 4 will own the world". People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though both now have to power similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle & physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6x increase in shaders does not automatically mean that games and benchmarks will scale linearly, as other factors such as TMU/ROP throughput can bottleneck performance. Nvidia's technical marketing manager, when interviewed at CES, said that the overall improvement in games and benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks prove accurate, it wouldn't stop me buying one. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
From 50 seconds onwards:
https://www.youtube.com/watch?v=iC7A5AmTPi0
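The TMU/ROP bottleneck point above is just a max over per-stage times: a frame is gated by its slowest stage, so scaling only the shaders stops paying off once something else dominates. The per-stage millisecond figures below are invented purely to illustrate the shape of the argument.

```python
# Why 6x the shaders doesn't mean 6x the fps: the frame time is gated by
# the slowest stage. The millisecond figures here are invented examples.

def fps(shader_ms, fixed_ms, shader_scale=1.0):
    """Frame rate when shader work scales but TMU/ROP work does not."""
    frame_ms = max(shader_ms / shader_scale, fixed_ms)
    return 1000.0 / frame_ms

base = fps(20.0, 8.0)                       # shader-bound: 50 fps
scaled = fps(20.0, 8.0, shader_scale=6.0)   # now ROP-bound: 125 fps
print(base, scaled, scaled / base)          # speedup is 2.5x, not 6x
```

With these made-up numbers, a 6x shader increase buys only a 2.5x frame-rate gain, which is roughly the territory of the "3 to 4 times T3" figure Nvidia's marketing manager quoted.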
iOSecure said:
Typical "iSheep" reference, unnecessary.
Why does Apple have the advantage? Maybe because their semiconductor team is talented and can tie the A6X and the PowerVR GPU together efficiently. NVIDIA should have focused more on the GPU in my opinion, as the CPU was already good enough. With these tablets pushing in excess of 250 PPI, the graphics processor will play a huge role. They put 72 cores in their processor. Excellent. Will the chip ever be optimized to its full potential? No. So again they demonstrated a product that sounds good on paper, but real-world performance might be a different story.
Sorry Steve, this is an Android forum; or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
hamdir said:
Tegra optimisation porting no longer works using Chainfire; this is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to run on other devices; equally, Chainfire is now outdated and no longer updated.
Looks like they haven't updated Chainfire3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger, etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2-compatible games. I think I might have been unfortunate too, but some Gameloft games lagged on the Tegra device that I had, though root solved it to an extent.
I am not saying one thing is superior to another, just relating my personal experience; I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali-T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum; or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
No, I actually own a Nexus 4 and an iPad mini, so I'm pretty neutral on Google's/Apple's ecosystems, and I'm not wiping any scratches off my devices.
