Samsung Caught Manipulating Galaxy S4 Benchmark Results (I9500)

I found this while browsing the news, so I thought I'd share it here. I'm disappointed because Samsung is actually cheating us. :/
--
Smartphone benchmarks never really indicate how a device will ultimately perform, yet tech enthusiasts exalt the results like they're the end-all, be-all. Check any review; the nebulous collection of numbers is always held in high regard despite never doing much to affect the overall experience. Still, companies take them quite seriously; in Samsung's case, a little too seriously.
According to a new report from AnandTech, Samsung might be fibbing its way to more favorable Galaxy S4 benchmarks. Has your device suddenly come to a crawl? Of course it hasn’t; benchmarks shouldn’t change your perception of a flagship as powerful as the S4. Still, it’s embarrassing that Samsung would resort to such technical tactics, like allegedly using code dubbed “BenchmarkBooster.” Yes, your device takes steroids.
AnandTech found that Samsung set the GPU of the Exynos 5 Galaxy S4 to run at a higher clock when benchmarked than in normal everyday use. When engineers tested the device, the S4's GPU ran at 533MHz during benchmarking and only 480MHz during regular use. Not an enormous difference, but large enough to call shenanigans.
In addition, AnandTech found that when running CPU benchmarks with apps such as AnTuTu and Quadrant, the device’s Cortex A15 clocked at 1.2GHz; an unofficial benchmarking app, GFXBench 2, revealed that the device actually runs at 500MHz when it’s not juicing. Seems fishy, no?
AnandTech's findings should in no way affect your final opinion of the Galaxy S4, though they do highlight some shady Samsung tactics. It's likely the Korean company isn't the only one fibbing benchmark tests, though; it's just the one that got caught.
Source : http://www.technobuffalo.com/2013/07/30/samsung-caught-manipulating-galaxy-s4-benchmark-results/
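To make the alleged mechanism concrete, here is a purely illustrative sketch of the kind of app-detection logic the article describes. This is not Samsung's actual code; the package names are made-up placeholders, and only the 533MHz/480MHz figures come from the report above.
Code:
# Illustrative sketch only -- NOT Samsung's code. It shows the general idea
# described in the article: apply a higher GPU frequency cap when a
# whitelisted benchmark app is in the foreground. Package names below are
# hypothetical placeholders; 533/480 MHz are the figures quoted above.

BENCHMARK_WHITELIST = {"benchmark.app.antutu", "benchmark.app.quadrant"}

def gpu_max_freq_mhz(foreground_package: str) -> int:
    """Return the GPU frequency cap for the current foreground app."""
    return 533 if foreground_package in BENCHMARK_WHITELIST else 480

print(gpu_max_freq_mhz("benchmark.app.antutu"))  # 533 -> benchmark gets the boost
print(gpu_max_freq_mhz("com.example.somegame"))  # 480 -> everything else stays capped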

GSMArena clearly stated that Samsung is a cheater?
They played with the emotions of the common consumer... and now I am thoroughly disappointed by Sammy's behaviour.

The manipulation of benchmarks confirms for me the rumors about the Octa-core:
SamMobile News from 30 May 2013
Regards

Tanis64 said:
The manipulation of benchmarks confirms for me the rumors about the Octa-core:
SamMobile News from 30 May 2013
Regards
So they released an unfinished chip onto the market. The only reason could be marketing: the "look at us, we have eight-core phones" angle. People go "wow, eight is more than four" and buy it. So lame.
-- Sent from the mighty Note 2 --

Already mentioned by @AndreiLux a few decades ago.
Sent from my iPotato

LegendJo said:
I found this while browsing the news, so I thought I'd share it here. I'm disappointed because Samsung is actually cheating us. :/
So is Intel cheating you too when the Intel Core i5-2500K CPU has a normal speed of 3.3 GHz but a turbo speed of 3.7 GHz?

Tom-Helge said:
So is Intel cheating you too when the Intel Core i5-2500K CPU has a normal speed of 3.3 GHz but a turbo speed of 3.7 GHz?
If that speed is only available when you're running Prime95 or IntelBurnTest, then... yep.
You can't compare the two; they are totally different.

Tom-Helge said:
So is Intel cheating you too when the Intel Core i5-2500K CPU has a normal speed of 3.3 GHz but a turbo speed of 3.7 GHz?
No, because Intel didn't mislead anyone. Samsung clearly did.

yeahmann said:
No, because Intel didn't mislead anyone. Samsung clearly did.
Really??
Have you not read this research paper?
http://www.abiresearch.com/press/intel-apps-processor-outperforms-nvidia-qualcomm-s
A modified version of AnTuTu was used to cheat in the benchmark.
http://www.eetimes.com/author.asp?section_id=36&doc_id=1318857
http://forums.anandtech.com/showthread.php?t=2330027

CLARiiON said:
Really??
Have you not read this research paper?
http://www.abiresearch.com/press/intel-apps-processor-outperforms-nvidia-qualcomm-s
A modified version of AnTuTu was used to cheat in the benchmark.
http://www.eetimes.com/author.asp?section_id=36&doc_id=1318857
http://forums.anandtech.com/showthread.php?t=2330027
That paper had nothing to do with what he's saying. Turbo is still valid and not misleading.
Secondly, you can blame AnTuTu, not Intel, for that benchmark discrepancy. And then again, it was only due to a compiler change and not actual cheating per se in the conventional sense.
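For contrast with the whitelist idea described earlier in the thread: a turbo-style policy depends only on load and thermal/power headroom, not on which app produces the load. A rough sketch; the 3.3/3.7 GHz figures are from the post above, and the trigger condition is a simplification, not Intel's actual algorithm.
Code:
def turbo_clock_ghz(load: float, thermal_headroom: bool) -> float:
    """Any sufficiently heavy workload gets the boost -- Prime95, a game or
    a benchmark alike -- which is why turbo isn't considered misleading."""
    return 3.7 if load > 0.9 and thermal_headroom else 3.3

print(turbo_clock_ghz(0.95, True))  # 3.7 -- heavy load with headroom
print(turbo_clock_ghz(0.30, True))  # 3.3 -- light load stays at base clock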

CLARiiON said:
Really??
Have you not read this research paper?
http://www.abiresearch.com/press/intel-apps-processor-outperforms-nvidia-qualcomm-s
A modified version of AnTuTu was used to cheat in the benchmark.
http://www.eetimes.com/author.asp?section_id=36&doc_id=1318857
http://forums.anandtech.com/showthread.php?t=2330027
From Exophase:
But frankly, I blame AnTuTu in all of this. They allowed themselves to be manipulated (probably for a price), despite constantly warning against other people cheating their numbers. I don't know if they're displaying a complete lack of integrity or a complete lack of understanding of how their own software works, or something in between the two, but whatever the case I hope they lose all credibility and whatever revenue the program brings them.

Related

Games on Android [ARM Mali-400 MP, Nvidia ULP Geforce, PowerVR SGX 543MP2]

The title should actually be GAMING ON ANDROID - mistyped.
There's gonna be fierce competition in Android gaming (or hopefully so).
Mobile manufacturers are packing powerful GPUs onto their SoCs, which is a very good sign.
It's definitely going to be hard to choose the best gaming mobile from a lot of great mobiles.
Hope this thread will help people choose the right gaming device.
First of all, let's start with the hardware.
As of now, the following hardware is available for graphics-hungry gaming:
--> Galaxy S [PowerVR SGX 540]
--> Atrix [Geforce ULP GPU]
--> Optimus 2X [Geforce ULP GPU]
--> Optimus 3D [SGX540 @300MHz]
--> Galaxy S II [ ARM Mali 400-MP / Geforce ULP GPU]
Among the above mobiles, the rating is as follows:
1 - Optimus 2X
2 - Galaxy S
3 - Atrix
4 - Galaxy S II [ARM Mali-400MP]
* as per AnandTech's test with GLBenchmark 2.0 - Egypt
And there's a weird thing with the GeForce ULP GPU: it seems it doesn't support AA/FSAA. Dunno whether it's a problem with the GLBenchmark application or a driver problem with the GeForce ULP... Gotta dig more into this...
The ARM Mali-400MP is supposed to have five times the performance of the SGX540, which it clearly failed to show. Of course, a lot depends on proper drivers, and it's too early to judge the benchmark scores of the SGS II, but I still don't think Samsung is going to do anything about this.
A bit off topic: Apple has gone miles ahead with its latest iPad 2.
It already has great games, and with the PowerVR SGX 543MP2 added, it's gonna rock! I guess, again, it'll take months/years to catch up with the iPad 2's performance... Hope the Adreno 220 or Kal-El will do the magic for Android...
Of course, we need to talk about processors too, but I guess the amount of physics in current games can be easily handled by the present single-GHz processors...
--> Rest of the talk on next post... [Regarding various games available / coming soon]
..................
The Exynos is doing 35fps on Egypt at 1366x768 w/AA in the ODROID-A over at glbenchmark.com. That's much closer to Samsung's promise, it seems.
Here's another test: http://androidandme.com/2011/03/new...s-omap4-and-snapdragon-match-up-with-tegra-2/
But they use the same Anandtech numbers as source.
"Scores for the Optimus 3D and Galaxy S II were taken from Anandtech."
landing said:
The Exynos is doing 35fps on Egypt at 1366x768 w/AA in the ODROID-A over at glbenchmark.com. That's much closer to Samsung's promise, it seems.
"ODROID-A" - that's new... can you put up the link for the benchmark result here...
Edit: ok got it.
http://www.glbenchmark.com/phonedetails.jsp?D=Hardkernel+ODROID-A&benchmark=glpro20
emmarbee said:
"ODROID-A" - that's new... can you put up the link for the benchmark result here...
Edit: ok got it.
http://www.glbenchmark.com/phonedetails.jsp?D=Hardkernel+ODROID-A&benchmark=glpro20
So does it mean the Mali-400 has similar performance to the SGX543, given that the benchmark for the ODROID-A is comparable to that of the iPad?
^ No way!
The SGX543MP2 is way better than the Mali-400MP and the GeForce ULP (or at least that's the present scenario, with good, efficient usage by the iOS drivers).
I thought the Mali-400MP was inferior to the SGX540 (which it supposedly shouldn't be), but it isn't so after all. Most probably, by the time the SGS II is released, it'll have the drivers the ODROID has and it'll be on par with the Tegra devices in the benchmark...
That is a strange conclusion to draw from the benchmark results. I would say the SGX543 is supposed to be the dominant GPU based on specs, but the Mali-400 in the Exynos 4210 is really impressive in the games tests and could quite possibly kick the SGX543's butt if the resolutions were the same. More results please!
i9100 said:
That is a strange conclusion to draw from the benchmark results. I would say the SGX543 is supposed to be the dominant GPU based on specs, but the Mali-400 in the Exynos 4210 is really impressive in the games tests and could quite possibly kick the SGX543's butt if the resolutions were the same. More results please!
After all, the original SGS was packed with a superior GPU compared to the A4, and I hope the Mali-400 will at least be competitive against the SGX543, if not on par.
@i9100 & sckc23 - even I was hoping that the Mali-400MP is a kickass GPU and that Samsung was clever in choosing it (instead of choosing Tegra), because when everyone else was choosing Snapdragon + the 'lame' Adreno, the Galaxy S landed with Hummingbird and the PowerVR SGX540, which could kick the iPhone 4's butt...
But the results prove that I'm wrong.
And there's another reason why the Galaxy S won't be as great a piece of gaming hardware as the iPhone 4 and other Android smartphones: though it has a powerful GPU, it's lonely in its category. If Samsung funded game developers to code efficiently for the Galaxy S and SGS II, then I'm pretty sure it would be the best. But that's not the case.
On the other hand, Sony Ericsson and Qualcomm (two giants) are working along with game developers on the advancement of the Adreno series of GPUs.
So in the end, they'll definitely become the winners.
I guess game developers should themselves realize the need to code effectively for different mobiles, at least for the Galaxy S (the most popular/best-selling mobile) and the SGS II (the soon-to-be most popular mobile).
emmarbee said:
But the results prove that I'm wrong.
Isn't the only result to support your conclusion the single test Anandtech performed? The same result that has been pulled from the GLBenchmark result listing?
But I do agree with you that Samsung needs to step up their developer engagement game if they want to be a leader.
Ya, those results are enough to know that Mali is underpowered. Unlike games, these benchmarks will run efficiently on all GPUs. But the reason why Mali underperformed in the SGS II and not in the ODROID might be down to driver factors.
-> AnandTech says the drivers play a significant role in making the hardware much better.
-> And it's up to the developer to fully utilize the hardware and its drivers.
I don't think either of them is going to get better with Mali.
And Apple should've focused on developing an efficient driver for 543mp2 by themselves or should've insisted more on Intrinsity to make it better for iOS.
So you are determined to stick to a conclusion based on likely invalid benchmark data and at the same time dismiss the benchmark itself? I'm sorry, I'm not able to help you then.
Let's try again when new SGS II data are available.
emmarbee said:
Ya, those results are enough to know that Mali is underpowered. Unlike games, these benchmarks will run efficiently on all GPUs. But the reason why Mali underperformed in the SGS II and not in the ODROID might be down to driver factors.
It underperformed on a test version of the phone AnandTech got to play with at MWC; I would take nothing from such a test as an absolute. The ODROID-A test shows the potential of the SGS II.
Unlike games, these benchmarks will run efficiently on all GPUs
Do you mean that the benchmarks are too synthetic or that game developers are sloppy?
-> AnandTech says the drivers play a significant role in making the hardware much better.
-> And it's up to the developer to fully utilize the hardware and its drivers.
I don't think either of them is going to get better with Mali.
It seems the drivers for the ODROID-A work better, and Samsung would probably get the same drivers. If they are using the same drivers, the score would probably be higher than the ODROID-A's, since the glbenchmark tests are very resolution dependent.
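To put a rough number on the resolution point: the ODROID-A figure quoted earlier in the thread is 35 fps at 1366x768, while the SGS II screen is 800x480 (WVGA). A back-of-the-envelope sketch, assuming fill rate scales roughly linearly with pixel count (only an approximation for a real GPU workload):
Code:
def scale_fps_by_pixels(fps, src_w, src_h, dst_w, dst_h):
    """Scale an fps figure from one resolution to another by pixel count."""
    return fps * (src_w * src_h) / (dst_w * dst_h)

# 35 fps at the ODROID-A's 1366x768 is roughly equivalent to ~95 fps
# at the SGS II's 800x480, all else being equal.
print(round(scale_fps_by_pixels(35, 1366, 768, 800, 480), 1))  # 95.6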
And Apple should've focused on developing an efficient driver for 543mp2 by themselves or should've insisted more on Intrinsity to make it better for iOS.
Haven't they done that? The benchmarks for the 543MP2 seem quite good to me. And as far as I know, Apple IS developing the OpenGL driver themselves (but probably with help from IMG).
I will say that sometimes it's about what you can get out of it, not just the specs. The Droid and Milestone run on a lot less hardware than most phones these days, but with the right ROM they run close to on par.
I think the dev community for Android is really pushing the hardware to its limits, and the phone manufacturers are just pumping out specs.
duandroid said:
I will say that sometimes it's about what you can get out of it, not just the specs. The Droid and Milestone run on a lot less hardware than most phones these days, but with the right ROM they run close to on par.
I think the dev community for Android is really pushing the hardware to its limits, and the phone manufacturers are just pumping out specs.
I do agree with you to a certain extent. I think the HTC Desire offers one of the best user experiences on Android, and I am grateful to the dev community who keep improving it and trying to get the best out of the phone. Nonetheless, it does lag a lot when multitasking or running games with heavy graphics, and as someone who often plays games on the phone, I would surely look for one with a much more capable GPU. Hopefully that will be the SGS II =)
How about the Xperia PLAY?
ekimaureza said:
How about the Xperia PLAY?
The Xperia PLAY comes with the Adreno 205, which is really last-gen, so I doubt it can handle heavy graphics. The only advantage is the sliding control pad, and SE will probably pay game devs to try optimizing their games for the device.
tadjiik said:
It underperformed on a test version of the phone AnandTech got to play with at MWC; I would take nothing from such a test as an absolute. The ODROID-A test shows the potential of the SGS II.
Ya, even I was saying that. Samsung should make much better drivers. But will they do it? I doubt that. And is the SGX540 being utilized to its full potential?
tadjiik said:
Do you mean that the benchmarks are too synthetic or that game developers are sloppy?
Usually I don't give a f about benchmarks (maybe that's because of my assumption that they only reflect spec statistics), but GLBenchmark is very, very close to real gaming. I've run the test on my phone and found it very realistic, so I'll definitely consider its score anytime.
And what I meant in that line was: games, unlike benchmarks, are not optimized for all hardware. But the purpose of a benchmark is to run efficiently on all hardware, and so I said it'll run efficiently on all devices. But hey, that's just my IMHO. (I'm very sorry for not adding that; it's all based on some calculations I've made.)
tadjiik said:
It seems the drivers for the ODROID-A work better, and Samsung would probably get the same drivers. If they are using the same drivers, the score would probably be higher than the ODROID-A's, since the glbenchmark tests are very resolution dependent.
Jeez, I totally forgot about the resolution... wow! So I think, if the SGS II uses the ODROID's drivers, it'll definitely kick the MP2's ass? I REALLY WANT THE EXYNOS SGS II TO BE THE ANDROID FLAGSHIP, because only it has the S-AMOLED-Plus screen. I don't want to give up S-A-P for Tegra.
tadjiik said:
Haven't they done that? The benchmarks for the 543MP2 seem quite good to me. And as far as I know, Apple IS developing the OpenGL driver themselves (but probably with help from IMG).
This is the one I TOTALLY miscommunicated. I'm very sorry for my bad English. I was trying to say "Apple WOULD HAVE focused on the drivers, or WOULD HAVE insisted more on Intrinsity making it better for iOS; that's why they were able to get such good results." S***! A small change in grammar totally changes the meaning of the sentence.
@i9100 - I don't get you. I'll draw conclusions from the benchmark if the benchmark was made with the final product; I won't mind the benchmark if the product was not tested with final drivers.
@duandroid & @sckc23 - +1
@ekimaureza - the Xperia PLAY is totally s*** for its price of 32,000 INR (while you can get a much better SGS (GT-i9000) for 25,000 INR).
@sckc23 - SE will pay the devs to optimize not only for those keypads, but also for the phone's hardware itself. And not only SE: it seems Qualcomm is very keen on marketing the Adreno GPU after acquiring it. So even IF the Snapdragon and other Qualcomm (dual-core) chips with Adreno GPUs end up inferior to PowerVR and ARM Mali, they'll make sure that all games are coded to run efficiently on their hardware.
The devs will think of only one thing: whichever hardware sells more, they'll code efficiently for it, because that will naturally make their game run well on most devices.
So even though the PowerVR SGX540 (SGS) is better than the Adreno 205 (Nexus One, Desire and a whole lot of other mobiles), developers would have kept the Adreno 205 in mind while coding, as those products have sold more units than the SGS.
The above statements include some real facts, based on logical reasoning, and some "IMHO"s. So please don't bother to scold me if I'm wrong.

[23/05/12]Tegra 3 Vs A5 vs A5x

I've been looking at a lot of material that suggests the Tegra 3 is weak compared to both the A5 and the A5X:
http://www.anandtech.com/show/5163/asus-eee-pad-transformer-prime-nvidia-tegra-3-review/3
This shows some benchmarks between the Prime and the iPad 2.
Edit 23/05/12:
After much debate I finally got the One X and decided benchmarks are not at all accurate; with the differences in software, OS, hardware, etc., it's hard to actually benchmark universally and accurately.
After playing Shadowgun and Riptide, running plenty of apps, and watching the beautiful, smooth graphics and Tegra 3 technology, who's to say which phone is better or worse? All that matters is that the HTC One X performs excellently! And I look forward to more Tegra 3 optimized apps and games.
There's already a thread with a lot of information in it here: http://forum.xda-developers.com/showthread.php?t=1544676
mox123 said:
I've been looking at a lot of material that suggests the Tegra 3 is weak compared to both the A5 and the A5X:
http://www.anandtech.com/show/5163/asus-eee-pad-transformer-prime-nvidia-tegra-3-review/3
This shows some benchmarks between the Prime and the iPad 2.
This was discussed to death in this thread:
Discussing the performance of the Tegra 3 SoC
and in this thread:
HTC One X and One S - HTC's New Hero Devices! Mega Information Thread
I don't see a need for a new thread...
http://www.tuaw.com/2012/03/16/ipad-a5x-cpu-vs-asus-transformer-prime-tegra-3-cpu-benchmarks/
This shows that the Tegra 3 is seriously weak compared to the iPad 3.
This just out from Gizmodo
Transformer Prime VS New iPad
gwuhua1984 said:
This just out from Gizmodo
Transformer Prime VS New iPad
At least we know that the A5X is not "4x better than" the Tegra chip.
Looks more like 20% than 4x. How can they lie like that?
That's weird, because this video shows it is just over 3x as powerful???
http://www.tuaw.com/2012/03/16/ipad-a5x-cpu-vs-asus-transformer-prime-tegra-3-cpu-benchmarks/
This is a benchmark of fps between the Transformer Prime and the iPad 3 in the GPU arena? Am I missing something here? Not sure if TUAW are using the old Linpack test, which Gizmodo states is not accurate, which is why they didn't carry that one out???
" (Note: We planned to use the age-old Linpack standard test to evaluate CPU performance, but the Linpack apps for iOS and Android are very different, and their results could not be accurately compared.) " gizmodo - http://gizmodo.com/5893970/ipad-test-notes-speed-versus-tegra-3
I'm not sure, like I said above, if TUAW are using the old Linpack standard test, which could be giving false readings and showing the iPad benchmark as just over 3x the Tegra 3, which might be a false figure.
iPad 3 > all
Magnesus said:
Looks more like 20% than 4x. How can they lie like that?
Paid by Apple, I would assume.
Stupid test IMO.
I can agree that the A5X GPU is faster than the T3, but the numbers shown in the GL benchmark are totally biased!!
You need to remember that ANY app that is on the App Store goes through an Apple tech guy and is optimized for the device before it's published.
This is not the case with the Android Market, where anyone can put up whatever they want, and most of the games/apps are not optimized for a single device (btw, that is why Apple devices are so good: the software is optimized for the device).
I am an Android dude myself, but I do appreciate Apple devices. From a technological POV, though, if the Transformer Prime were optimized with its OS, GPU drivers and games/apps, it would be better than the """new iPad""".
d_brimer said:
I can agree that the A5X GPU is faster than the T3, but the numbers shown in the GL benchmark are totally biased!!
You need to remember that ANY app that is on the App Store goes through an Apple tech guy and is optimized for the device before it's published.
This is not the case with the Android Market, where anyone can put up whatever they want, and most of the games/apps are not optimized for a single device (btw, that is why Apple devices are so good: the software is optimized for the device).
I am an Android dude myself, but I do appreciate Apple devices. From a technological POV, though, if the Transformer Prime were optimized with its OS, GPU drivers and games/apps, it would be better than the """new iPad""".
Bull****.
Apple goes through the apps and checks that they work like they should. Apple doesn't tweak or optimize the apps.
The fact is that iOS is better optimized for the hardware it runs on, and since Apple makes both, it's not shocking that their products run better.
The latest tests show that the A5X wins at graphics processing while the Tegra 3 wins in calculations and memory handling... and lately they discovered that the processing of the A5X and the Retina display make the new iPad run hotter than the iPad 2 by 10 degrees.
Sent from my HTC Vision using xda premium
LordManhattan said:
The fact is that iOS is better optimized for the hardware it runs on, and since Apple makes both, it's not shocking that their products run better.
I thought Samsung made the hardware, well at least the processor?
ZDNet compared the iPad 3 and the Transformer Prime
ZDNet is quite inconclusive, comparing two games, namely Shadowgun and Riptide, on a Tegra 3 and an A5X device. While the iPad 3 has slightly crisper colors and text (thanks to its Retina display), the Prime displays some special graphics effects not to be seen on the iPad 3 (yet). So in the end it depends on the other hardware components and the software optimization, I guess..
Tegra needs to outdo Apple
sickorwuut said:
Tegra needs to outdo Apple
Everyone needs something; doesn't mean they'll get it. Though I am on a G2x and believe the Tegra 2 outdoes the 4S, that iPad is powerful. If only we could slap Android on that hoe.
And to the people saying that iOS is better optimized for the hardware: you don't think companies just slap Android on these phones, right? Some phones seem like it, though, which is what gives Android a bad name for some. But then some custom ROMs come around and can fix a whole wave of problems (especially for my G2x).
Sadly though, I don't think NVIDIA will ever release Tegra 2 ICS drivers for the G2x.
Sent from Narnia
LordManhattan said:
Bull****.
Apple goes through the apps and checks that they work like they should. Apple doesn't tweak or optimize the apps.
The fact is that iOS is better optimized for the hardware it runs on, and since Apple makes both, it's not shocking that their products run better.
They won't release apps that have accurate benchmarks, though, if those would show that their products aren't so revolutionary. I mean seriously, I like Apple, but I hate their "somebody did it first, but since we did it months later, it's revolutionary" mindset.
Sent from Narnia
lol wow....
Apple likes to lie to its customers.

Exynos UI Snappier than S600 or placebo effect?

After seeing dozens of vids I started to notice that in the Russian vids (Exynos version) the UI transitions seem snappier than in vids of the S600 version.
In the Russian vids everything on the S4 seems pretty smooth and fast (like here and in other vids: http://www.youtube.com/watch?v=2ymXQHSxOh4).
In the S600 vids it seemed to me I saw some clawing here and there.
Has anyone else noticed this, or is it just a placebo effect on my part?
Thx.
I'm dying to see 2 versions face to face.
.
Placebo; it's probably using the A7s anyway when browsing the UI.
S4 INFO
TingTingin said:
Placebo; it's probably using the A7s anyway when browsing the UI.
S4 INFO
That may be true, but the GPU is more powerful, so it could be faster due to that. I don't know; they both look good to me.
I also don't know. As I said, after seeing many videos I found myself preferring the Russian videos, because what I saw on their S4 screens looked better to me than in the non-Russian vids, although at the beginning I didn't associate that with them using the Exynos version.
I hope it is a placebo effect, because here we only have the S600 version... arrhhhh SAM.
Wish it could have been subtitled. I couldn't understand it at all.
Sent from my GT-I9300 using xda app-developers app
It's possible. Neither has the final SW out, and Samsung had been working on the Exynos version before they realized they'd need the Snapdragon to meet all the demand. How quickly they managed to adapt all the features to a different SoC only means they'll improve it further upon launch and then some, but I can't say I've seen any lag on either version in videos.
The only thing bothering me is the home button lag, because it waits for a double input; can you turn it off?
Also, is there a way to make AirView pop-ups instantaneous, like a mouse hover, instead of waiting a second over the content?
BoneXDA said:
The only thing bothering me is the home button lag, because it waits for a double input; can you turn it off?
Yes, at least on the S3 you can disable the home button double-press for S Voice.
deleted by me.
my fault.
pack21 said:
Rude and unhelpful; when I don't understand what someone is trying to say, I don't make that kind of comment.
I don't think there was any intent to be rude in his statement. There is a beta caption translator built into YouTube which gives you a rough idea of what he is saying if you enable it.
Still delaying my preorder/early upgrade because of this (although it's most likely placebo) and other reasons. Really hope the i9500 somehow makes it here to the UK so I can get rid of my temporary BlackBerry! :crying:
darrendm said:
I don't think there was any intent to be rude in his statement. There is a beta caption translator built into YouTube which gives you a rough idea of what he is saying if you enable it.
Still delaying my preorder/early upgrade because of this (although it's most likely placebo) and other reasons. Really hope the i9500 somehow makes it here to the UK so I can get rid of my temporary BlackBerry! :crying:
Thanks, I hadn't realized that.
At first glance it seemed to be referring to my post; @emylia777 might have added that it referred to the YouTube vid.
@emylia777, in that case I apologize.
4ktvs said:
That may be true, but the GPU is more powerful, so it could be faster due to that. I don't know; they both look good to me.
The Adreno 320 should offer similar performance to the SGX544MP3 in the Exynos. Both chips' GPUs are clocked higher than stock; we know the SGX544MP3 is at 533MHz. Plus, in theory the Adreno 320 should still perform better due to better integration of the GPU. The Adreno 320 also has more to offer than the SGX544MP3 in the long run, such as OpenGL ES 3.0 and OpenCL 1.0, which should make the UI smoother.
Anyway, I won't be surprised if the Exynos version is more optimized; the A7 CPU is a lot slower than the Snapdragon 600, so they'll have to optimize more for the Exynos chip than for the S600.
With 2 GB of RAM, who knows how many things are running and what is on the priority list. Those are things we don't get to see in these videos.
I wonder if, say, having 5 memory-intensive apps open would bog down the CPU priority list and slow the device down a bit.
The first comparison says the Octa version delivers 10% better benchmark performance than the Snapdragon 600 variant... and better battery life.
http://www.galaxy-s4.info/galaxy-s4...os-5-octa-detailed-review-by-russian-websites
.

Don't bother with battery comparisons on the i9500, the phone is unfinished.

So I got my i9500 and already did some foolery with it.
Fine device, but I hate the raised lip around the screen edge. Something I definitely did not miss on the S3 and something very annoying.
Other than that small design critique:
THE ****ING PHONE ISN'T RUNNING FINAL FIRMWARE!
Basically the CPU is running on the cluster migration driver, meaning it switches all four cores from the LITTLE to the big cluster, as opposed to the core migration driver, which does this on an individual core-pair basis.
You can pretty much throw all battery comparisons out of the window: it's completely unfinished and unoptimized.
I already compiled the kernel and flashed it without the cluster migration tidbit, but the phone won't boot. So yeah. The current sources are also useless.
Cleverly enough, you can't really distinguish between the two drivers except in one way: if /sys/devices/system/cpu/cpufreq/iks-cpufreq/max_eagle_count is present, you're running an IKS driver. If it's not, then you're running the sub-optimal IKCS driver.
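A quick way to check which switcher driver a given firmware is running, based on the sysfs node just mentioned. Just a sketch: it assumes you can run Python on the device (e.g. in a terminal emulator); otherwise the same file-existence test works from an adb shell.
Code:
import os

IKS_NODE = "/sys/devices/system/cpu/cpufreq/iks-cpufreq/max_eagle_count"

if os.path.exists(IKS_NODE):
    print("IKS (core migration) driver is active")
else:
    print("IKCS (cluster migration) driver -- the sub-optimal one")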
So yeah. We'll see what Samsung does about this; currently the advantages of big.LITTLE are pretty much unused.
Another nail in the coffin showing how rushed and unprepared this phone has been.
Wow, this is seriously turning out to be a fiasco.
ChronoReverse said:
Wow, this is seriously turning out to be a fiasco.
This is EXACTLY why, in the end, I don't care for technical details about SoCs and was rather waiting for real-world usage first. As much as I wanted to agree with AndreiLux on how intelligent big.LITTLE is, I sort of felt that it wouldn't be the same in the end.
The question now is: is this possible to fix in the near future? Maybe buying the Exynos will then be beneficial if the devs take over. I won't bet on Samsung introducing mind-blowing improvements in that department in upcoming firmwares.
Xdenwarrior said:
The question now is: is this possible to fix in the near future? Maybe buying the Exynos will then be beneficial if the devs take over. I won't bet on Samsung introducing mind-blowing improvements in that department in upcoming firmwares.
The code for the other driver is there in the kernel; it's just not used. No idea. It's not like we need Samsung for it: I already talked to a developer at Linaro about some incomplete switcher code that's currently getting the green light to be made public. But who knows how long that will take.
Whatever the case, I gather that they can't just leave it in the current state.
AndreiLux said:
The code for the other driver is there in the kernel; it's just not used. No idea. It's not like we need Samsung for it: I already talked to a developer at Linaro about some incomplete switcher code that's currently getting the green light to be made public. But who knows how long that will take.
Whatever the case, I gather that they can't just leave it in the current state.
Is there any way to just disable the Cortex-A15s altogether yet, just to see how well the Cortex-A7s perform in simple texting, browsing and calling, and what the battery life would be like on that? (The Cortex-A7 only uses something like 200 mW, as opposed to 1000 mW for the Snapdragon.) I know you won't be able to game. How often does the Cortex-A15 kick in? I would suspect much worse battery life with incomplete drivers doing the switching if it kicks in very often. But PocketNow reports very similar battery results to the Snapdragon variant, which I find odd.
Xdenwarrior said:
Is there any way to just disable the Cortex-A15s altogether yet, just to see how well the Cortex-A7s perform in simple texting, browsing and calling, and what the battery life would be like on that? (The Cortex-A7 only uses something like 200 mW, as opposed to 1000 mW for the Snapdragon.) I know you won't be able to game. How often does the Cortex-A15 kick in? I would suspect much worse battery life with incomplete drivers doing the switching if it kicks in very often. But PocketNow reports very similar battery results to the Snapdragon variant, which I find odd.
Use any app to limit the CPU frequency to 600MHz. That'll limit it to the A7 cores running at 1200MHz. Basically you can just use CPU Spy. Everything <= 600 is the A7s mapped at half frequency; everything above that is the A15s at 1:1 frequency.
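In other words, the frequencies CPU Spy reports can be decoded like this; a small helper sketch based on the mapping just described (the app itself doesn't do this for you):
Code:
def decode_reported_freq(reported_mhz: int):
    """Map a frequency reported by CPU Spy / cpufreq on the Exynos 5410 to
    (cluster, actual clock): <= 600 MHz is an A7 shown at half its real
    clock, anything above is an A15 at 1:1."""
    if reported_mhz <= 600:
        return ("A7", reported_mhz * 2)
    return ("A15", reported_mhz)

print(decode_reported_freq(600))   # ('A7', 1200)
print(decode_reported_freq(1600))  # ('A15', 1600)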
As for PocketNow: irrelevant. The difference is what could be instead of what is; the Snapdragon doesn't play a role in the discussion here.
WOW, that sucks.
Samsung rushed it too much and ruined it :/
AndreiLux said:
Use any app to limit the CPU frequency to 600MHz. That'll limit it to the A7 cores running at 1200MHz. Basically you can just use CPU Spy. Everything <= 600 is the A7s mapped at half frequency; everything above that is the A15s at 1:1 frequency.
As for PocketNow: irrelevant. The difference is what could be instead of what is; the Snapdragon doesn't play a role in the discussion here.
Hey thanks, but I don't have the S4 to test it with, since I'm still debating which one to get. I live in Canada, so the only version I can get a lot cheaper on contract is the LTE Snapdragon, but I wouldn't mind getting the Exynos since it has potential. Besides, 16GB of internal storage isn't enough for me. So that's why I'm asking whether you've seen any improvement in battery life when only the Cortex-A7s ran. If the A7 doesn't do much for power consumption, then there's no point spending 800 bucks and losing LTE altogether...
@bala_gamer, please see my PM, it's important...
Sent from my GT-I9500 using xda premium
Oh wow. Just got word (without further in-depth explanation) that this might actually be a hardware limitation. Coming from a reliable source.
No words...
AndreiLux said:
Oh wow. Just got word (without further in-depth explanation) that this might actually be a hardware limitation. Coming from a reliable source.
No words...
Can you elaborate a bit more pls?
Sent from my GT-I9500 using Tapatalk 2
That's not what Samsung advertised for the Exynos...
http://www.youtube.com/watch?v=U6UNODPHAHo
Is it possible that we have a simpler Exynos 5 system, technically closer to the Exynos 5 Quad (plus 4 A7 cores) than a real seamless octa-core system? It was strange reading that "Octa-core manufacturing starts in Q2" (April-June) and then seeing Octa-core versions hitting reviewers in early April; that's way too short a time frame. Maybe this is a first-gen 5410. In any case, performance and current-state battery life beat the Snapdragon version, even if only just.
AndreiLux said:
Basically the CPU is running on the cluster migration driver,
Click to expand...
Click to collapse
wtf? Well done Samsung... This is ridiculous...
AndreiLux said:
Oh wow. Just got word (without further in-depth explanation) that this might actually be a hardware limitation. Coming from a reliable source.
No words...
WHAT THE ****??!!
Actually, WTF is a massive understatement here...!!!
Please can you give more info about this matter whenever possible? This is very serious...
Is it a specific hardware limitation? Something that Samsung specifically did in the GS4 (I9500)?
Because this can't be a generic Exynos Octa limitation. It makes no sense... unless everything we've read from Samsung and ARM about the Exynos Octa is completely misleading...
A hardware limitation..? They advertised the functionality, and to then release a device without it is just plain stupid. Hopefully it is just a kernel issue and can be resolved quickly.
Sent from my Galaxy Nexus using xda premium
Samsung will probably implement it in their Note 3? It's a conspiracy to make people buy their next Note phone, but this news is sad.
Sent from my GT-I9100 using Tapatalk 2
Now what is this all about? Is this a very serious issue?
So it's either all A15s or all A7s?
So would the 'Octa' really be a better choice than the S600? That should be powerful enough... and the S600 is pretty power efficient too.
rkial said:
So it's either all A15s or all A7s?
So would the 'Octa' really be a better choice than the S600? That should be powerful enough... and the S600 is pretty power efficient too.
What I understood is that either the full cluster of A7s or the full cluster of A15s is used, based on the load; dynamically turning on one or two A15 cores to work alongside the A7s may not be possible, it seems.
I may be wrong; waiting for a more elaborate explanation from Andrei.
Sent from my GT-I9500 using Tapatalk 2
bala_gamer said:
What I understood is that either the full cluster of A7s or the full cluster of A15s is used, based on the load; dynamically turning on one or two A15 cores to work alongside the A7s may not be possible, it seems.
I may be wrong; waiting for a more elaborate explanation from Andrei.
Sent from my GT-I9500 using Tapatalk 2
I was always under the impression this was the intention of Samsung's particular implementation of it. I thought it was common knowledge that Samsung's version worked on a 4 (A15) or 4 (A7) basis.
Maybe he was talking about the ability to change that.

No True Octa Core (HMP Update) For Exynos Version.

Sad news:
A Sammy exec confirmed this:
http://www.gsmarena.com/samsung_galaxy_s4_and_note_3_wont_get_true_octacore_update-news-6908.php
No wonder they released videos of HMP on the 5420 instead of the Note 3.
http://www.youtube.com/watch?v=Zwbeb08W27U&feature=youtu.be&ism=SASep1513Facebook1
This is cheating, but it's still not going to affect my decision to buy the Note 3.
What do you guys think?
Yet people are still buying and will be buying samsung devices.
Oh noez, no 8 cores in your phone? How will we survive? Seriously, at some point cores will become ubiquitous, nobody will really care once the OS and software just uses available resources. We're getting closer, but I doubt your phone is doing anything that'd really require 8 cores to do.
khaytsus said:
Oh noez, no 8 cores in your phone? How will we survive? Seriously, at some point cores will become ubiquitous, nobody will really care once the OS and software just uses available resources. We're getting closer, but I doubt your phone is doing anything that'd really require 8 cores to do.
Well, HMP/GTS is not just about running all 8 cores at a time; it's about firing up the required number of cores in any combination of A7s and A15s. Currently, AFAIK, it's cluster migration (the worst implementation of big.LITTLE).
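A toy sketch of how the three schemes differ in which cores can be online at once. This is a conceptual illustration, not the kernel scheduler; core counts are for the 4+4 Exynos 5410.
Code:
def cluster_migration(heavy_tasks: int, light_tasks: int):
    """All four cores of exactly one cluster are active at any time."""
    return ["A15"] * 4 if heavy_tasks > 0 else ["A7"] * 4

def core_migration_iks(heavy_tasks: int, light_tasks: int):
    """Four logical CPUs; each A7/A15 pair picks one member independently,
    so a mix is possible but never more than four cores in total."""
    n_big = min(heavy_tasks, 4)
    return ["A15"] * n_big + ["A7"] * (4 - n_big)

def hmp_gts(heavy_tasks: int, light_tasks: int):
    """Any combination of the eight physical cores may be online, e.g. two
    A15s for heavy threads plus three A7s for background work."""
    return ["A15"] * min(heavy_tasks, 4) + ["A7"] * min(light_tasks, 4)

print(cluster_migration(1, 3))   # ['A15', 'A15', 'A15', 'A15']
print(core_migration_iks(1, 3))  # ['A15', 'A7', 'A7', 'A7']
print(hmp_gts(2, 3))             # ['A15', 'A15', 'A7', 'A7', 'A7']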
jsriz said:
Well, HMP/GTS is not just about running all 8 cores at a time; it's about firing up the required number of cores in any combination of A7s and A15s. Currently, AFAIK, it's cluster migration (the worst implementation of big.LITTLE).
I think the Note 3 is running core migration, which is much better than cluster migration.
system.img said:
I think the Note 3 is running core migration, which is much better than cluster migration.
Wrong. The Note 3 came with cluster migration.
The Note 3 is still running cluster migration and I doubt this will change anytime soon. Their drivers are still out of date for normal IKS, so I won't even bother trying to get that running on a device I don't own. And frankly, nobody else is interested in doing the work.
ff - sorry can't delete
Mod Edit
Duplicate thread is closed
Original is HERE
malybru
Forum Moderator
