Question S22 camera [Suspicious moon photos] - Samsung Galaxy S22 Ultra

Hey gang,
I would love a developer or anyone in the know to investigate this potential bull**** moon filter that kicks in automatically when you zoom in on the moon. This first came to my attention on the Note 9. Basically it looks like the camera app is detecting the moon phase, then after a bright flash it loads the appropriate phase image.
While zooming in, the light and fuzziness just go away and you have this clear image: no light, no stars or clouds, just this perfect moon shot. Tonight I tested through a dirty window and I got the same clear photo. I took two and noticed they are different hues between the shots. So... anyone already know what's up or care to investigate? Maybe there are some moon-phase photos in a directory somewhere.
If you need anything from me please hit me up.
Thank you

I remember reading posts from Samsung users claiming the moon pics from Samsung phones are very much real, unlike the ones from Huawei and other Chinese brands.
I'm not an expert on the matter because I never cared much about this stuff; personally, I always thought all brands use some sort of AI for their moon shots.

The moon is super bright; when I focus my DSLR with a zoom lens on the moon, I can't see any stars either. To photograph the moon, the camera uses daylight shutter settings.

ryant35 said:
The moon is super bright; when I focus my DSLR with a zoom lens on the moon, I can't see any stars either. To photograph the moon, the camera uses daylight shutter settings.
Oh ****, on a DSLR too huh? Maybe this is just how it works then. Thank you

JAH0707 said:
Oh ****, on a DSLR too huh? Maybe this is just how it works then. Thank you
At night, the moon is the brightest object in the sky. The human eye can compensate for the difference, which lets us see the stars and the moon at the same time. Most DSLR cameras, and all phone cameras, can't capture both in a single exposure. Pretty much every moon shot where you see both the moon and the stars is a composite: the photographer exposes for the moon, then exposes for the stars, and then overlays one shot on the other.

There was an article written when the S21 Ultra was first released that compared the S21U's moon shots to shots taken with a DSLR at the same time. They found that the S21U moon shots were real and there was no overlay used.

JAH0707 said:
Hey gang,
I would love a developer or anyone in the know to investigate this potential bull**** moon filter that kicks in automatically when you zoom in on the moon. This first came to my attention on the Note 9. Basically it looks like the camera app is detecting the moon phase, then after a bright flash it loads the appropriate phase image.
While zooming in, the light and fuzziness just go away and you have this clear image: no light, no stars or clouds, just this perfect moon shot. Tonight I tested through a dirty window and I got the same clear photo. I took two and noticed they are different hues between the shots. So... anyone already know what's up or care to investigate? Maybe there are some moon-phase photos in a directory somewhere.
If you need anything from me please hit me up.
Thank you
View attachment 5658259 View attachment 5658261 View attachment 5658263 View attachment 5658265
Photographing stars with good detail takes about 20-30 seconds. A telephoto pic of the moon takes only a fraction of a second since it's so bright so you won't see much of any star detail in the background.
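For a rough sense of how big that exposure gap is, here's a back-of-envelope sketch in Python. It uses the common "Looney 11" rule of thumb for the moon (f/11 at a shutter speed of 1/ISO) against an illustrative untracked star-field exposure; the 20 s / f/2.8 / ISO 3200 figures are assumed example numbers, not anything measured in this thread:

import math

def exposure_value(aperture_f, shutter_s, iso):
    """EV referenced to ISO 100, adjusted for the ISO actually used."""
    return math.log2(aperture_f ** 2 / shutter_s) - math.log2(iso / 100)

# Moon via the "Looney 11" rule of thumb: f/11, shutter = 1/ISO (ISO 100 -> 1/100 s).
moon_ev = exposure_value(11, 1 / 100, 100)

# An illustrative untracked star-field exposure: f/2.8, 20 s, ISO 3200.
stars_ev = exposure_value(2.8, 20, 3200)

print(f"Moon EV  ~ {moon_ev:.1f}")                    # ~13.6
print(f"Stars EV ~ {stars_ev:.1f}")                   # ~-6.4
print(f"Gap      ~ {moon_ev - stars_ev:.0f} stops")   # ~20 stops

A gap of roughly 20 stops is far beyond what any single exposure can hold, which is why moon-plus-stars images are almost always composites.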

Funny, I just took a couple of photos the other day and thought the same thing, then I saw this post.

Huawei fakes it for sure and OP's theory is spot on for Huawei.
Samsung - well, I didn't expect that from Samsung.

JazonX said:
Huawei fakes it for sure and OP's theory is spot on for Huawei.
Samsung - well, I didn't expect that from Samsung.
Samsung isn't faking it. Here is a shot I took a few weeks ago. I had to do some Lightroom work.
I've taken enough pictures of the moon to know.

If you really need to go down the rabbit hole, here is the article that confirms Samsung is not faking it unlike Huawei. - https://www.gsmarena.com/myth_debunked_samsungs_100x_zoom_doesnt_fake_moon_photos-news-47487.php

A few months late, but there is a video of someone comparing the S22's telephoto capabilities with those of the Nikon P1000 and he did confirm that the S22 applies a filter to the moon with the 30x zoom. It's pretty obvious once you see the difference side by side. If you watch the video you'll see it relatively clearly. (Just look for S22 vs P1000 on YouTube, creator is named Versus, skip to 5 minutes)
The best method to test it for yourself is to take a photo of the moon through some tree branches. Some people tried this and got the filter to only partially cover the moon this way. Two of the pictures posted here also have this smooth, artificial look, but it seems like the AI got the colors right.

Lunar eclipse, taken 1.5 months ago in Tokyo

Took this last night and did a remaster on it... noticed a square, like the moon was pasted on top of the black background... looks faked to me...

The original picture and then remastered.

Quite possibly the reason it's "debunked" is that the people trying to debunk it are taking pictures while not pointing in the direction of the moon. The phone has sensors that let it know what direction it's facing, where it's looking, and so on. So if you're taking pictures of a ping-pong ball in your house, the phone knows you aren't pointing it in the actual direction of the moon.

This is interesting; this would be an awesome conspiracy. It wouldn't be too hard to debunk: find a small circular object that matches the size and brightness of the moon (use an LED or a pen light), use full zoom mode, and aim the phone exactly where the moon should be.

Shot some days ago with my 22U/Exynos (edited with Lightroom on Android)
Just stepped out of the door, saw the moon, saw where the plane was flying, and took my chance.

phatmanxxl, you could probably spoof the phone's sensors to trick it into thinking it's looking at the moon... not sure which sensors would need to be spoofed, but I'm guessing GPS would be one of them.

Related

Better camera app for panoramic shots on HD2?

Is there a camera app that has a better panoramic feature than the one that's already built into the HD2? The one that comes with the factory camera app doesn't stitch together well. Auto stitching would be great.
Agree on that, "auto stitching" should really be standard. Almost impossible to make a perfect panoramic photo at the moment...
I really hope someone can make an app with auto stitching, because the HD2 takes beautiful pictures. Having a panoramic feature that actually works right would make it 100x better.
Best solution is to take individual photos that overlap, transfer them to a PC, then make panoramas with one of the many programs available for desktop.
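If you'd rather script that desktop step than use a GUI program, OpenCV's built-in stitcher handles the feature matching and blending for you. A minimal sketch, assuming your overlapping shots are copied into a folder; the path and filenames here are just placeholders:

import glob
import cv2  # pip install opencv-python

# Load the overlapping shots (placeholder path); the stitcher matches
# features between frames itself, so exact ordering isn't critical.
images = [cv2.imread(p) for p in sorted(glob.glob("pano_src/*.jpg"))]

stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
status, pano = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", pano)
else:
    print(f"Stitching failed, status code {status}")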
I concur. In fact, the best panorama software I had the pleasure to use was (believe it or not) Windows Live Photo Gallery.
Here are some examples I shot with my Canon 720IS...
I challenge you to find any seams; the image was made of 8 images stitched together.
chvvkumar said:
I concur. In fact, the best panorama software I had the pleasure to use was (believe it or not) Windows Live Photo Gallery.
Here are some examples I shot with my Canon 720IS...
I challenge you to find any seams; the image was made of 8 images stitched together.
Oh, it's easy to find seams in that picture
Bottom left, something is wrong with the tiles on the floor and the handrail (or whatever the right English word for it is).
Also, look at the trees from right to left; you'll notice something on the far right side, and the colours are different too
But that's a pretty good result for Windows Live Photo Gallery!
I still use Photomerge in Photoshop for making panoramas though.
I haven't seen any camera program on a mobile phone do good stitching, and I don't think they are capable of stitching an XX-MPix pano together well...
I'd be happy if anyone could prove me wrong on this!
Regards,
Lukas
You got me!!! (though, I still can't seem to find the tree anomaly that you speak of )
As for the colour correction, the source images did not go through any post-processing; they were straight from the camera (in full manual with preset white balance, exposure compensation, shutter speed, aperture as well as ISO). I guess the colour variation might be the result of the weird angle of the sun.
But, I guess that would be 'good enough' for 99% of users here.
Also, if you are using a mobile for creating panoramic images, that would be the least of your problems.
My wife's iPhone has a pretty good app. It's called "auto stitch", and it actually works pretty well. Oh well, if there's no mobile app for the HD2 I guess I will make panoramic images with Photoshop. Would be nice to just have it done right on the phone.
Are you looking for something that shows you markers whilst you take the pic, or just something that can stitch pics together?
I think one of the problems is that HTC's camera driver (which you'd need access to in order to write such an app) is a bit of a closed cupboard for developers. It's easy in code to access the images after they've been taken (so it would be possible to have an app that could stitch existing photos), but not easy (impossible?) to access the real-time video feed so that you could overlay stitching markers whilst taking the pics. If you do it the way you're meant to in code, I think you only get a really low-res version of the feed, from what I've read. Probably HTC being lazy (getting it to work for them but not for any developers). Shame, as I reckon we'd have a few iPhone-esque augmented-reality-type apps if things were a bit clearer on that front.
James
I found a way to get pano pics on the HTC HD2!
Just install this cab... I tried to get it to work and it didn't work... I uninstalled it, and afterwards it kept on working fine...
I tried the above cab on my HD2
and in panorama mode it takes a very light/bright first image, then freezes and reboots the phone in about 30 seconds. Just FYI.
Strange, because I enabled the panoramic photo function with my own registry mod and this mode works very well (of course the best results come when you set light, ISO, etc. manually).
Unfortunately, panoramic photos made in this mode have low resolution, so I suggest it is better to take a few normal photos facing different directions and join them in some piece of software on a PC, because this gives much better results.

【Huawei Mate20】Huawei AI Photography is amazing

I wonder: why do you all like to take photos? And what is the reason?
If you ask me, I would explain it with a German proverb: "What happens only once might as well never have happened." The meaning of photos may be that they provide evidence of existence for things that have only happened once. However, in most cases, the colors captured by the phone are far from what the eye saw, so the pictures cannot fully express our state of mind at that time.
I have tried a lot of mobile phones, but the color performance was never satisfactory. So this time I used the AI camera mode built into the new Huawei phone. After taking a few photos, I found that it really is perfect.
1. Blue sky
People may take photos of the blue sky and white clouds when they are traveling or in a good mood. However, the color of the sky on the phone is frequently far from the scenery seen with our own eyes. With Huawei's AI mode, the clarity of the image increases significantly, making the sky look clearer and more transparent.
Huawei AI (Picture 1), Note 9 (Picture 2), iPhone X (Picture 3)
2. Cute pet
People may take a lot of pictures of their pets to record precious moments, but it is difficult to capture the real colour of pets. The pictures taken in Huawei's AI mode look alive; let us see the comparison.
Huawei AI (Picture 1), Note 9 (Picture 2)
3. Beautiful flowers
Colourful flowers are difficult to render on a phone, and the pictures taken with Huawei AI are the most beautiful. It looks as if the flowers are about to leap out. The leaves are also greener and more transparent.
Huawei AI (Picture 1), Note 9 (Picture 2), iPhone X (Picture 3)
4. A beautiful underwater world
Pictures taken underwater with a phone are generally not very good; the reason is that the photos lack color depth and easily end up as large gray or blue-green areas. Phone manufacturers have added color adjustment mechanisms to improve underwater photography. In comparison, the results from Huawei's AI camera are relatively better: the overall brightness of the underwater picture is improved, the definition is better, and the colors are fuller.
Huawei AI (Picture 1), Samsung S8 (Picture 2), iPhone X (Picture 3)
Taking photos to remember our lives is wonderful, but it is not easy to express our mood clearly through photos. Using the Huawei AI camera mode may be a good choice: it can present a more beautiful and clearer world, and when you look back, the colorful, clear pictures will definitely bring you back to the mood of that moment.
The cat's body is blurred; I prefer the other one.
Sent from my EVR-L29 via Tapatalk
Kinda preaching to the choir here, friend. We know the M20s are great, but so are the others too.

Question Camera trouble

So can anybody help with this disaster?
Severely cropped to not nuke the servers.
The more distant classic sunset was fine, but then we got an amazing and huge red sun just before it set. The whole ball was the colour of the edge or the stripe in this shambles of a shot.
Other than turning off AI, what else should I do?
I have to say my old Mate 40 RS Porsche Edition would have got the shot no problem. Very disappointed in my new S22 Ultra.
This was one of those chances you possibly never see again. Gutted.
This is due to infrared blooming. The sensors are very sensitive to IR, so there is an IR-blocking filter in the system. Unfortunately, this does not block 100% of the IR from bright objects like the Sun. You can see this effect by taking a shot of a hot electric stove burner with the camera; you don't even have to take a picture to see the effect, just look at the screen. I've attached a shot here; the stove was not bright pink to the eye, that is due to IR leaking through the filter. Because IR has a longer wavelength than red, the light is focused by the lens farther away, and it can cause a bloom around the Sun like that. Other cameras have the same problem, and it depends on the strength of the IR filter. There are disadvantages to making the IR filter stronger, like color distortion in the other colors, so it is always a compromise. There isn't much that can be done about it.
edit - I have to add that taking a picture of the low Sun like that is just about the worst case, and you probably won't notice it anywhere else. When the Sun is low, most of the visible light is being blocked so you can actually stand to look at it (still not good to stare), but almost all of the IR is still getting through, so the IR component is a lot larger than usual. You can even get the same effect on a pro DSLR.
View attachment 5573961
Many thanks for the detailed reply. So turning off the AI assist stuff wouldn't have helped then. Anything in Pro mode or Expert Raw that could help?
stewarta13wsb said:
Many thanks for the detailed reply. So turning off the AI assist stuff wouldn't have helped then. Anything in Pro mode or Expert Raw that could help?
Probably not, since it is an optical effect. You might be able to 'shop it out. On a pro camera, you can use a very small aperture, which brings the IR light more into focus, but that's not an option on the phone, which has a fixed aperture. I've got a very similar shot from my old Note 20 Ultra that had the same problem. Below is a cropped-in shot from an Olympus mirrorless camera that shows the same red-ring effect, just to a smaller degree. It doesn't help that the "100x" zoom works by cropping the 10x image; that amplifies small defects like this.
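To put rough numbers on that cropping point, here's a small Python illustration; the 10 MP figure for the telephoto sensor is just an assumed example for the math, not a confirmed spec:

# Rough illustration of why "100x" crops away detail (assumed sensor numbers).
optical_zoom = 10        # the periscope telephoto's optical magnification
target_zoom = 100        # the advertised maximum zoom setting
sensor_mp = 10.0         # assume a 10 MP telephoto sensor for this example

digital_factor = target_zoom / optical_zoom      # extra zoom done purely by cropping
used_mp = sensor_mp / digital_factor ** 2        # usable area shrinks with the square

print(f"Digital crop factor: {digital_factor:.0f}x")
print(f"Pixels actually used: ~{used_mp:.2f} MP of {sensor_mp:.0f} MP")
# ~0.10 MP, which then gets upscaled back to the full output resolution,
# so any lens or filter defect in that tiny crop is magnified along with it.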
stewarta13wsb said:
Many thanks for the detailed reply. So turning off the AI assist stuff wouldn't have helped then. Anything in Pro mode or Expert Raw that could help?
Only a stronger UV/IR cut filter would help.
I shoot IR/UV and full-spectrum with a DSLR and RAW video cameras... It's a balance between visible light and a correct look with most sensors.
Unless you use (physical) filters, there's nothing you can really do. Most cameras are very sensitive to IR light, and blocking IR/near-IR completely wipes out a lot of deep reds and throws off the other colors as well. Sensors do not see at all like our eyes do.
Channel-swapped 590 nm IR shot attached.

General Google Pixel 7 Pro Camera Samples

Google shares camera samples of Pixel 7 and Pixel 7 Pro
Macro Focus on Pixel 7 Pro
18 new items · Album by Alexander Schiffhauer
photos.google.com
Night Sight on Pixel 7 & Pixel 7 Pro
8 new items · Album by Alexander Schiffhauer
photos.google.com
Photo Unblur
16 new items · Album by Isaac Reynolds
photos.google.com
Real Tone on Pixel 7 & Pixel 7 Pro
16 new items · Album by Alexander Schiffhauer
photos.google.com
Super Res Zoom on Pixel 7 Pro
27 new items · Album by Alexander Schiffhauer
photos.google.com
Cinematic Blur
5 new items · Album by Isaac Reynolds
photos.google.com
Looks great!
What happened with the second Macro Focus picture? The one underwater? Look at the left hand, and the thumb of the right hand.
It's either an elaborate joke, or maybe a sign that Google manipulates/edits the photos and forgets about fixing botched edits, or the software algorithm for macro is just broken. The picture has obvious problems, it's "botched" - I'm surprised that nobody over at Google noticed and that they actually released this?
Not to mention that the algorithm tried to create a nail on the underside (!) of the index finger of the left hand, which of course is also botched/broken; it appears the algorithm is not capable of telling the top of the hand from the underside.
Edit: included the picture in question from the above album

			
				
You do realize that the hands are under water?
Could you mark the places where you think there is something wrong?
faxteren said:
You do realize that the hands are under water?
Could you mark the places where you think there is something wrong?
I guess that's what he is talking about. But I would say that's caused by the sun and the water.
1. The distortion of the skin and the position and shade of the bubble don't make any sense. The skin is seemingly stretching because of the bubble, even though that's of course impossible. A bubble would float above the skin and only cast a minor shadow on it, if photographed from above.
2. See above.
3. The left thumb is completely distorted?! The left side of the thumb is obviously botched. The thumb stretches unnaturally around the bubble (that's not what thumbs do).
4. Are we blind? Deploy the garrison! I'm surprised that I have to explain this one. It's obvious. A titanic portion of the right thumb is simply... missing. Gone. Reduced to atoms. As I said, either it's a hilariously bad manual edit job (gone wrong), or the algorithm is broken and can't handle all the reflection underwater. Also notice how, where the thumb is missing skin, this skin-toned/pinkish "thingy" (looks a bit like a digital seahorse) mimics the same curvature of the missing skin: it's either a bad edit job, or the algorithm failed (again). (Think of all the fake Instagram bikini shots, where everyone has a curvy backside but there is often a massive unnatural curvature in the photo/mirror that gives away the bad edit.)
4 in close up
To make it more obvious: here (in green) there should be... skin. Meat. Man flesh. "The thing that humans have to protect their innards." Instead, this guy is missing bone, marrow, tissue, nerves, blood, skin and all the other things that make a finger function. The only reasonable explanation would be that the finger hovers underneath the stone, but that's physically impossible (given the existing stone bed, the position of the flower, the partly hidden stone, and the other fingers) and it wouldn't explain the distortion of the curved "thingy" next to the missing part.
To me, this looks like a simple Photoshop montage gone wrong, where the editor failed to use e.g. the Clone Stamp or History Brush (to remove portions of a stock item placed on a background) or made a mistake when simply copying (I've created hundreds of artworks, see my DeviantArt, and I've seen this plenty of times). That... or it's just the algorithm failing to handle reflections.
Morgrain said:
1. The distortion of the skin and the position and shade of the bubble don't make any sense. The skin is seemingly stretching because of the bubble, even though that's of course impossible. A bubble would float above the skin and only cast a minor shadow on it, if photographed from above.
2. See above.
3. The left thumb is completely distorted?! The left side of the thumb is obviously botched. The thumb stretches unnaturally around the bubble (that's not what thumbs do).
View attachment 5730753
4. Are we blind? Deploy the garrison! I'm surprised that I have to explain this one. It's obvious. A titanic portion of the right thumb is simply... missing. Gone. Reduced to atoms. As I said, either it's a hilariously bad manual edit job (gone wrong), or the algorithm is broken and can't handle all the reflection underwater.
View attachment 5730747
4 in close up
View attachment 5730755
To make it more obvious: here (in green) there should be... skin. Meat. Man flesh. "The thing that humans have to protect their innards." Instead, this guy is missing bone, marrow, tissue, nerves, blood, skin and all the other things that make a finger function.
View attachment 5730757
Ok.
I think it's only the light that "bends" around the bubbles.
faxteren said:
Ok.
I think it's only the light that "bends" around the bubbles.
That might explain 1 and 2, but not 3 and most certainly not the missing flesh of 4.
None of them have metadata saying it's the Pixel 7 Pro, nor the focal length in millimetres. Especially the telephoto ones.
I did some math, and even if it's a 5x optical, apparently it's equivalent to an 80 mm lens on a full-frame camera... which would mean the new telephoto is around 14 mm.
In contrast, the Pixel 6 Pro's 4x is 19.00 mm, or 104 mm equivalent in full frame.
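As a quick sanity check on that arithmetic, here's the relation in Python, using only the numbers quoted in the post above (so it inherits whatever error those EXIF readings have):

# crop factor = full-frame-equivalent focal length / actual focal length
p6pro_actual_mm = 19.0     # Pixel 6 Pro telephoto, actual focal length (from the post)
p6pro_equiv_mm = 104.0     # its quoted full-frame equivalent

crop_factor = p6pro_equiv_mm / p6pro_actual_mm      # ~5.5 for a sensor of that size

# If a Pixel 7 Pro shot really reported ~80 mm equivalent and the sensor were similar,
# the actual focal length would be roughly:
p7pro_equiv_mm = 80.0
p7pro_actual_mm = p7pro_equiv_mm / crop_factor

print(f"Implied crop factor: {crop_factor:.2f}")                  # ~5.47
print(f"Implied actual focal length: {p7pro_actual_mm:.1f} mm")   # ~14.6 mm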
benleonheart said:
None of them have metadata saying it's the Pixel 7 Pro, nor the focal length in millimetres.
Interesting. Didn't even bother to check that. So these were most certainly not uploaded directly from a Pixel 7 Pro, but were at least scrubbed of metadata on a PC and uploaded from one, which you only need to do if you
a) want to hide a real DSLR camera shot and try to falsely advertise it as a phone shot (*Samsung, cough cough cough*),
b) want to protect your privacy and/or
c) try to hide a digital edit job.
Spoiler: links to companies faking phone ad photos
*https://www.businessinsider.com/sam...omote-galaxy-a8-star-camera-2018-12?r=US&IR=T
Samsung caught using another DSLR photo to sell a phone camera | Engadget
No, manufacturers still haven't stopped using DSLR photos to fake phone camera shots.
www.engadget.com
Samsung used my DSLR photo to fake their phone’s “portrait mode”
Earlier this year, Samsung was busted for using stock photos to show off capabilities of Galaxy A8’s camera. And now they did it again – they used a stock image taken with a DSLR to fake the camera’s portrait mode. How do I know this, you may wonder? Well, it’s because Samsung used MY photo […]
www.diyphotography.net
Samsung isn't the only one doing this, btw. The industry loves to fake and lie.
Second example:
Huawei gets caught faking DSLR shots as smartphone pictures in a commercial
Oops
www.theverge.com
Huawei caught passing off DSLR pictures as phone camera samples | Engadget
Huawei doesn't have the best track record when it comes to advertising.
www.engadget.com
I can share a few photos I have taken over the last couple of weeks on the 7 series now.
wildlime said:
I can share a few photos I have taken over the last couple of weeks on the 7 series now.
Thanks for sharing these!
Have you tried portrait mode, particularly at 2x? There was a problem on the Pixel 6 series that caused ugly, EXTREMELY oversharpened pictures in portrait at 2x zoom. I'm curious if it still happens on the 7 series.
jericho246 said:
Thanks for sharing these!
Have you tried portrait mode, particularly at 2x? There was a problem on the Pixel 6 series that caused ugly, EXTREMELY oversharpened pictures in portrait at 2x zoom. I'm curious if it still happens on the 7 series.
I haven't yet really, sorry.
hell, my OnePlus 7 takes the same photos. Step it up, Googley!
buschris said:
hell, my OnePlus 7 takes the same photos. Step it up, Googley!
No way your cheap IMX586 can do that.
jericho246 said:
There was a problem on the Pixel 6 series that caused ugly, EXTREMELY oversharpened pictures in portrait at 2x zoom. I'm curious if it still happens on the 7 series.
Well it looks like the difference is minimal at best. Portrait @ 2x still looks bad. I'm disappointed.
(relevant part @ 15:00)
If you want to know more,
go and read this article by Adam Conway:
Google Pixel 7 Pro camera review: Top-tier camera flagship that improves upon the Pixel 6 series
I'll see if I can find my photo of the stars I took, and a couple others. The S22 Ultra is definitely bad. I'm giving this to mom and hopefully the P7 is decent
Nr. 1 at DxOmark!
Google Pixel 7 Pro Camera test : is it the best camera out there ?
From photo to video to bokeh, get to know everything about the latest Google pixel 7 pro camera from Dxomark’s comprehensive test results!
www.dxomark.com

Question Could have sworn there were HDR settings, what happened to them?

Have had a couple of bright daylight shots come out like garbage: low detail, low quality, smudged detail. So I happened to look in settings and there's no HDR/HDR+. Maybe I'm getting it mixed up with my previous Pixels, but I could have sworn there were HDR settings in Google Camera on the 6a?
Strange. Reddit says...
Google is even more cryptic.
Raw mode would be your next and better option. Those files will require post-processing, which is done using a photo app that has an adjustable contrast curve option.
Thanks! Raw wouldn't have helped this shot; it looks like it needed the stacking that HDR provides. Just making sure I'm not crazy: the option was there before, no? It's definitely still kicking in in some cases, but it seems to be automatic, left up to Google's algorithms. Unfortunately that leaves us with a lot of garbage shots.
Compared to a Canon pro cam, all smartphone cams are a pain to use and limited, especially for setup options and speed. I lose a lot of shots because of that. I never liked HDR; a properly exposed shot doesn't need it. Raws give you at least 2 full f-stops of exposure and WB correction.
No option to shoot multiple burst exposures at different exposure settings either on smartphones.
Even on my Samsung N10+ the HDR setting when toggled on will decide for you if it will be used. That can and does screw up shots when on... sometimes. No real control
Perhaps Google will restore that missing feature soon. Rather sloppy of them. Not nearly as inept as their lame idea to implement forced scoped storage, though.
That's a bomb; I'm still running Pie and 10 to evade that terror.
The issue is these tiny sensors. The photo stacking / computational photography helps enormously; when it doesn't kick in, it can be very bad. You're right that RAW can definitely help some, but there's only so much it can do. Here are two shots, one where the computational processing kicked in and one where it apparently didn't. These were taken on the fly (my gf and her kid, not random people), so obviously not great composition, just an example of what I mean.
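For anyone curious what that stacking buys you, here's a deliberately minimal sketch of the idea in Python: it just averages a burst of already-aligned frames (placeholder filenames), which cuts random sensor noise by roughly the square root of the frame count. Real pipelines also align the frames and merge them robustly to handle motion, so treat this as an illustration, not what Google actually ships:

import glob
import cv2          # pip install opencv-python
import numpy as np

# Load a burst of already-aligned frames (placeholder path/filenames).
frames = [cv2.imread(p).astype(np.float32) for p in sorted(glob.glob("burst/*.png"))]

# Averaging N frames reduces random noise by roughly sqrt(N).
stacked = np.mean(frames, axis=0)

cv2.imwrite("stacked.png", np.clip(stacked, 0, 255).astype(np.uint8))
print(f"Merged {len(frames)} frames")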
Not available on the Pixel 4a and above. It is ON by default.
damian5000 said:
The issue is these tiny sensors. The photo stacking / computational photography helps enormously; when it doesn't kick in, it can be very bad. You're right that RAW can definitely help some, but there's only so much it can do. Here are two shots, one where the computational processing kicked in and one where it apparently didn't. These were taken on the fly (my gf and her kid, not random people), so obviously not great composition, just an example of what I mean.
View attachment 5770083
View attachment 5770085
That's not much difference.
On a raw file (or maybe even a JPEG) you could do that and better using an adjustable contrast curve with settable adjustment points. Takes a bit of practice and a color-calibrated monitor.
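For what that looks like in practice, here's a minimal sketch of a point-based contrast curve in Python: a handful of control points turned into a lookup table with straight-line interpolation (real editors fit smoother splines, and the points and filename here are just placeholders):

import numpy as np
import cv2   # pip install opencv-python

img = cv2.imread("photo.jpg")   # placeholder input file

# Control points (input level -> output level): a gentle S-curve that deepens
# shadows a touch and brightens highlights; an editor lets you drag these.
points_in = np.array([0, 64, 128, 192, 255], dtype=np.float32)
points_out = np.array([0, 48, 128, 210, 255], dtype=np.float32)

# Build a 256-entry lookup table from the curve and apply it to every pixel.
lut = np.interp(np.arange(256), points_in, points_out).astype(np.uint8)
curved = cv2.LUT(img, lut)

cv2.imwrite("photo_curved.jpg", curved)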
Wow man, the difference is massive. Look again at the detail in all areas. Detail completely smudged out in the faces, the animal, the plants.
damian5000 said:
Wow man, the difference is massive. Look again at the detail in all areas. Detail completely smudged out in the faces, the animal, the plants.
Yeah, just saw that. Not sure what the cause is, though. The face is messed up. I suspect it's the JPEG processing algorithm.
It's also the animal; the plant life to the right can be seen easily. This is what I'm talking about with regard to photo stacking, which is what enables these tiny sensors to get decent IQ.
But also, looking at the EXIF, the bad shot is at 1/7200 s, ISO 700; that's insanely dim for these little sensors, so I'm assuming it was digitally brightened. I'm GUESSING I may have been moving (or they were moving) and Google's algorithm decided to snap a "sport" shot rather than produce something completely unusable? If so, there's also no time for stacking in a "sport" shot. Just a guess.
The other is 1/400 s, ISO 45, though I'm not sure how they calculate the latter with stacking/HDR, whether it's an average of all the shots or what.
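A quick sanity check on those two EXIF readings, written out in Python and assuming the aperture was the same in both shots (it's a fixed lens):

import math

# EXIF values quoted above; aperture is the same fixed lens in both shots.
bad_shutter, bad_iso = 1 / 7200, 700
good_shutter, good_iso = 1 / 400, 45

light_stops = math.log2(good_shutter / bad_shutter)   # extra light gathered by the longer shutter
gain_stops = math.log2(bad_iso / good_iso)             # extra amplification applied to the short shot

print(f"Longer shutter gathered ~{light_stops:.1f} stops more light")    # ~4.2
print(f"Higher ISO made up ~{gain_stops:.1f} stops with gain")           # ~4.0
print(f"Net brightness difference ~{light_stops - gain_stops:.1f} stops")
# The final brightness comes out similar, but the 1/7200 s frame leans on gain,
# which is consistent with the noise and smudged detail in the bad shot.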
The cam sensor temperature can also increase the noise floor. With the lens and sensor as small as they are, it's a wonder they can do this well. The narrow depth of focus doesn't help things either.
Camera shake may play a role as well; the lighter the cam, the less stable the shooting platform. Smartphones have no handholds. Not nearly as easy to shoot with compared to a pro cam and lens that weighs 5 pounds. Larger lenses are easier to shoot with because of the added weight.
Smartphone cams are convenient, but damn, I lose a lot of keepers with them from shutter lag alone.
