Google Pixel 7 Pro Camera Samples

Google shares camera samples of Pixel 7 and Pixel 7 Pro
Macro Focus on Pixel 7 Pro
18 new items · Album by Alexander Schiffhauer
photos.google.com
Night Sight on Pixel 7 & Pixel 7 Pro
8 new items · Album by Alexander Schiffhauer
photos.google.com
Photo Unblur
16 new items · Album by Isaac Reynolds
photos.google.com
Real Tone on Pixel 7 & Pixel 7 Pro
16 new items · Album by Alexander Schiffhauer
photos.google.com
Super Res Zoom on Pixel 7 Pro
27 new items · Album by Alexander Schiffhauer
photos.google.com
Cinematic Blur
5 new items · Album by Isaac Reynolds
photos.google.com

Looks great!

What happened with the second Macro Focus picture? The one underwater? Look at the left hand, and the thumb of the right hand.
It's either an elaborate joke, or maybe a sign that Google manipulates/edits the photos and forgot to fix a botched edit, or the software algorithm for macro is just broken. The picture has obvious problems, it's "botched" - I'm surprised that nobody over at Google noticed this and that they actually released it.
Not to mention that the algorithm tried to create a nail on the underside (!) of the left hand's index finger, which of course is also botched/broken; it appears the algorithm can't properly tell the top of the hand from the underside.
Edit: included the picture in question from the above album

Morgrain said:
Wtf happened with the second Macro Focus picture? The one underwater? Look at the left hand, and the thumb of the right hand.
It's either an elaborate joke, or maybe a sign that Google manipulates/edits the photos and forgot to fix a botched edit, or the software algorithm for macro is just broken. The picture has obvious problems, it's "botched" - I'm surprised that nobody over at Google noticed this and that they actually released it.
Not to mention that the algorithm tried to create a nail on the underside (!) of the left hand's index finger, which of course is also botched/broken; it appears the algorithm can't properly tell the top of the hand from the underside.
Edit: included the picture in question from the above album
Click to expand...
Click to collapse
You do realize that the hands are under water?
Could you mark the places where you think there is something wrong?

faxteren said:
You do realize that the hands are under water?
Could you mark the places where you think there is something wrong?
Click to expand...
Click to collapse
I guess that's what he is talking about. But I would say that's caused by the sun and the water.

1. Distortion of the skin; the position and shadow of the bubble don't make any sense. The skin seems to stretch because of the bubble, even though that's of course impossible. A bubble would float above the skin and only cast a minor shadow on it, if photographed from above.
2. See above.
3. The left thumb is completely distorted?! The left side of the thumb is obviously botched. The thumb stretches unnaturally around the bubble (that's not what thumbs do).
4. Are we blind? Deploy the garrison! I'm surprised that I have to explain this one. It's obvious. A titanic portion of the right thumb is simply... missing. Gone. Reduced to atoms. As I said, either it's a hilariously bad manual edit job (gone wrong), or the algorithm is broken and can't handle all the reflections underwater. Also notice how, where the thumb is missing skin, this skin-toned/pinkish "thingy" (looks a bit like a digital seahorse) mimics the same curvature of the missing skin: it's either a bad edit job, or the algorithm failed (again). Think about all the fake Instagram bikini shots, where there is often a massive unnatural curvature in the photo/mirror that gives away a bad edit job.
4 in close-up
To make it more obvious, here (in green) should be... skin. Meat. Man flesh. "The thing that humans have to protect their innards." Instead, this guy is missing bone, marrow, tissue, nerves, blood, skin and all the other things that make a finger function. The only reasonable explanation would be that the finger hovers underneath the stone, but that's physically impossible (the existing stone bed, the position of the flower, the partly hidden stone in question, the other fingers) and wouldn't explain the distortion of the curved "thingy" next to the missing part.
To me, this looks like a simple Photoshop montage gone wrong, where the editor failed to use e.g. the Clone Stamp or History Brush (to remove portions of a stock item placed on a background) or made a mistake when simply copying (I've created hundreds of artworks - see my DeviantArt - and I've seen it plenty of times). That... or it's just the algorithm that can't handle reflections.

Morgrain said:
1. Distortion of the skin; the position and shadow of the bubble don't make any sense. The skin seems to stretch because of the bubble, even though that's of course impossible. A bubble would float above the skin and only cast a minor shadow on it, if photographed from above.
2. See above.
3. The left thumb is completely distorted?! The left side of the thumb is obviously botched. The thumb stretches unnaturally around the bubble (that's not what thumbs do).
View attachment 5730753
4. Are we blind? Deploy the garrison! I'm surprised that I have to explain this one. It's obvious. A titanic portion of the right thumb is simply... missing. Gone. Reduced to atoms. As I said, either it's a hilariously bad manual edit job (gone wrong), or the algorithm is broken and can't handle all the reflections underwater.
View attachment 5730747
4 in close-up
View attachment 5730755
To make it more obvious, here (in green) should be... skin. Meat. Man flesh. "The thing that humans have to protect their innards." Instead, this guy is missing bone, marrow, tissue, nerves, blood, skin and all the other things that make a finger function.
View attachment 5730757
Click to expand...
Click to collapse
Ok.
I think it's only the light that "bends" around the bubbles.

faxteren said:
Ok.
I think it's only the light that "bends" around the bubbles.
Click to expand...
Click to collapse
That might explain 1 and 2, but not 3 and most certainly not the missing flesh of 4.

None of them have the metadata saying it's the Pixel 7 Pro and the focal length in millimetres. Especially the telephoto ones.
I did some math, and even if it's a 5x optical zoom, apparently it's equivalent to an 80mm lens on a full-frame camera... which would mean the new telephoto is around 14mm.
In contrast, the Pixel 6 Pro's 4x is 19.00mm, or 104mm equivalent in full frame.
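For reference, here's a rough Python sketch of that crop-factor arithmetic. The Pixel 6 Pro numbers are the ones quoted above; the 80mm-equivalent figure is just the guess being discussed, not a confirmed spec for the 7 Pro telephoto.

```python
# Back-of-the-envelope crop-factor math for the quoted figures (assumed, not
# confirmed specs). Equivalent focal length = actual focal length * crop factor.
pixel6pro_actual_mm = 19.0    # Pixel 6 Pro 4x tele, actual focal length
pixel6pro_equiv_mm = 104.0    # its full-frame equivalent
crop_factor = pixel6pro_equiv_mm / pixel6pro_actual_mm   # ~5.5x

claimed_equiv_mm = 80.0       # equivalent inferred for the Pixel 7 Pro samples
implied_actual_mm = claimed_equiv_mm / crop_factor       # ~14.6 mm

print(f"crop factor ~{crop_factor:.2f}x, implied actual focal length ~{implied_actual_mm:.1f} mm")
```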

benleonheart said:
None of them have the metadata saying it's the Pixel 7 Pro and the focal length in millimetres.
Click to expand...
Click to collapse
Interesting. Didn't even bother to check that. So these were most certainly not uploaded directly from a Pixel 7 Pro, but were at least scrubbed of metadata on a PC and uploaded from there, which you only need to do if you
a) want to hide a real DSLR camera shot and try to falsely advertise it as a phone shot (*Samsung, cough cough*),
b) want to protect your privacy, and/or
c) try to hide a digital edit job.
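If anyone wants to check the samples themselves, a minimal EXIF dump along these lines will show whether the camera model and focal length survived; this is just a sketch assuming a reasonably recent Pillow, and the filename is a placeholder.

```python
# Hedged sketch: dump the EXIF fields relevant to the discussion above.
from PIL import Image, ExifTags

def dump_exif(path):
    exif = Image.open(path).getexif()
    if not exif:
        print(f"{path}: no EXIF data (stripped or never present)")
        return
    # Make/Model/Software live in the main IFD
    for tag_id, value in exif.items():
        tag = ExifTags.TAGS.get(tag_id, tag_id)
        if tag in ("Make", "Model", "Software"):
            print(f"{path}: {tag} = {value}")
    # Focal length values live in the Exif sub-IFD (0x8769)
    for tag_id, value in exif.get_ifd(0x8769).items():
        tag = ExifTags.TAGS.get(tag_id, tag_id)
        if tag in ("FocalLength", "FocalLengthIn35mmFilm"):
            print(f"{path}: {tag} = {value}")

dump_exif("pixel7pro_sample.jpg")  # placeholder filename
```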
Spoiler: links to companies faking phone ad photos
*https://www.businessinsider.com/sam...omote-galaxy-a8-star-camera-2018-12?r=US&IR=T
Samsung caught using another DSLR photo to sell a phone camera | Engadget
No, manufacturers still haven't stopped using DSLR photos to fake phone camera shots.
www.engadget.com
Samsung used my DSLR photo to fake their phone’s “portrait mode”
Earlier this year, Samsung was busted for using stock photos to show off capabilities of Galaxy A8’s camera. And now they did it again – they used a stock image taken with a DSLR to fake the camera’s portrait mode. How do I know this, you may wonder? Well, it’s because Samsung used MY photo […]
www.diyphotography.net
Samsung isn't the only one doing this, btw. The industry loves to fake and lie.
Second example:
Huawei gets caught faking DSLR shots as smartphone pictures in a commercial
Oops
www.theverge.com
Huawei caught passing off DSLR pictures as phone camera samples | Engadget
Huawei doesn't have the best track record when it comes to advertising.
www.engadget.com

I can share a few photos I've taken on the 7 series over the last couple of weeks.

wildlime said:
I can share a few photos I've taken on the 7 series over the last couple of weeks.
Click to expand...
Click to collapse
Thanks for sharing these!
Have you tried portrait mode, particularly at 2x? There was a problem on the Pixel 6 series that caused ugly, EXTREMELY oversharpened pictures in portrait at 2x zoom. I'm curious if it still happens on the 7 series.

jericho246 said:
Thanks for sharing these!
Have you tried portrait mode, particularly at 2x? There was a problem on the Pixel 6 series that caused ugly, EXTREMELY oversharpened pictures in portrait at 2x zoom. I'm curious if it still happens on the 7 series.
Click to expand...
Click to collapse
I haven't really tried it yet, sorry.

hell, my OnePlus 7 takes the same photos. Step it up, Googley!

buschris said:
hell, my OnePlus 7 takes the same photos. Step it up, Googley!
Click to expand...
Click to collapse
No way your cheap IMX586 can do that.

jericho246 said:
There was a problem on the Pixel 6 series that caused ugly, EXTREMELY oversharpened pictures in portrait at 2x zoom. I'm curious if it still happens on the 7 series.
Click to expand...
Click to collapse
Well it looks like the difference is minimal at best. Portrait @ 2x still looks bad. I'm disappointed.
(relevant part @ 15:00)

If you want to know more, check out this review by Adam Conway:
Google Pixel 7 Pro camera review: Top-tier camera flagship that improves upon the Pixel 6 series

I'll see if I can find my photo of the stars I took, and a couple of others. The S22 Ultra is definitely bad. I'm giving this to mom, and hopefully the P7 is decent.

No. 1 at DXOMARK!
Google Pixel 7 Pro Camera test: is it the best camera out there?
From photo to video to bokeh, get to know everything about the latest Google Pixel 7 Pro camera from DXOMARK's comprehensive test results!
www.dxomark.com

Related

Better camera app for panoramic shots on HD2?

Is there a camera app that has a better panoramic feature than the one that's already built into the HD2? The one in the factory camera app doesn't stitch together well. Auto-stitching would be great.
Agree on that, "auto-stitching" should really be standard. Almost impossible to make a perfect panoramic photo at the moment...
I really hope someone can make an app with auto-stitching, because the HD2 takes beautiful pictures. Having a panoramic feature that actually works right would make it 100x better.
The best solution is to take individual photos that overlap, transfer them to a PC, then make panoramas with one of the many programs available for desktop.
I concur. In fact, the best panorama software I had the pleasure to use was (believe it or not) Windows Live Photo Gallery.
Here are some examples I shot with my Canon 720IS...
I challenge you to find any seams; the image was made of 8 images stitched together.
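For anyone taking the "stitch on the PC" route suggested above, here's a minimal sketch using OpenCV's high-level Stitcher. The filenames are placeholders, and results will depend on how much the shots overlap.

```python
# Minimal desktop panorama stitching sketch (OpenCV 4.x).
import cv2

files = ["pano_01.jpg", "pano_02.jpg", "pano_03.jpg"]   # overlapping shots, in order
images = [cv2.imread(f) for f in files]

stitcher = cv2.Stitcher_create()
status, pano = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", pano)
else:
    print(f"Stitching failed, status code {status} (not enough overlap?)")
```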
chvvkumar said:
I concur. In fact, the best panorama software I had the pleasure to use was (believe it or not) Windows Live Photo Gallery.
Here are some examples I shot with my Canon 720IS...
I challenge you to find any seams; the image was made of 8 images stitched together.
Click to expand...
Click to collapse
Oh, it's easy to find seams in that picture
Bottom left: something's wrong with the tiles on the floor and the handrail (or whatever the right English word for it is).
Also, look at the trees from right to left; you'll notice something on the far right side, and the colours are different too.
But that's a pretty good result for Windows Live Photo Gallery!
I still use Photomerge in Photoshop for making panoramas though.
I haven't seen any camera program on a mobile phone do good stitching, and I don't think they are capable of stitching an XX-MPix pano together well...
I'd be happy if anyone could prove me wrong on this!
Regards,
Lukas
xILukasIx said:
Oh, it's easy to find seams in that picture
Bottom left: something's wrong with the tiles on the floor and the handrail (or whatever the right English word for it is).
Also, look at the trees from right to left; you'll notice something on the far right side, and the colours are different too.
But that's a pretty good result for Windows Live Photo Gallery!
I still use Photomerge in Photoshop for making panoramas though.
I haven't seen any camera program on a mobile phone do good stitching, and I don't think they are capable of stitching an XX-MPix pano together well...
I'd be happy if anyone could prove me wrong on this!
Regards,
Lukas
Click to expand...
Click to collapse
You got me!!! (Though I still can't seem to find the tree anomaly that you speak of.)
As for the colour correction, the source images did not go through any post-processing; they were straight from the camera (in full manual, with preset white balance, exposure compensation, shutter speed, aperture as well as ISO). I guess the colour variation might be the result of the weird angle of the sun.
But, I guess that would be 'good enough' for 99% of users here.
Also, if you are using a mobile for creating panoramic images, that would be the least of your problems.
My wife's iPhone has a pretty good app. It's called "auto stitch", and it actually works pretty well. Oh well, I guess if there's no mobile app for the HD2 I'll make panoramic images with Photoshop. Would be nice to just have it done right on the phone.
Are you looking for something that shows you markers whilst you take the pic, or just something that can stitch pics together?
I think one of the problems is that HTC's camera driver (which you'd need access to to write such an app) is a bit of a closed cupboard for developers. It's easy in code to access the images after they've been taken (so it would be possible to have an app that could stitch existing photos), but not easy (impossible?) to access the real-time video feed so that you could overlay stitching markers whilst taking the pics. If you do it the way you're meant to in code, I think you get a really low-res version of it, from what I've read. Probably HTC being lazy (getting it to work for them but not for any developers). Shame, as I reckon we'd have a few iPhone-esque augmented-reality-type apps if things were a bit clearer on that front.
James
I found a way to get pano pics on the HTC HD2!
Just install this cab... I tried to get it to work and it didn't work... I uninstalled it, and after that it kept on working fine...
I tried the above cab on my HD2
and in panorama mode it takes a very light/bright first image, then freezes and reboots the phone in about 30 seconds. Just FYI.
Strange, because I enabled the panoramic photo function with my own registry mod and that mode works very well (of course, the best results come when you set exposure, ISO etc. manually).
Unfortunately, panoramic photos made in this mode have low resolution, so I suggest it's better to take a few normal overlapping photos and join them in some piece of software on a PC, because this gives much better results.

【Huawei Mate20】Huawei AI Photography is amazing

I wonder: do you all like to take photos? And what is the reason?
If you ask me, I would use a German proverb to explain: "What happens only once might as well never have happened." The meaning of photos may be that they provide evidence of existence for things that have only happened once. However, in most cases, the color on the mobile phone is far from what we see with our eyes, so the pictures cannot fully express our state of mind at that time.
I have tried a lot of mobile phones, but the color performance was never satisfactory. So this time I used the special AI camera mode of the new Huawei phone. After taking a few photos, I found that it is really perfect.
1. Blue sky
People may take photos of the blue sky and white clouds when they are traveling or in a good mood. However, the color of the sky on the mobile phone is frequently far from the scenery seen with our eyes. With Huawei's AI mode, the clarity of the image increases significantly, making the sky look clearer and more transparent.
Huawei AI (Picture 1), Note 9 (Picture 2), iPhone X (Picture 3)
2. Cute pet
People may take a lot of pictures of their pets to record precious moments, but it is difficult to show a pet's real colour. The pictures taken in Huawei's AI mode look alive; let's see the comparison.
Huawei AI (Picture 1), Note 9 (Picture 2)
3. Beautiful flowers
Colourful flowers are difficult to display on a mobile phone; the pictures taken by Huawei AI are the most beautiful. It looks as if the flowers are ready to jump out of the picture. The leaves are also greener and more transparent.
Huawei AI (Picture 1), Note 9 (Picture 2), iPhone X (Picture 3)
4. A beautiful underwater world
Pictures taken underwater with a mobile phone are generally not very good; the reason is that the photos lack color gradation and easily turn into large gray or blue-green areas. Mobile phone manufacturers have added color adjustment mechanisms to improve underwater photography. In comparison, the Huawei AI camera does relatively better: the overall brightness of the underwater picture is improved, the definition is better, and the colors are fuller.
Huawei AI (Picture 1), Samsung S8 (Picture 2), iPhone X (Picture 3)
Taking photos to remember our lives is wonderful, but it is not easy to express our mood clearly in the photos. Using the Huawei AI camera mode may be a good choice: it can present a more beautiful and clearer world, and when you look back, the colorful, clear pictures will definitely bring you back to the mood of that moment.
dazzleshining said:
I wonder: do you all like to take photos? And what is the reason?
If you ask me, I would use a German proverb to explain: "What happens only once might as well never have happened." The meaning of photos may be that they provide evidence of existence for things that have only happened once. However, in most cases, the color on the mobile phone is far from what we see with our eyes, so the pictures cannot fully express our state of mind at that time.
I have tried a lot of mobile phones, but the color performance was never satisfactory. So this time I used the special AI camera mode of the new Huawei phone. After taking a few photos, I found that it is really perfect.
1. Blue sky
People may take photos of the blue sky and white clouds when they are traveling or in a good mood. However, the color of the sky on the mobile phone is frequently far from the scenery seen with our eyes. With Huawei's AI mode, the clarity of the image increases significantly, making the sky look clearer and more transparent.
Huawei AI (Picture 1), Note 9 (Picture 2), iPhone X (Picture 3)
2. Cute pet
People may take a lot of pictures of their pets to record precious moments, but it is difficult to show a pet's real colour. The pictures taken in Huawei's AI mode look alive; let's see the comparison.
Huawei AI (Picture 1), Note 9 (Picture 2)
3. Beautiful flowers
Colourful flowers are difficult to display on a mobile phone; the pictures taken by Huawei AI are the most beautiful. It looks as if the flowers are ready to jump out of the picture. The leaves are also greener and more transparent.
Huawei AI (Picture 1), Note 9 (Picture 2), iPhone X (Picture 3)
4. A beautiful underwater world
Pictures taken underwater with a mobile phone are generally not very good; the reason is that the photos lack color gradation and easily turn into large gray or blue-green areas. Mobile phone manufacturers have added color adjustment mechanisms to improve underwater photography. In comparison, the Huawei AI camera does relatively better: the overall brightness of the underwater picture is improved, the definition is better, and the colors are fuller.
Huawei AI (Picture 1), Samsung S8 (Picture 2), iPhone X (Picture 3)
Taking photos to remember our lives is wonderful, but it is not easy to express our mood clearly in the photos. Using the Huawei AI camera mode may be a good choice: it can present a more beautiful and clearer world, and when you look back, the colorful, clear pictures will definitely bring you back to the mood of that moment.
Click to expand...
Click to collapse
The cat's body is blurred; I prefer the other one.
Sent from my EVR-L29 via Tapatalk
Kinda preaching to the choir here, friend. We know the M20s are great, but so are the others.

VR headset?

Hi!
I was wondering if a VR headset would give good quality with this phone, and which ones are compatible?
If you have one, or know of one that could fit my Poco F2 Pro, please share your recommendations below.
Well,
As nobody has replied yet, I ordered a very cheap VR headset. The phone seems to fit in there; I'll tell you more when I get it.
The headset can be found for around €10 instead of €60 on Amazon (I'm French) and has good reviews. It's the Freefly VR Beyond, and I know it looks kinda... weird.
EDIT: Please do not buy this one! I know it is cheap, and I managed to use it for a few hours, but the optics can't be adjusted, so... you'll never get a proper image with it unless you have the perfect IPD, and your eyes won't be able to focus properly...
I have since ordered another model, this one from the company Shinecon (re-branded as Hamswan). The optics seem very good (no Fresnel lenses), and you can easily adjust the IPD (interpupillary distance) and the focus independently for each eye.
The only impractical thing about the one I chose is that there is no button to touch the screen and select things. Also, the headset's nose cutout is a little weird and can hurt a bit if you don't have a small nose. But overall the optics seem good and I think you could buy from this company; just choose another model with a button.
Here are some pictures of the one I got:
And the weird nose cutout ^^':
By the way, you can fit very big phones in this headset.
So I think you should choose a more recent option from Shinecon (or rebranded Shinecon) WITH A BUTTON, and maybe a different kind of lens adjustment as they are kinda sensitive (and also a better-shaped nose cutout ^^').
Quality effort, quality content.
Well done mate!
How do you use this, and with what? Is it possible to use it like an Oculus to cast the PC's screen?
Did someone figure out how to deal with VR on this phone?
I generally don't use VR for gaming, but for playing 180/360-degree videos (usually in SBS format). For a few years I was using the ZTE Nubia Z7 Max with VR Box 2.0 goggles and the AAA VR Cinema app for that, with just about perfect results. The effect was always stunning; everything looks pretty close to real with a nice 3D effect on objects (in some apps the image looks flatter and that effect is not visible, but with the app mentioned above the result was very nice), especially if the video file resolution is good enough. Unfortunately, the Nubia had problems playing VR videos at 4K resolution and higher. Doing some experiments, I figured out that 3684x1842 (which I'm calling 3.7K) in H264 is the maximum that provides stable playback at 60 fps. The Nubia unfortunately has components that are too weak for anything more. But anyway, the image quality is very good in that configuration.
Moving on to the POCO F2 Pro: I decided on a phone upgrade in recent days and was hoping for an absolutely stunning effect with the Super AMOLED screen. Unfortunately, for now the result is a nightmare. The POCO fits quite well in the VR Box 2.0 goggles, but that's the only good aspect. Unfortunately, on MIUI 12 the AAA VR Cinema app has a problem opening in non-fullscreen mode. It only works in fullscreen mode, but then everything is stretched and a big part of the image is cut off! It's impossible to adjust the goggles with that image - you'd only get a squint from it. I've tried a big bunch of the free VR apps available on the Play Store and most of them simply don't work on the POCO (they can't read files, generate an improper image for which it's impossible to adjust the goggles, or are simply a joke). The most promising is VRTV Free, but it's quite hard to adjust, and the 3D effect is not as visible as it was in the AAA VR Cinema app on the Nubia - everything looks more flat than 3D. I was also hoping for stunning image quality on the Super AMOLED screen, and you know what? On the POCO, when you put it into VR goggles there are noticeable, quite visible black gaps (black holes, hehe) between the pixels, so in my opinion the image quality is way better on the Nubia despite it being an IPS! Probably on the Nubia Z7 (NX506J), which had a 1440x2560 screen resolution, it would be even better. As for the POCO, I'm disappointed with the image quality of that Super AMOLED in VR. Normally it looks better on the POCO than on the Nubia, but in VR the Nubia clearly wins for me - it just has a slightly too weak CPU and GPU.
So what to do about that now? VR is very important to me. Do you know of any VR apps (paid is fine) that work well on the POCO? And the same for goggles? Because maybe the VR Box 2.0 goggles are the problem too: despite the fact that the POCO physically fits them well, they are designed for phones from 3.5"-6" according to the specs.
EDIT:
Yeah, it looks like I've figured out a solution! I don't know why I hadn't thought of it before, but I just came up with the idea of trying some earlier versions of AAA VR Cinema. And yeah, on version 1.5.3 I'm able to start the app in non-fullscreen mode. Buttons etc. aren't stretched and the whole image fits the screen. And the image quality in the VR Box 2.0 (it's an Esperanza model, to be exact) is very nice! Maybe even better than on the Nubia, because now the gaps between the pixels are hardly visible and comparable to those on the Nubia. The performance gain is also flawless, and now I can finally play HEVC videos (I had to use AVC before on the Nubia). For me the AAA VR Cinema app is the best of the free apps (I haven't tested the paid solutions - does someone know any worth trying?) - it has that stunning 3D effect thanks to which everything looks pretty realistic, which no other app has. That version of the app also has the proper screen size in fullscreen mode, so it's not cut off and fits the screen perfectly, but in fullscreen I have problems adjusting the goggles (maybe other goggles would help here). For now I'm satisfied using that setup in non-fullscreen mode. And there's an additional problem on the POCO - the image is shifted to one side of the screen:
so you can't properly place the phone inside the VR Box 2.0 goggles with the center line matching the notch, because there is not enough space in the tray to move the phone to the side. This causes the view to be shifted a little, and you can feel like you're not exactly in the center of the action. But the image looks correct anyway. I found that it's caused by the system navigation buttons (because of some space reserved for them at the side of the screen). After changing the navigation in the system options from buttons to fullscreen gestures, everything is all right!
There is also the option to cut a little piece of plastic off the side of the tray, so the phone can be moved a little more to the side. Or use other goggles. In the meantime I've tested the goggles from Shinecon (the Box 5 Mini model), which don't have that problem because they don't have a tray, and their optics are more comfortable to set up (though also not fully independent), and the lenses are probably better than on the VR Box 2.0, so the image is brighter (and that's maybe due to the plastic color, which is white here?), but personally I prefer the lenses from the VR Box 2.0. They give a slightly blurry image (just a little) so the pixel shape isn't that noticeable. On the Shinecon I think everything is sharper. I would say it's like a kind of anti-aliasing in the VR Box 2.0. So I prefer them, because they are also a little bigger, have skin-like padding around the contact area (on the Shinecon it's just a sponge...) and the tray is much easier to use. But they have worse ventilation because of that; on the Nubia I sometimes thought the phone would explode from the heat. But the POCO isn't overheating, so it's not a problem.
But there's one more little problem I've noticed on the POCO - even on the latest version of the app - it has a tendency to crash on start after one use, so you can't start it. The solution is to go into the app info, enable and then immediately disable fullscreen mode for it, and when you start the app hold the phone vertically, not horizontally; after a successful start you can switch to landscape. So there are a few little problems, but everything can be figured out in the end, with some additional "ritual" to perform, but it works. I hope this information will be helpful for someone.
I will also willingly test any other solutions, if someone comes up with them. It would also be nice to hear your suggestions about which apps to use (free or paid), but as for now AAA VR Cinema is for me just the best and probably the only one that can generate a proper, stunningly realistic image (the effect is like on YouTube, or even better). It has some disadvantages (like no video timeline, so you can only fast-forward or rewind with some limitations, and the control buttons don't hide automatically but can only be always visible or always off), but those are things you can live with, and for the image quality I can forgive that.
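(If anyone wants to try the same trick of keeping clips at a playable resolution and codec, a hedged sketch of re-encoding an SBS clip to HEVC at that "3.7K" size by calling ffmpeg from Python could look like this; it assumes ffmpeg with libx265 is installed, and the filenames and CRF value are placeholders.)

```python
# Re-encode an SBS VR clip to HEVC at 3684x1842 via the ffmpeg CLI.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input_sbs.mp4",
    "-vf", "scale=3684:1842",   # the "3.7K" resolution mentioned above
    "-c:v", "libx265",          # HEVC
    "-crf", "24",
    "-c:a", "copy",             # keep the original audio track
    "output_sbs_hevc.mp4",
], check=True)
```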
For me, it's not worth it. I'm using the Zeiss VR One (glass optics), but the pixels are very visible. Movies are OK, I guess, but not games.
That's probably because of the physical pixel density of the screen. The screen diagonal is large and the native resolution is still only FHD+. With a higher resolution it would probably be better. That's probably why the overall impression for me was better on the Nubia than on the POCO (almost the same resolution, but a smaller screen). I don't really play games, but for movies I've also bought the Fiit VR 2N goggles in the meantime, and with the correct QR code setting they look noticeably better to me as far as pixel visibility goes. Maybe it would be better with games too.
I'm also wondering how it would work with the Baofeng Mojing S1.
@Oswald Boelcke
Shouldn't this be moved as well then?

Question S22 camera [Suspicious moon photos]

Hey gang,
I would love a developer or anyone in the know to investigate this potential bull**** moon filter that kicks in automatically when you zoom in on the moon. This first came to my attention on the Note 9. Basically it looks like the camera app is detecting the moon phase, then after a bright flash it loads the appropriate phase image.
While zooming in, the light and fuzziness just go away and you have this clear image: no light, no stars or clouds, just this perfect moon shot. Tonight I tested through a dirty window and got the same clear photo. I took two and noticed they have different hues. So... does anyone already know what's up, or care to investigate? Maybe there are some moon phase photos in a directory somewhere.
If you need anything from me please hit me up.
Thank you
I remember reading posts from Samsung users claiming the moon pics from Samsung phones are very much real, unlike the ones from Huawei and other Chinese brands.
I'm not an expert on the matter because I never cared much about this stuff; personally, I always thought all brands use some sort of AI for their moon shots.

			
				
The moon is super bright; when I focus my DSLR with a zoom lens on the moon, I can't see any stars either. To photograph the moon, the camera uses daylight shutter settings.
ryant35 said:
The moon is super bright; when I focus my DSLR with a zoom lens on the moon, I can't see any stars either. To photograph the moon, the camera uses daylight shutter settings.
Click to expand...
Click to collapse
Oh ****, on a DSLR too huh? Maybe this is just how it works then. Thank you
JAH0707 said:
Oh ****, on a DSLR too huh? Maybe this is just how it works then. Thank you
Click to expand...
Click to collapse
At night, the moon is the brightest object in the sky. The human eye can compensate for the difference, which allows us to see the stars and the moon at the same time. Most DSLR cameras and all phone cameras are not sophisticated enough to show both the stars and the moon at once. Pretty much every moon shot where you see both the moon and the stars is a composite: the photographer exposed for the moon, then exposed for the stars, and overlaid one shot on the other.
There was an article written when the S21U was first released; they compared the S21U moon shots to shots taken at the same time with a DSLR. They found that the S21U moon shots were real and there was no overlay used.
JAH0707 said:
Hey gang,
I would love a developer or anyone in the know to investigate this potential bull**** moon filter that kicks in automatically when you zoom in on the moon. This first came to my attention on the Note 9. Basically it looks like the camera app is detecting the moon phase, then after a bright flash it loads the appropriate phase image.
While zooming in, the light and fuzziness just go away and you have this clear image: no light, no stars or clouds, just this perfect moon shot. Tonight I tested through a dirty window and got the same clear photo. I took two and noticed they have different hues. So... does anyone already know what's up, or care to investigate? Maybe there are some moon phase photos in a directory somewhere.
If you need anything from me please hit me up.
Thank you
View attachment 5658259 View attachment 5658261 View attachment 5658263 View attachment 5658265
Click to expand...
Click to collapse
Photographing stars with good detail takes about 20-30 seconds. A telephoto pic of the moon takes only a fraction of a second since it's so bright, so you won't see much star detail in the background.
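To put rough numbers on that gap, here's a back-of-the-envelope comparison using the standard exposure-value formula; the "looney 11" moon exposure and the star exposure below are common rules of thumb, not measured values from any of the shots in this thread.

```python
# EV comparison: a sunlit moon vs. a typical wide-field star exposure.
import math

def ev100(aperture, shutter_s, iso):
    """Exposure value normalised to ISO 100: log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(aperture**2 / shutter_s) - math.log2(iso / 100)

moon = ev100(aperture=11, shutter_s=1/100, iso=100)    # "looney 11" rule of thumb
stars = ev100(aperture=2.8, shutter_s=25, iso=3200)    # typical starscape exposure

print(f"moon ~EV {moon:.1f}, stars ~EV {stars:.1f}, gap ~{moon - stars:.1f} stops")
```

That gap of roughly 20 stops is why a single exposure can't hold both, and why moon-plus-stars images are usually composites.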
Funny, I just took a couple of photos the other day and thought the same; then I saw this post.
Huawei fakes it for sure and OP's theory is spot on for Huawei.
Samsung - well, I didn't expect that from Samsung.
JazonX said:
Huawei fakes it for sure and OP's theory is spot on for Huawei.
Samsung - well, I didn't expect that from Samsung.
Click to expand...
Click to collapse
Samsung isn't faking it. Here is a shot I took a few weeks ago. I had to do some Lightroom work.
I've taken enough pictures of the moon to know.
If you really need to go down the rabbit hole, here is the article that confirms Samsung is not faking it unlike Huawei. - https://www.gsmarena.com/myth_debunked_samsungs_100x_zoom_doesnt_fake_moon_photos-news-47487.php
A few months late, but there is a video of someone comparing the S22's telephoto capabilities with those of the Nikon P1000 and he did confirm that the S22 applies a filter to the moon with the 30x zoom. It's pretty obvious once you see the difference side by side. If you watch the video you'll see it relatively clearly. (Just look for S22 vs P1000 on YouTube, creator is named Versus, skip to 5 minutes)
The best method to test it out for yourself is to take a photo of the moon through some tree branches. Some people tried this and got the filter to only partially cover the moon that way. Two of the pictures posted here also have this smooth, artificial look, but it seems like the AI got the colors right.
Lunar eclipse taken 1.5 months ago in Tokyo
Took this last night and did a remaster on it... noticed a square, like the moon was pasted on top of the black background... looks faked to me...
The original picture and then remastered.
Quite possibly the reason it's "debunked" is that the people trying to debunk it are taking pictures while not pointing in the direction of the moon... the phone does have sensors that let it know what direction it's facing, where it's looking and so on... so if you're taking pictures of a ping pong ball in your house, the phone knows you aren't pointing it in the actual direction of the moon.
This is interesting; this would be an awesome conspiracy. It wouldn't be too hard to debunk: find a small circular object that matches the size and brightness of the moon (use an LED or a pen light), use full zoom mode, and aim the phone exactly where the moon should be.
Shot some days ago with my 22U/Exynos (edited with Lightroom on Android)
Just stepped out of the door, saw the moon, saw where the plane was flying, and took my chance.
phatmanxxl, you could probably spoof the phone's sensors to trick it into thinking it's looking at the moon... not sure which sensors would need to be spoofed, but I'm guessing GPS would be one of them.

Question Could have sworn there were HDR settings, what happened to them?

I've had a couple of bright daylight shots come out like garbage: low detail, low quality, smudged detail. So I just happened to look in settings and there's no HDR/HDR+. Maybe I'm getting it mixed up with my previous Pixels, but I could have sworn there were HDR settings in Google Camera on the 6a?
Strange. Reddit says...
Google is even more cryptic.
Raw mode would be your next and better option. Raw files will require post-processing, which is done using a photo app that has an adjustable contrast curve option.
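As a rough illustration of that raw-plus-curve workflow (a sketch, not any particular app's implementation), something like this works if rawpy and Pillow are installed; the DNG filename and the curve points are placeholders you'd tune by eye.

```python
# Develop a raw file and apply a simple adjustable contrast (S-) curve.
import numpy as np
import rawpy
from PIL import Image

with rawpy.imread("PXL_shot.dng") as raw:          # placeholder filename
    rgb16 = raw.postprocess(output_bps=16)         # 16-bit RGB

x = rgb16.astype(np.float64) / 65535.0
points_in  = [0.0, 0.25, 0.5, 0.75, 1.0]           # adjustable curve points
points_out = [0.0, 0.18, 0.5, 0.82, 1.0]           # gentle S-curve
curved = np.interp(x, points_in, points_out)

Image.fromarray((curved * 255).astype(np.uint8)).save("PXL_shot_curved.jpg")
```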
Thanks! Raw wouldn't have helped this shot; it looks like it needed the stacking that HDR provides. Just making sure I'm not crazy - the option was there before, no? It's definitely still kicking in in some cases, but it seems to be automatic / left up to Google's algorithms. Unfortunately that leaves us with a lot of garbage shots.
Compared to a Canon pro cam, all smartphone cams are a pain to use and limited, especially in setup options and speed. I lose a lot of shots because of that. I never liked HDR; a properly exposed shot doesn't need it. Raws give you at least 2 full f-stops of exposure and WB correction.
There's no option to shoot multiple burst exposures at different exposure settings on smartphones either.
Even on my Samsung N10+, the HDR setting, when toggled on, will decide for you whether it will be used. That can and does screw up shots when on... sometimes. No real control.
Perhaps Google will add that missing feature back soon. Rather sloppy of them. Not nearly as inept as their lame idea to implement forced scoped storage, though.
That's a bomb; I'm still running Pie and 10 to evade that terror.
The issue is these tiny sensors. The photo stacking / computational photography helps exponentially. When it doesn't kick in, it can be very bad. You're right, RAW can definitely help some, but there's only so much it can do. Here are two shots, one where the computational processing kicked in and one where it apparently didn't. These were taken on the fly (my gf and her kid, not random people), so obviously not great composition, just an example of what I mean.
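For anyone curious what "stacking" buys in its simplest possible form, here's a toy sketch that just averages a few burst frames of a static scene (assumed already aligned, e.g. shot on a tripod). Real HDR+ also aligns, merges and tone-maps, so this only shows why multiple frames help with noise; the filenames are placeholders.

```python
# Toy frame stack: average N aligned burst frames to reduce noise (~sqrt(N)).
import cv2
import numpy as np

files = ("burst_0.jpg", "burst_1.jpg", "burst_2.jpg", "burst_3.jpg")
frames = [cv2.imread(f).astype(np.float32) for f in files]

stacked = np.mean(frames, axis=0)
cv2.imwrite("stacked.jpg", np.clip(stacked, 0, 255).astype(np.uint8))
```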
It's not available for the Pixel 4a and above. It is ON by default.
damian5000 said:
The issue is these tiny sensors. The photo stacking / computational photography helps exponentially. When it doesn't kick in, it can be very bad. You're right, RAW can definitely help some, but there's only so much it can do. Here are two shots, one where the computational processing kicked in and one where it apparently didn't. These were taken on the fly (my gf and her kid, not random people), so obviously not great composition, just an example of what I mean.
View attachment 5770083
View attachment 5770085
Click to expand...
Click to collapse
That's not much difference.
On a raw file (or maybe even a JPEG) you could do that, and better, using an adjustable contrast curve with settable adjustment points. It takes a bit of practice and a color-calibrated monitor.
Wow man, the difference is massive. Look again at the detail in all areas. Detail completely smudged out in the faces, the animal, the plants.
damian5000 said:
Wow man, the difference is massive. Look again at the detail in all areas. Detail completely smudged out in the faces, the animal, the plants.
Click to expand...
Click to collapse
Yeah, just saw that. Not sure what the cause is though. The face is messed up. I suspect it's the JPEG processing algorithm.
It's also the animal, and you can easily see it in the plant life to the right. This is what I'm talking about with regard to the photo stacking - it's what enables these tiny sensors to get decent IQ.
But also, looking at the EXIF, the bad shot is at 1/7200 and ISO 700; that's insanely dim for these little sensors, so I'm assuming they digitally brightened it. I'm GUESSING I may have been moving (or they were moving) and the Google algorithm decided to snap a "sport" shot rather than have something completely unusable? If so, there's also no time for stacking in a "sport" shot. Just a guess.
The other is 1/400 at ISO 45, though I'm not sure how they calculate the latter with stacking/HDR - whether it's an average of all the shots or what.
damian5000 said:
It's also the animal, and you can easily see it in the plant life to the right. This is what I'm talking about with regard to the photo stacking - it's what enables these tiny sensors to get decent IQ.
But also, looking at the EXIF, the bad shot is at 1/7200 and ISO 700; that's insanely dim for these little sensors, so I'm assuming they digitally brightened it. I'm GUESSING I may have been moving (or they were moving) and the Google algorithm decided to snap a "sport" shot rather than have something completely unusable? If so, there's also no time for stacking in a "sport" shot. Just a guess.
The other is 1/400 at ISO 45, though I'm not sure how they calculate the latter with stacking/HDR - whether it's an average of all the shots or what.
Click to expand...
Click to collapse
The cam sensor temperature can also increase the noise floor level. With the lens and sensor as small as they are, it's a wonder they do this well. The narrow depth of focus doesn't help things either.
Cam shake may also play a role; the lighter the cam, the less stable the shooting platform. Smartphones have no handholds. They're not nearly as easy to shoot with compared to a pro cam and lens that weighs 5 pounds. The larger lenses are easier to shoot with because of the added weight.
Smartphone cams are convenient, but damn, I lose a lot of keepers with them from shutter lag alone.
