[OP7] Change video saturation - OnePlus 7 Questions & Answers

I find that all videos (and photos) are oversaturated. This is probably something many people prefer; however, besides disliking the effect, I'm also seeing loss of detail because some colors (mainly the primary RGB channels) are clipped at the edge of the gamut. I have set my display to sRGB, so I'm sure this isn't just a matter of the video being displayed incorrectly.
I would like to adjust the saturation to more realistic levels, but I haven't found such a setting. I have also tried a few custom camera apps, but these seem to produce exactly the same output from the camera.
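A workaround if no in-camera setting turns up: the clips can be desaturated in post. A minimal sketch, assuming ffmpeg is installed on a desktop and called from Kotlin; the 0.8 factor and the file names are purely illustrative.
Code:
import java.io.File

// Illustrative helper: re-encode a clip with reduced saturation via ffmpeg's "eq" filter.
// Assumes ffmpeg is on the PATH; a factor of 0.8 keeps 80% of the original saturation.
fun desaturate(input: File, output: File, factor: Double = 0.8) {
    val cmd = listOf(
        "ffmpeg", "-i", input.absolutePath,
        "-vf", "eq=saturation=$factor",  // values below 1.0 mute the colors
        "-c:a", "copy",                  // leave the audio track untouched
        output.absolutePath
    )
    ProcessBuilder(cmd).inheritIO().start().waitFor()
}

fun main() {
    desaturate(File("VID_0001.mp4"), File("VID_0001_desat.mp4"))
}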

I have the same issue... videos are waaaay too oversaturated.

Related

HTC Sense 3.0 Camera

I was testing out the HDR options for this and they do not seem to do much at all. There is a difference, but it is VERY minor. I took a shot without HDR, one with HDR, and then one using the free HDR app, and the HDR app simply blew the HTC HDR mode completely out of the water...
Has anyone else had any experience with this? Is there something I may be missing?
HDR always seemed like something better left to photo-editing software to me (like Photoshop). Phone camera filters always seem kind of lackluster, but then again, none of us are taking professional shots with a cell camera, I assume.
On topic though, I wouldn't doubt that it's better. Sense stuff isn't exactly always perfection as far as software quality goes, and HDR was probably just an afterthought tied onto the camera so they could tick a box on some feature sheet.
Absolutely the same as my experience. I used HDR Camera from the Market before this (free), and I really loved its results. When I installed the 3.0 Sense camera and saw the HDR setting, I was happy until I tried it. My theory is that it is NOT taking multiple pictures, but only applying an image adjustment, which is NOT the same thing. HDR Camera doesn't have the issues that some of you described: you do not have to be rock steady, and my phone is plenty fast enough to take the three photos within about a second. The image settings are also adjustable (color, etc.).
On the other hand, the panorama mode in the Sense camera is Awesome!
yareally said:
HDR always seemed like something better left to photo-editing software to me (like Photoshop). Phone camera filters always seem kind of lackluster, but then again, none of us are taking professional shots with a cell camera, I assume.
On topic though, I wouldn't doubt that it's better. Sense stuff isn't exactly always perfection as far as software quality goes, and HDR was probably just an afterthought tied onto the camera so they could tick a box on some feature sheet.
Software cannot do what HDR does. HDR takes two pictures, adjusting the exposure (sensor gain) between low and high, and combines the best of both images to gain detail in dark areas and avoid overexposure in bright areas. Software will not work after the fact because the extra information gained from the low/high exposures is not present. Unfortunately, the HDR option is poorly implemented by HTC. If you try HDR on the iPhone, you can see a drastic difference.
Actually, I use Pro HDR on my Evo, and it takes fantastic pictures in true HDR. It is a touch slower than a pro camera, but it does actually meter the scene and adjust the exposure accordingly. It then allows adjustments to the composite image and lets you save the final result as well as the source images, meaning you can do further editing elsewhere. Outside of some cropping, I haven't had problems.
With that said, I would be curious to know what the deal is with Sense. What's interesting is that if you put it in HDR mode, it shows a little icon in the top right with multiple images, like whoever designed it at least understood how true HDR *should* operate. Then, if you press and release it instead of the regular shutter button, it beeps, you hear the lens move, and the image focuses itself. So... I don't know. I don't see a noticeable effect in the resulting images at all...
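To make the "combine the best of both exposures" idea above concrete, here is a toy sketch of just the merge step (not how HTC's camera or any of these apps actually implement it): two grayscale frames, one underexposed and one overexposed, blended per pixel with weights that favor well-exposed values.
Code:
// Toy exposure fusion on grayscale frames with values in 0.0..1.0.
// Pixels near mid-gray (0.5) are treated as "well exposed" and get the most weight.
fun fuse(lowExposure: DoubleArray, highExposure: DoubleArray): DoubleArray {
    fun weight(v: Double): Double {
        val d = v - 0.5
        return Math.exp(-12.5 * d * d) + 1e-6  // Gaussian centered on mid-gray
    }
    return DoubleArray(lowExposure.size) { i ->
        val wl = weight(lowExposure[i])
        val wh = weight(highExposure[i])
        (wl * lowExposure[i] + wh * highExposure[i]) / (wl + wh)
    }
}

fun main() {
    val dark = doubleArrayOf(0.02, 0.10, 0.45)    // underexposed frame keeps highlight detail
    val bright = doubleArrayOf(0.30, 0.70, 0.98)  // overexposed frame keeps shadow detail
    println(fuse(dark, bright).joinToString())
}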

Is it possible to make an average smartphone's camera sensor (...)

(...) capture visual data at especially high framerates if very little resolution is needed?
I know it's not exactly Android-related, but I had an interesting idea that requires video capture at frame rates of probably several thousand images per second. The resolution can probably be kept to just a few hundred pixels, so my question is whether such an implementation is possible (at least on some smartphones) hardware-wise.
I'm not really sure how the CMOS sensor and data processing work. I know it scans the pixels in a "rolling shutter" manner, which could perhaps impede feasibility, but my Xiaomi (in auto mode, where it chooses this value on its own) can expose a single image in daylight at around a ten-thousandth of a second, so the minimum scan time of the entire CMOS sensor seems short enough. The question is whether there's a minimum per-pixel "reset time" between frames that prevents capturing this sort of super-fast video even when a very low resolution is used. Also, I'm not sure it matters, but the video doesn't really need to be saved; it just needs to provide a live visual cue for my hypothetical app.
Thanks in advance!
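For what it's worth, on Android you can at least ask the hardware how fast it will go: the Camera2 API advertises "constrained high-speed" sizes and FPS ranges, which on current phones typically top out in the hundreds of frames per second, nowhere near several thousand. A minimal Kotlin sketch (error handling omitted; context is assumed to be any Context):
Code:
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Prints each camera's constrained high-speed video modes, if it supports any.
fun dumpHighSpeedModes(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val chars = manager.getCameraCharacteristics(id)
        val caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES) ?: continue
        if (CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO !in caps) continue
        val map = chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP) ?: continue
        for (size in map.highSpeedVideoSizes) {
            val ranges = map.getHighSpeedVideoFpsRangesFor(size).joinToString()
            println("Camera $id: $size supports $ranges fps")
        }
    }
}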

Is there a way to "convert/edit" HDR10 videos recorded on phone to SDR?

You might ask me, "why don't you just record in regular SDR and not HDR10?" For me, at least on the LG G8, I noticed that if I record in HDR10 the usable dynamic range is SIGNIFICANTLY better. For example, the skies are not blown out, and there is rarely anything over-exposed even if I set EV to +1. If I use regular SDR video, things are blown out and look bad.
The ISSUE with HDR10 is that unless I'm playing the video back on my LG G8 or a compatible HDR10 TV, it looks extremely washed out. I simply want to take advantage of the high dynamic range of the HDR10 feature but still allow others to view it easily. Can I do this "in-house" on the phone easily?
I'm not a video guru; however, I know I CAN do this if I import the footage into something like Premiere, apply a preset, and export the render. But those are extra steps I'd prefer not to take.
Thanks!
Hello! You first need to convert it. Read this post on how to convert HDR10+ videos to SDR (standard dynamic range) so they can be viewed on non-HDR devices with bright, not washed-out, colors like the original:
I can't insert a URL, so google "maxvergelli.com how-to-convert-hdr10-videos-to-sdr-for-non-hdr-devices"
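If you end up doing it off the phone anyway, the usual free route is ffmpeg's tone-mapping filters rather than Premiere. A sketch under the assumption that an ffmpeg build with zscale (libzimg) support is available; the hable tonemap and BT.709 target used here are just one common choice.
Code:
import java.io.File

// Tone-map an HDR10 (BT.2020/PQ) clip down to SDR BT.709 so it doesn't look washed out
// on non-HDR screens. Requires an ffmpeg build compiled with libzimg (the zscale filter).
fun hdr10ToSdr(input: File, output: File) {
    val filter = listOf(
        "zscale=t=linear:npl=100",        // to linear light, ~100 nit target
        "format=gbrpf32le",               // float RGB, needed by the tonemap filter
        "zscale=p=bt709",                 // convert primaries to BT.709
        "tonemap=tonemap=hable:desat=0",  // compress highlights into SDR range
        "zscale=t=bt709:m=bt709:r=tv",    // back to BT.709 transfer/matrix, TV range
        "format=yuv420p"
    ).joinToString(",")
    val cmd = listOf(
        "ffmpeg", "-i", input.absolutePath,
        "-vf", filter,
        "-c:v", "libx264", "-crf", "18",
        "-c:a", "copy",
        output.absolutePath
    )
    ProcessBuilder(cmd).inheritIO().start().waitFor()
}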

Question Samsung A-12 moon photos

I recently purchased an A-12 and want to take photos of the moon, but they are very grainy and blurred.
Can anyone advise me on the correct settings to use?
Thanks in advance.......
So I watched a YT video on the subject; the guy said that when you increase the ISO setting you should increase the shutter speed as well.
When I go into Pro mode I can only see three buttons: ISO, WB, and another with a slider from -2 to +2; I think it has something to do with brightness.
How do I adjust the shutter speed?
Thanks in advance.....
In general, as long as you are able to set the ISO you don't need to worry about shutter speed; it is adjusted automatically according to the scene brightness. Just set the ISO to a smaller value and you'll get low noise and slow shutter speeds (long exposures), which, of course, make moving things blur much more, so either a tripod or 4-axis optical stabilization is compulsory for sharp long-exposure pictures. The A12 does not have any optical stabilization at all, so you have to use a tripod. I also recommend setting the exposure adjustment (it's the +/- slider) to a lower value to bring out detail on the moon; otherwise it can end up totally white.
uluruman said:
In general, as long as you are able to set the ISO you don't need to worry about shutter speed; it is adjusted automatically according to the scene brightness. Just set the ISO to a smaller value and you'll get low noise and slow shutter speeds (long exposures), which, of course, make moving things blur much more, so either a tripod or 4-axis optical stabilization is compulsory for sharp long-exposure pictures. The A12 does not have any optical stabilization at all, so you have to use a tripod. I also recommend setting the exposure adjustment (it's the +/- slider) to a lower value to bring out detail on the moon; otherwise it can end up totally white.
Thanks very much uluruman, I will try that......
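For reference, the stock Pro mode on the A12 apparently doesn't expose a shutter-speed control at all; third-party camera apps that do offer one go through Camera2's manual sensor controls (only on devices that report the MANUAL_SENSOR capability). A rough sketch of the relevant part of a capture request, with purely illustrative values; for the moon you would tune the ISO and exposure time by eye.
Code:
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest

// Builds a still-capture request with auto-exposure off and explicit ISO / shutter time.
// ISO 100 and 1/1000 s are example starting points for a bright subject like the moon.
fun buildManualRequest(camera: CameraDevice): CaptureRequest {
    val builder = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
    builder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF)
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, 100)           // ISO
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 1_000_000L)  // 1 ms = 1/1000 s, in nanoseconds
    // A real app would also call builder.addTarget(surface) before building.
    return builder.build()
}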
Why did you buy a low-cost device for that?

Question Astrophotography time lapse question

Just wondering if there's any way I can get an astrophotography time lapse longer than 1 second? I would love to have 60 seconds, but I know it would probably take 4 hours or something.
Just wondering if this is possible, or if there are any third-party apps that might be able to do this (take a longer exposure than the 4 minutes that astrophotography mode takes)?
I don't think it is possible; the astro time-lapse is made up from the individual images that are then stacked for the astro image itself, so you would end up with shedloads of images as well.
Have you tried just using the normal time-lapse option in the video settings?
Exactly, take a normal night video and then slow it down with editing software.
schmeggy929 said:
Exactly, take a normal night video and then slow it down with editing software.
The dude is talking about astrophotography and long-exposure shots for a reason. What good will a "night video" do? And a timelapse is not a slowed-down video. lmao
That is my mistake, I totally read his post wrong.
The thing is, the astro time lapse is made up of the individual shots taken when astrophotography mode is active, so those individual images have been taken at f/1.85. If you just did a normal time lapse using the main lens, the video would still be at f/1.85, and with a bit of post-processing it should work.
The other way around it is to just take a night-mode photo every 30 seconds for 2 hours using a timer and a Bluetooth remote.
MrBelter said:
The thing is, the astro time lapse is made up of the individual shots taken when astrophotography mode is active, so those individual images have been taken at f/1.85. If you just did a normal time lapse using the main lens, the video would still be at f/1.85, and with a bit of post-processing it should work.
The other way around it is to just take a night-mode photo every 30 seconds for 2 hours using a timer and a Bluetooth remote.
You're talking about aperture, which is FIXED and completely irrelevant in this case. It's not like you have a variable aperture on the lens that you can adjust.
What matters in his case is the shutter speed, i.e. the exposure time.
And no, a normal timelapse WON'T work, because the shutter speed will be fast (short exposures) and the phone will try to compensate by pushing the ISO high. You'll end up with very dark scenes and TONS of noise.
And what makes Astro mode so important is the FRAME STACKING. Frame stacking reduces the overall noise and increases the "quality" of the image.
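As an aside, the stacking mentioned above is conceptually simple: alignment aside, averaging N frames of the same scene cuts random sensor noise by roughly the square root of N. A toy sketch of just the averaging step, on grayscale frames stored as float arrays (real astro pipelines also align the frames first):
Code:
// Average-stack N grayscale frames of identical size (pixel values 0.0..1.0).
// Random noise shrinks roughly by sqrt(N); the signal (stars, sky glow) stays put.
fun stack(frames: List<DoubleArray>): DoubleArray {
    require(frames.isNotEmpty()) { "need at least one frame" }
    val out = DoubleArray(frames[0].size)
    for (frame in frames) {
        for (i in out.indices) out[i] += frame[i]
    }
    for (i in out.indices) out[i] /= frames.size
    return out
}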
Deadmau-five said:
Just wondering if there's any way I can get an astrophotography time lapse longer than 1 second? I would love to have 60 seconds, but I know it would probably take 4 hours or something.
Just wondering if this is possible, or if there are any third-party apps that might be able to do this (take a longer exposure than the 4 minutes that astrophotography mode takes)?
Not with the stock camera.
You can try MotionCam Pro for that. It has a timelapse option where you can set the exposure time to as long as 15 seconds.
MotionCam is mainly for RAW video recording, but you can do photos and time-lapses too. The output is absolutely GREAT. You're basically working with RAW video, and the quality is not comparable to ANY other app.
I had one astro timelapse from it but I can't seem to find it now. The weather outside is sh**y right now, so I can't do even a short one. I could do a daylight one so you can see what quality I'm talking about.
Uploaded a screenshot of the viewfinder. As you can see in the screenshot, you can adjust the ISO and shutter speed (among many other things) and do a timelapse.
This is basically taking RAW shots that you can later post-process with various editing software like DaVinci Resolve, Adobe Premiere, Vegas, etc.
What you get is video quality on the level of a DSLR or BETTER, because there is no processing on the phone; it's basically a sequence of RAW DNG images that you can export (render) into a video at the quality of your choice, with YOUR post-processing applied.
Here is one sample I shot and rendered at 4K60 (no color grading, just the stock output).
Keep in mind that this is YouTube; the quality of the original video is FAR better.
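If you'd rather skip Resolve or Premiere for a quick preview, a developed frame sequence (for example, the DNGs already converted to numbered PNG or TIFF files) can be assembled into a clip with ffmpeg. A sketch, assuming ffmpeg is on the PATH and the hypothetical file names follow a frame_0001.png pattern:
Code:
import java.io.File

// Assembles a numbered image sequence into an H.264 clip.
// fps is the playback rate of the finished timelapse, not the capture interval.
fun renderSequence(dir: File, output: File, pattern: String = "frame_%04d.png", fps: Int = 30) {
    val cmd = listOf(
        "ffmpeg", "-framerate", fps.toString(),
        "-i", File(dir, pattern).path,
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        output.absolutePath
    )
    ProcessBuilder(cmd).inheritIO().start().waitFor()
}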
JohnTheFarm3r said:
You're talking about aperture, which is FIXED and completely irrelevant in this case. It's not like you have a variable aperture on the lens that you can adjust.
What matters in his case is the shutter speed, i.e. the exposure time.
And no, a normal timelapse WON'T work, because the shutter speed will be fast (short exposures) and the phone will try to compensate by pushing the ISO high. You'll end up with very dark scenes and TONS of noise.
And what makes Astro mode so important is the FRAME STACKING. Frame stacking reduces the overall noise and increases the "quality" of the image.
I know the aperture is fixed; that's why I said it should work, given that the astrophotography-mode time lapse is made up from the 16 images taken while the mode is active, not from the images once they have been stacked into a single image. Given the way you talk, you of all people should appreciate just how fast f/1.85 is; not a single one of my Canon L lenses is that fast or even comes anywhere close to it.
The OP has nothing to lose by giving it a go before recommending extra software and shooting raw (it is "raw" BTW, if we are getting picky; it isn't an acronym for anything).
MrBelter said:
I know the aperture is fixed; that's why I said it should work, given that the astrophotography-mode time lapse is made up from the 16 images taken while the mode is active, not from the images once they have been stacked into a single image. Given the way you talk, you of all people should appreciate just how fast f/1.85 is; not a single one of my Canon L lenses is that fast or even comes anywhere close to it.
The OP has nothing to lose by giving it a go before recommending extra software and shooting raw (it is "raw" BTW, if we are getting picky; it isn't an acronym for anything).
Where did I say ANYTHING against the fixed aperture of f/1.85? I just said that since it's fixed, it's not relevant to the "settings" he uses, since he CAN'T change the aperture value anyway.
It's not about "losing" anything; it's about understanding, technically, that your recommendation won't work because it doesn't use long-exposure shutter speeds or frame stacking.
Without frame stacking, the noise will be horrible, and there is little you can do in post-processing without completely killing the "details" in the photo by suppressing both luma and chroma noise.
Another thing is that a regular timelapse doesn't push long exposures... it's just not meant to be used for astro, that's all.
Erm, OK fella, but how do you think this was all done before Google and its wonderful computational photography came along?
My point about the aperture is that it is very fast, so it being fixed is not irrelevant at all, given that it is the only chance of this even working. The OP may have tried it at 0.5x or 5x, where the apertures are much slower. The OP has absolutely nothing to lose by giving it a go: it might be crap, you might end up with only the brightest objects in the sky, you might end up with a noisy mush, and yet it might be good fun, who knows.
Sadly, there is always one person who comes along and stomps on the parade because they know best, isn't there?
MrBelter said:
Erm, OK fella, but how do you think this was all done before Google and its wonderful computational photography came along?
My point about the aperture is that it is very fast, so it being fixed is not irrelevant at all, given that it is the only chance of this even working. The OP may have tried it at 0.5x or 5x, where the apertures are much slower. The OP has absolutely nothing to lose by giving it a go: it might be crap, you might end up with only the brightest objects in the sky, you might end up with a noisy mush, and yet it might be good fun, who knows.
Sadly, there is always one person who comes along and stomps on the parade because they know best, isn't there?
It was done in ways whose results were not even close to what we have today. Why use "outdated" methods when we have these VERY capable devices?
The app I suggested is great and has exactly what he is looking for.
Your logic of "how did we do this before X came along" is like saying "let's just ride horses instead of cars, because that's how we did it before". lmao
