HTC Sense 3.0 Camera - Thunderbolt General

I was testing out the HDR options for this and they do not seem to do much at all. There is a difference, but it is VERY minor. I took a shot without HDR, one with HDR, and then one using the free HDR app, and the HDR app simply blew the HTC HDR mode completely out of the water...
Has anyone else had any experience with this? Is there something I may be missing?

HDR always seemed to me like something better left to photo-editing software (like Photoshop). Phone camera filters always seem kind of lackluster, but then again, I assume none of us are taking professional shots with a cell camera.
On topic though, I wouldn't doubt the app being better. Sense stuff isn't exactly known for software quality, and HDR was probably just an afterthought tied into the camera so they could tick a box on some feature sheet.

Absolutely the same as my experience. I used HDR Camera from the Market before this (free), and I really loved its results. When I installed the Sense 3.0 camera and saw the HDR setting, I was happy, until I tried it. My theory is that it is NOT taking multiple pictures, but only doing a single-image adjustment, which is NOT the same thing. HDR Camera doesn't have the issues that some of you described: you don't have to hold rock steady, and my phone is plenty fast enough to take the three photos within about a second. The image settings (color, etc.) are also adjustable.
On the other hand, the panorama mode in the Sense camera is awesome!

yareally said:
HDR always seemed like something better left to photo editing software to me (like photoshop). Phone camera filters always seem kind of lackluster, but then again, none of us are taking professional shots with a cell camera I assume
On topic though, I wouldn't doubt it being better. Sense stuff isn't exactly always perfection as far as software quality and HDR was probably just an afterthought tied into the camera so they could tick off a notch somewhere on some sheet for features.
Software cannot do what HDR does after the fact. HDR captures two (or more) exposures, switching the sensor between low and high gain, then combines the best of both frames to recover detail in the shadows while avoiding overexposure in the bright areas. Editing a single photo later cannot match that, because the extra information captured at the low and high gain settings simply isn't there. Unfortunately, the HDR option is poorly implemented by HTC. If you try HDR on the iPhone, you can see a drastic difference.
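The bracketed-exposure idea above can be sketched in a few lines. This is a toy exposure-fusion example, not HTC's (or anyone's) actual pipeline, which would also align the frames and tone-map the result: it keeps the well-exposed long frame except where it clips, and falls back to the short frame there. Pixel values are normalized to [0, 1], and the 0.8 clipping threshold is an arbitrary choice for illustration.

```python
import numpy as np

def merge_hdr(short_exp, long_exp):
    """Blend a short (low-gain) and long (high-gain) exposure.

    The weight trusts the bright frame everywhere except near
    clipping, where the short frame still holds highlight detail.
    """
    short = np.asarray(short_exp, dtype=np.float64)
    long_ = np.asarray(long_exp, dtype=np.float64)
    # Weight falls from 1 to 0 as the long frame approaches clipping.
    w = 1.0 - np.clip((long_ - 0.8) / 0.2, 0.0, 1.0)
    return w * long_ + (1.0 - w) * short

# Toy 1-D "scene": a shadow pixel, a midtone, and a clipped highlight.
short = np.array([0.01, 0.10, 0.40])  # underexposed, keeps highlights
long_ = np.array([0.08, 0.80, 1.00])  # good shadows, blown-out sky
print(merge_hdr(short, long_))        # shadows/midtones from the long
                                      # frame, highlight from the short
```

A single JPEG has already thrown away the clipped highlight, which is why no after-the-fact filter can recover it.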

Actually, I use Pro HDR on my Evo, and it takes fantastic pictures in true HDR. It is a touch slower than a pro camera, but it does actually meter the scene and adjust the exposure accordingly (the aperture on a phone lens is fixed). It then allows adjustments to the composite image and lets you save the final shot as well as the source images, so you can do further editing elsewhere. Outside of some cropping, I haven't had problems.
With that said, I would be curious to know what the deal is with Sense. What's interesting is that if you put it in HDR mode, it shows a little icon in the top right with multiple images, like whoever designed it at least understood how true HDR *should* operate. Then, if you press and release it instead of the regular shutter button, it beeps, you hear the lens move, and the image focuses itself. So... I don't know. I don't see a noticeable effect in the resulting images at all...


[Q] 3rd party camera apps vs. stock

What's your experience with 3rd-party camera apps such as Snap Camera, A Better Camera, Camera Zoom FX, etc., in comparison to the stock Google camera app? Do they perform any better in point-and-shoot scenarios? Some of these apps recently got support for the Camera2 API on the Nexus 5. Does that actually change anything besides RAW support and 4K video recording?
I use "A Better Camera" and shoot in "Super" mode. I took several pictures with the Google camera in various levels of light, on normal and HDR, and took the same shots with A Better Camera using normal, HDR, and Super. Super looks hands-down the best. When you zoom all the way in, you can easily see that A Better Camera's Super shots contain the most detail and the most accurate colors. A Better Camera's HDR also beats the stock camera's HDR in detail. For low light, A Better Camera destroys the Google Camera.
Edit: Also, A Better Camera focuses better than the stock camera and doesn't need to refocus every few seconds.
I do not use any post processing.
DeathKoil said:
I use "A Better Camera", and shoot in "Super" mode. I took several pictures with the google camera in various levels of light, on normal and HDR, and took the same with A Better Camera using normal, HDR, and Super. Super looks hands down the best. When zooming the pictures in all of the way, you can easily see that A Better Camera's pictures taken as Super contain the most detail and most accurate colors. A Better Camera's HDR also beats the stock camera HDR in detail. For low light A Better Camera destroys the Google Camera.
Edit: Also, A Better Camera focuses better than the stock camera, and doesn't need to refocus every several seconds.
I do not use any post processing.
Just gave it a shot and you're right!! Thanks for the suggestion!
I installed it and it certainly takes a better picture, but I cannot find any "super" setting.
Where is the super setting?
You need premium
Also, do I need premium for it to zoom? I can't get it to zoom like the regular camera does.
You shouldn't need premium for the Super setting. Tap the circle at bottom right to get shooting mode settings, and Super should be the tile in the middle. For zoom, swipe down from the top to get the settings menu, tap More settings (at bottom), tap Viewfinder settings, tap More, and make sure Show zoom control is checked. Then, when you go back to the main (picture-taking) screen, swipe from left to right to bring the zoom slider out.
Thank you for the clear explanations.
I found the zoom control hidden on the left. I had been trying to zoom with a two-finger pinch like in other apps, but that doesn't work here.
But Super is not there. I purchased the app, but as you can see in the screenshot, there is no Super...
Odd... yours has Night where mine has Super (2nd screenshot). Super was there for me in the free version too.
Well, at least I'm not going blind.
I have the latest version. Just purchased it.
I thought I edited my post to say I'd figured it out, but something didn't save... Anyway, if you are running Lollipop, pull down the settings menu, tap More settings, tap General settings, tap More, then make sure Use Camera2 interface is checked. Then you will have the Super option available. (If you're still running KitKat, that won't work, since KitKat doesn't support the Camera2 API.)
Thank you.
I'm still on KitKat, so no Super mode for me then.

Take better low-light pictures using Google Camera and HDR mode

Hi Guys,
I wanted to let you all know that I installed Google Camera because I was not particularly impressed with the noisy low-light photos from the stock camera app.
I was pleasantly surprised by the results after enabling HDR mode: low-light shots are vastly better than the stock camera's in HDR mode, and photos taken in HDR mode are also saved much faster than with the stock camera.
Please note that Google Camera doesn't show up for me in the Play Store, so I sideloaded version 3.1.025. It's not all rosy though: video mode is not working for me, and there are occasional force-closes if you press the home button right after taking a picture.
But despite all that, I honestly can't go back to the stock camera for low-light photos. I'd suggest you all give it a try and let me know your thoughts.
Cheers.
Attaching some low-light pictures taken with Google Camera. I don't have stock-app photos side by side for comparison, sorry.
I have found low light with stock RAW very good, as you can control noise levels and lighting; even the simple Gallery app's DNG post-processing produces very good results.
vegetaleb said:
I have found low light with stock RAW very good as you can control noise levels and lights, even the simple gallery app DNG post treatment make very good results
Thanks for the tip mate, I'll try it out. I'm inclined towards Google Camera because, without any post-processing, the images have less noise than the stock camera's. I'll try to post some better pics; I admit the ones I've put up don't make a strong case, because I was just trying the app out.
Cheers.

The image quality of the camera is a LOT worse than advertised

Hello there, other S10e owners,
Recently, I took advantage of an early Black Friday sale to get an S10e, the phone I had been craving for a while. They said the image quality would be on par with the Pixel 3 after the May update, but in reality it's just as noisy and "oil-paintingish" as my Xperia Z1 Compact from four (!!!) years ago.
The samples are in the attachment of this forum post; the first and third images were taken with the Google Camera app, and the second and fourth with the Samsung Camera app. The first two images were taken with HDR, the latter two without.
As you can see in the non-HDR pictures, the Samsung Camera shot has less noise, but cold, washed-out colours and less detail thanks to the aggressive noise reduction.
On the other hand, the Google Camera image has more accurate colour reproduction and more detail, but at the cost of noise reduction.
With HDR+, though, things get even worse: the Samsung Camera app blurs out even more detail, and the Google Camera app just cranks the ISO up to levels that make the image too bright.
In conclusion, the image quality is not at the promised level, but I hope there's something I can do about it (like a good setup.xml file that makes GCam shoot exceptionally good images in both sunlight and low light, or a fix for the Samsung Camera app). So, to fix this, I need your help. Please give me advice on bringing the image quality back up to the levels I saw in camera reviews.
Edit: Oops, I forgot to mention that I have the Exynos version.
ThePS4Gamer said:
Hello there, other S10e owners,
Recently, I took advantage of an early Black Friday sale to get an S10e, the phone I was craving for a while. They said the image quality is on par with the Pixel 3 after the May update, but in reality, it's just as noisy and "oil-paintingish" as my Xperia Z1 Compact from four (!!!) years ago.
The samples are in the attachment of this forum post; the first and the third image was taken with the Google Camera app, and the second and forth one with the Samsung Camera app. The first two images were taken with HDR, but the latter ones without HDR.
As you can see in the non-HDR pictures, the Samsung Camera picture has less noise, but cold, washed-out colours and less detail thanks to the aggressive noise cancellation.
On the other hand, the Google Camera image has the correct colour representation and more detail, but with the sacrifice of noise cancellation.
With HDR+, though... Things get even worse. The Samsung Camera app now blurs out even more detail and the Google Camera app just cranks up the ISO to levels that make the image too bright.
In conclusion, the image quality is not on the promised levels, but I hope there's something I can do with it (like a good setup.xml file that can fix GCam to shoot exceptionally good images in both sun- and low light or fixing the Samsung Camera app). So, in order to fix this, I need your help. Please give me advice to pimp back up the image quality to the levels I saw in camera reviews.
Edit: Oops, I forgot to mention that I have the Exynos version.
Please give me the Gcam download link. I couldn't find any Gcam version that worked well on my S10e Exynos. Thanks
Julyh0rse.ManU said:
Please give me the Gcam download link. I couldn't find any Gcam version that worked well on my S10e Exynos. Thanks
If you can do something with it, here's the link: GCam Exynos APK
In sunlight, it should theoretically work with the stock settings, but they say it's recommended to change the auto-exposure correction to 1/2 sec, the Night Sight correction to 1/4 sec, and the exposure compensation to -3.0 to get the best out of daytime GCam photos.
For Night Sight images, it's recommended to use the mackytravel-nightsight.xml file without any changes for outdoor night photos. For indoors, turn off auto-exposure correction and exposure compensation, change the Night Sight correction to 1 or 2 sec (depending on the level of darkness), and turn off the ISO limit in the Advanced drop-down menu. This way, you can get more detail out of the S10e's capable camera sensors.
Lastly, if you'd like to shoot something with the wide-angle lens, just change the auxiliary-camera switching method to long-press.
You can find my GCam config folder with all of these configuration files at the link here.
Just install the GCam APK, copy the config folder to the root of your internal storage (/storage/emulated/0/), and double-tap the black area around the shutter button to choose between the three configuration files.
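If you have a computer handy, the copy step can also be done over adb instead of by hand. A minimal sketch; the local folder name `GCamConfigs` is hypothetical, the device path is the internal-storage root mentioned above, and it assumes adb is on your PATH with USB debugging enabled:

```python
import subprocess

CONFIG_DIR = "GCamConfigs"             # hypothetical local folder name
DEVICE_ROOT = "/storage/emulated/0/"   # internal-storage root on the phone

def push_cmd(local_dir=CONFIG_DIR, remote=DEVICE_ROOT):
    """Build the adb command that copies the config folder to the phone."""
    return ["adb", "push", local_dir, remote]

def push_configs():
    """Run the push; raises CalledProcessError if adb fails."""
    subprocess.run(push_cmd(), check=True)

print(" ".join(push_cmd()))
```

Running `push_configs()` is equivalent to typing `adb push GCamConfigs /storage/emulated/0/` in a terminal.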
I've noticed a similar thing with the camera on my SD 855 model: images just don't look very natural or realistic. They look very much like a painting!
Currently I get better results with HDR on and Scene Optimiser turned off.
My iPhone 7's photos still look a bit more natural at times, but obviously less detailed.

Can an additional camera sensor attached via USB-C be read by apps via current APIs?

OK... so, totally crazy idea. Hypothetically speaking, what if there were a way to attach a much larger camera sensor (e.g. APS-C size with an E-mount) to a fast SD865 phone (or a future one) via USB-C? Would it then be possible to have camera apps read data from it via the current APIs? Obviously there are a lot of steps I'm missing here, but the biggest weakness in phone cameras is the sensor, and there is simply no physical way to fit an APS-C-sized one in a phone. The lenses would be humongous.
That said, at SOME times, especially for those who are more serious about photography, being able to attach your phone to a big sensor would give you gear superior to anything that exists right now. Combining existing computational techniques and a fast processor WITH a large sensor does not exist yet; it's one or the other, and no one has tried to do both (there are some old attempts like the Samsung Galaxy NX, and currently Zeiss, but it doesn't seem like they are going to take advantage of computational tech).
Benefits:
- superior HDR with AI (most DSLRs have multi-shot HDR stacking, but it's not as advanced as Google's)
- potentially AI HDR in video footage (no DSLR has this; it's done in post)
- enhanced artificial bokeh on top of already good optical bokeh, to simulate a medium-format look
- immediate access to mobile Lightroom / sharing directly to other services
- all media creation and your whole library in one place
- superior EIS for stable footage (again, no DSLR has good EIS tech; the focus there is on IBIS and OIS, which mainly benefit photos). Ever tried an S20 or iPhone 11 Pro at night? It's a noise party. Here you would get clean 4K footage with gimbal-like EIS
It's true that a lot of the above can be done in POST when shooting with large-sensor cameras.
Thoughts?

Question Astrophotography time lapse question

Just wondering if there's any way I can get an astrophotography time lapse longer than 1 second? I would love to have 60 seconds, but I know it would probably take 4 hours or something.
Is this possible, or is there a third-party app that might be able to do it (take a longer exposure than the ~4 minutes that astrophotography mode uses)?
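For what it's worth, the 4-hour guess checks out on the back of an envelope, assuming each astro capture runs about 4 minutes (as described above) and contributes roughly 1 second to the resulting clip; the 1-second figure is an assumption taken from the question:

```python
MINUTES_PER_ASTRO_SHOT = 4   # ~4-minute astro capture, per the post
CLIP_SEC_PER_SHOT = 1        # assumed: each capture yields ~1 s of clip

target_clip_sec = 60
shots_needed = target_clip_sec // CLIP_SEC_PER_SHOT
total_hours = shots_needed * MINUTES_PER_ASTRO_SHOT / 60

print(shots_needed, total_hours)  # 60 shots, 4.0 hours of shooting
```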
I don't think it is possible; the astro time lapse is built from the same individual frames that get stacked into the astro image itself, so you would end up with loads of images as well.
Have you tried just using the normal time-lapse option in the video settings?
Exactly, take a normal night video and then slow it down with editing software.
schmeggy929 said:
Exactly, take a normal night video and then slow it down with editing software.
The dude is talking about astrophotography and long-exposure shots for a reason. What good will a "night video" do? And a time lapse is not slowing down the video. lmao
That is my mistake, I totally read his post wrong.
Thing is, the astro time lapse is made up of the individual shots taken while astrophotography mode is active, and those individual images are taken at f/1.85. If you just did a normal time lapse using the main lens, the video would still be at f/1.85, and with a bit of post-processing it should work.
The other way around it is to just take a night-mode photo every 30 seconds for 2 hours using a timer and a Bluetooth remote.
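To put numbers on that 30-second-interval idea (the playback frame rate is an assumption, since the post doesn't specify one):

```python
INTERVAL_SEC = 30      # one night-mode shot every 30 seconds
DURATION_HOURS = 2     # shooting window from the suggestion above
PLAYBACK_FPS = 30      # assumed playback rate of the finished clip

frames = DURATION_HOURS * 3600 // INTERVAL_SEC
clip_seconds = frames / PLAYBACK_FPS

print(frames, clip_seconds)  # 240 frames -> an 8.0 s clip at 30 fps
```

So two hours of shooting gives a short clip; halving the interval or the playback rate stretches it accordingly.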
MrBelter said:
Thing is the astro time laps is made up of the individual shots taken when Astrophotography mode is active so those individual image have been taken at f1.85, if you just did a normal time lapse using the main lens the video will still be at f1.85 and with a bit of post processing it should work.
The other way around it is to just take a night mode photo every 30 seconds for 2 hours using a timer and a Bluetooth remote.
Click to expand...
Click to collapse
You're talking about aperture, which is FIXED and completely irrelevant in this case. It's not like you have a variable aperture on the lens that you can adjust.
What matters in his case is the shutter speed and the exposure time.
And no, a normal time lapse WON'T work, because the shutter speed will be short (fast) and the phone will try to compensate by pushing the ISO high. You'll end up with very dark scenes and TONS of noise.
And what makes astro mode so important is the FRAME STACKING. Frame stacking reduces the overall noise and increases the "quality" of the image.
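The noise benefit of frame stacking is easy to demonstrate with a toy simulation. This assumes independent Gaussian sensor noise on a uniform patch (real astro stacking also has to align frames for star motion): averaging 16 frames cuts the noise by about sqrt(16) = 4x.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = 0.2                                   # dim, uniform "night sky" patch
frames = scene + rng.normal(0.0, 0.05, size=(16, 100_000))

single_noise = frames[0].std()                # noise of one exposure
stacked_noise = frames.mean(axis=0).std()     # noise after averaging 16 frames

print(single_noise / stacked_noise)           # close to sqrt(16) = 4
```

That ~4x noise reduction is exactly what a single long ISO-boosted exposure in a regular time lapse never gets.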
Deadmau-five said:
Just wondering if there's any way I can get an astrophotography time lapse greater than 1 second? I would love to have 60 seconds, but I know it would probably take 4 hours or something.
Just wondering if this is possible or there's any third party apps that might be able to do this (take a longer exposure than the 4 minutes that astrophotography takes)?
Not with the stock camera.
You can try MotionCam Pro for that. It has a time-lapse option where you can set the exposure time as long as 15 seconds.
MotionCam is mainly for RAW video recording, but you can do photos and time lapses too. The output is absolutely GREAT. You're basically working with RAW VIDEO, and the quality is not comparable to ANY other app.
I had one astro time lapse from it, but I can't seem to find it now. The weather is too bad outside right now to shoot even a short one; I could do a daylight one so you can see what quality I'm talking about.
Uploaded a screenshot of the viewfinder. As you can see in the screenshot, you can adjust the ISO and shutter speed (among many other things) and do a time lapse.
This is basically taking RAW shots that you can later post-process with editing software like DaVinci Resolve, Adobe Premiere, Vegas, etc.
What you get is video quality on the level of a DSLR, or BETTER, because there is no on-phone post-processing involved; it's basically a sequence of RAW DNG images that you can export (render) into a video at the quality of YOUR choice, with YOUR post-processing.
Here is one sample I shot and rendered at 4K60 (no color grading, just the stock output).
Keep in mind that this is YouTube; the quality of the original video is FAR better.
JohnTheFarm3r said:
You're talking about Aperture that is FIXED and completely irrelevant in this case. It's not like you have a variable aperture on the lens so you can adjust it.
What matters in his case is the shutter speed and the exposure time.
And no, normal timelapse WON'T work because the shutter speed will be low (fast) and the phone will try to compensate by pushing the ISO high. You'll end up with very dark scenes and TONS of noise.
And what makes Astro mode very important is the FRAME STACKING. Frame stacking reduces the overall noise and increases the "quality" of the image.
I know the aperture is fixed; that's why I said it should work, given that the astrophotography-mode time lapse is made up of the 16 images taken while the mode is active, not of the single stacked result. Given the way you talk, you of all people should appreciate just how fast f/1.85 is; not a single one of my Canon L lenses is that fast or even comes anywhere close.
The OP has nothing to lose by giving it a go before installing extra software and shooting raw (and it is "raw", by the way, if we're being picky; it isn't an acronym for anything).
MrBelter said:
I know the aperture is fixed that's why i said it should work given the astrophotography mode time lapse is made up from the 16 images taken when the mode is active and not once the images have been stacked in to a single image. Given the way you talk you of all people should appreciate just how fast f1.85 is, not a single one of my Canon L lenses is that fast or even comes anywhere close to it.
The OP has nothing to lose by giving it a go before recommending extra software and shooting raw (it is raw BTW if we are getting picky, it isn't an acronym for anything).
Where did I say ANYTHING against the fixed f/1.85 aperture? I just said that since it's fixed, it's not relevant to the "settings" he uses, since he CAN'T change the aperture value anyway.
It's not about "losing" anything; it's about understanding the technical part: your recommendation won't work because it doesn't use long-exposure shutter speeds or frame stacking.
Without frame stacking, the noise will be horrible, and there is little you can do in post-processing without completely killing the "details" in the photo by suppressing both luma and chroma noise.
Another thing is that a regular time lapse doesn't push long exposures... It's just not meant to be used for "astro", that's all.
Erm, OK fella, but how do you think this was all done before Google and its wonderful computational photography came along?
My point about the aperture is that it is very fast, so it being fixed is not irrelevant at all, given that it is the only chance of this even working. The OP may have tried it at 0.5x or 5x, where the apertures are much slower. The OP has absolutely nothing to lose by giving it a go: it might be crap, you might end up with only the brightest objects in the sky, you might end up with noisy mush, and yet it might be good fun. Who knows?
Sadly, there's always one person who comes along and stomps on the parade because they know best, isn't there?
MrBelter said:
Erm ok fella but how do you think this was all done before Google and its wonderful computational photography came along?
My point about the aperture is it is very fast so it being fixed is not irrelevant at all given it is the only chance of this even working, the OP may have tried it at 0.5x or 5x where the apertures are much slower, the OP has absolutely nothing to lose by giving it a go, it might be crap, you might end up with only the brightest objects in the sky, you might end up with a noisy mush and yet it might be good fun who knows.
Sadly there is always one person that comes along and stomps on the parade because they know best though isn't there?
It was done with results that were not even close to what we have today. Why use "outdated" methods when we have these VERY capable devices?
The app I suggested is great and has exactly what he is looking for.
Your logic of "how did we do this before X" is like "let's just ride horses instead of cars because that's how we did it before". lmao
