
HDR photography app?

I didn't know what an HDR photo was until now (Google is my friend) and I must say it's very interesting indeed!

I've had a look on the Market too but couldn't find anything. A good suggestion, however, would be to contact the developers of some camera apps and ask them to include it; maybe even contact PicSay and ask them. This is a feature I would definitely like.

Carl
 
Upvote 0
You take a high-contrast and a low-contrast picture and it merges the two together for the HDR photo...


Actually, it has nothing to do with contrast. HDR (High Dynamic Range) photography takes a series of shots (most commonly three) of the same scene at different exposure settings and then blends them together to add more color, depth, and range to the photo.
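To make the merge step concrete, here's a minimal sketch of the idea (the `ExposureFusion` class and its weighting curve are my own illustration, not from any real app): three bracketed 8-bit grayscale shots are blended per pixel, with each sample weighted by how close it is to mid-gray, so the well-exposed shot dominates wherever another shot is blown out or crushed.

```java
// Toy exposure fusion: blend three bracketed shots of the same scene,
// favoring whichever exposure captured each pixel closest to mid-gray.
public class ExposureFusion {
    // Weight peaks at mid-gray (127.5) and falls off toward 0 and 255.
    static double weight(int v) {
        double d = (v - 127.5) / 127.5;   // normalize to -1..1
        return Math.exp(-4.0 * d * d);    // Gaussian-ish well-exposedness
    }

    /** Fuse three equally sized 8-bit luminance arrays into one. */
    public static int[] fuse(int[] under, int[] normal, int[] over) {
        int[] out = new int[normal.length];
        for (int i = 0; i < normal.length; i++) {
            double wu = weight(under[i]);
            double wn = weight(normal[i]);
            double wo = weight(over[i]);
            double sum = wu + wn + wo;
            if (sum == 0) sum = 1e-9;     // avoid divide-by-zero
            out[i] = (int) Math.round(
                (wu * under[i] + wn * normal[i] + wo * over[i]) / sum);
        }
        return out;
    }
}
```

Real HDR apps do this in a linear radiance space with alignment and tone mapping on top, but the weighting-and-blending core is the same shape.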

Here are some examples...

[attached images: 2506567487_99480544cb.jpg, original-comparison.jpg]

And here are some other HDR photos...

[attached images: 20080814151052_london_hdr_night.jpg, Boston_At_Night_HDR.jpg, dublin_photowalk_ship_hdr.jpg]

HDR can also be used (just by tweaking the adjustments) to add effects to pictures to make them look surreal.
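The surreal look comes largely from the tone-mapping step, which compresses a huge range of scene brightnesses into displayable values; pushing its settings is what gives those otherworldly images. A tiny illustration (the `ToneMap` class is hypothetical) using the well-known global Reinhard operator L/(1+L), which squeezes highlights far more than shadows:

```java
// Toy global tone mapper: the Reinhard curve L/(1+L) maps arbitrarily
// large linear scene luminances into 0..1, then scales to 8-bit output.
public class ToneMap {
    /** Map a linear scene luminance (0..infinity) to a display value 0..255. */
    public static int toDisplay(double sceneLuminance) {
        double mapped = sceneLuminance / (1.0 + sceneLuminance); // always < 1
        return (int) Math.round(255.0 * mapped);
    }
}
```

A scene luminance of 1 lands at mid-gray while a luminance nine times brighter only gets ~80% closer to white, which is exactly the highlight compression that makes HDR images look "flattened" or surreal when pushed hard.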

Examples of such...

[attached images: hdrnight-viewfinder.jpg, 2357415727_231b1a69dd.jpg, set01hdri5from__dsc3596.jpg]


Now, I know you are checking the dates of the posts at this point and thinking "This post is more than 3 months old... why the hell was it bumped??" The short answer... because an HDR app, IMO, could be an enormous step forward for a lot of android phones that have absolutely horrible cameras, yet no devs have currently stepped forward to give this a shot and create an app to do so. At the risk of sounding like a fanboy (and I assure you, I am not)... I know of quite a few HDR apps for the iPhone that do this. Why don't we have one for android yet?
 
Upvote 0
HDR on a mobile phone has a couple of things going against it:

First, and most important, is whether or not the phone's API supports exposure compensation. I just went through the Android reference, and it doesn't look like it does. That doesn't mean it won't be added in the future, but it might also be that the hardware makes the call about exposure and the OS doesn't get any say over it.

Second would be the rate at which the phone's camera can be made to shoot a sequence of photos with exposure changes between shots. That's going to be important because the camera needs to stay put during the entire sequence or the images won't combine properly.

My SLR can reel off eight frames per second, or 3/8 second to do a sequence of three shots. That's an ideal-world figure because it doesn't factor in the time the shutter is open on longer exposures or other delays like time spent allowing the mirror to settle to keep vibration down. The rule of thumb for blur-free hand-holding is that you don't want the shutter open any longer than the reciprocal of the focal length of the lens. (For example, if you have a 50mm lens, you don't want to go any longer than 1/50 second.) So for a movement-free sequence that lasts 3/8 second, you'd need a 35mm-equivalent focal length of 8/3, or about 2.7 mm, which is insanely wide. You can't use lenses with image stabilization, either, because the correction they apply doesn't guarantee the image will be in the exact same position every time.
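Spelling that arithmetic out (the `HandholdRule` helper is just an illustration of the rule of thumb in the post): three shots at 8 fps span 3/8 of a second, and treating that whole span as one "exposure" under the reciprocal rule caps the focal length at 8/3 ≈ 2.7 mm.

```java
// The reciprocal handholding rule of thumb, applied to a burst:
// treat the whole burst duration as the effective shutter time.
public class HandholdRule {
    /** Longest hand-holdable shutter time (seconds) for a focal length in mm. */
    public static double maxShutterSeconds(double focalLengthMm) {
        return 1.0 / focalLengthMm;
    }

    /** Widest 35mm-equivalent focal length (mm) safe for a whole burst. */
    public static double maxFocalLengthMm(int frames, double framesPerSecond) {
        double burstSeconds = frames / framesPerSecond; // e.g. 3 / 8 = 0.375 s
        return 1.0 / burstSeconds;                      // e.g. ~2.67 mm
    }
}
```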

I doubt the camera in the phone can move that fast. Figure at best two frames per second under the control of the CPU, and you're into 1.5 seconds. My SLR weighs a ton and can sometimes be hand-held for long periods, but light things like phones don't lend themselves to remaining stable.

Sorry for the long-winded explanation.

--Mark
 
Upvote 0
HDR on a mobile phone has a couple of things going against it: ... Sorry for the long-winded explanation.


Hmm. A bit off topic, but which SLR do you have? As far as DSLRs go, I only know of two in existence capable of 8 fps at full resolution: the Canon EOS 7D and the Nikon D2H.


In any event, there are a couple of things in your post which just don't make any sense at all. True enough, you would have to hold the camera perfectly still for the whole sequence shot, but during daytime shooting, that shouldn't be overly tough. The lower the light, the more challenging it gets.

This part...

but it might also be that the hardware makes the call about exposure and the OS doesn't get any say over it.
Makes zero sense. Even with the manual controls on DSLR and SLR cameras alike, the camera relies on user input to adjust settings. On auto, the electronics regulate the controls. There is no physical way whatsoever for a purely mechanical item to adjust settings to compensate for changing conditions. Therefore, there MUST be something within the app/API itself which controls it.

The second thing I feel the need to point out is that this is a phone app. Granted, HDR photography is all basically the same... but you are bringing way too much technicality into something relatively simple since, as you yourself pointed out, it is just a phone camera. Don't get me wrong... I am an avid photographer as well (judging by your 8 fps SLR comment, I am going to assume you have a $1,500+ SLR, meaning you are heavily into photography). All I am saying is that your post is essentially the same as someone looking at that stupid little autotune app for iPhone and comparing it to the $1,000+ Auto-Tune hardware/software used in recording studios. You see what I mean? While they are essentially the same, you'll never really end up with REAL HDR photos like you can take with a quality DSLR. But that doesn't change the fact that it could still be done (as it has been done with the iPhone), and whatever the outcome, it would most likely be a major step up in quality from regular photos taken with the phone.

Here are a couple of pictures floating around the web that compare stock iPhone photos to iPhone photos taken with an HDR app.

[attached images: ss0.jpg, HDR-image.jpg, hdr-iphone-screenshot3.jpg]


Granted... even the HDR version isn't up to par with a real camera... but just look at the difference between the stock and HDR shots taken with the phone. IMO, it would be worth it. It would still be an improvement.


Here is an interesting little video too. (again... iPhone).

http://www.youtube.com/watch?v=K-sRaa_gSIA
 
Upvote 0
Hmm. A bit off topic, but which SLR do you have? As far as D-SLR's, I only know of two in existence capable of 8fps at full resolution. The Canon EOS D7 and Nikon D2h.

I'm currently shooting a D300, which can do 8 FPS if you have an EN-EL4a battery installed in the vertical grip. The D700 can do the same, the D3 can do 9 and the D3s can do 11 if you turn off AF tracking. I'm pretty sure the current iteration of the Canon 1D is in the same neighborhood, but being a Nikon guy I don't keep that close track of what Canon is up to.

And I thought my D1 could blast 'em out at 4.5 per second... :rolleyes:

True enough, you would have to hold the camera perfectly still for the whole sequence shot, but during daytime shooting, that shouldn't be overly tough. The lower the light, the more challenging it gets.
I brought frame rate into it because you'd have to hold the camera still from the start of the first exposure until the end of the last or the images won't line up. On the other hand, the short focal length will keep blur down and misalignment can be corrected for in software as at least one of the iPhone apps in the video you posted appears to do.
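That software alignment step can be as simple as a brute-force search over small integer shifts, keeping whichever offset makes the frames agree best. A sketch (the `FrameAlign` class is my own illustration, with frames as grayscale int arrays; real apps use pyramids or feature matching to keep this fast):

```java
// Toy frame alignment: try every shift within +/-maxShift and keep the
// one minimizing the sum of absolute differences (SAD) over the overlap.
public class FrameAlign {
    /** Returns {dx, dy} such that frame[y+dy][x+dx] best matches ref[y][x]. */
    public static int[] bestShift(int[][] ref, int[][] frame, int maxShift) {
        int h = ref.length, w = ref[0].length;
        long bestScore = Long.MAX_VALUE;
        int[] best = {0, 0};
        for (int dy = -maxShift; dy <= maxShift; dy++) {
            for (int dx = -maxShift; dx <= maxShift; dx++) {
                long sad = 0, count = 0;
                for (int y = 0; y < h; y++) {
                    for (int x = 0; x < w; x++) {
                        int ys = y + dy, xs = x + dx;
                        if (ys < 0 || ys >= h || xs < 0 || xs >= w) continue;
                        sad += Math.abs(ref[y][x] - frame[ys][xs]);
                        count++;
                    }
                }
                if (count == 0) continue;
                // Scale by overlap so small overlaps aren't unfairly cheap.
                long score = sad * ((long) h * w) / count;
                if (score < bestScore) {
                    bestScore = score;
                    best = new int[]{dx, dy};
                }
            }
        }
        return best;
    }
}
```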

The pitfall with me is that I started shooting pictures as a kid in the mid 1970s, learned to do developing and printing not long afterward, and will have been doing digital for a decade later this spring. I still do some things the old-fashioned way, even when they're more work, and I know somewhere in the back of my head that we can do lots more of this stuff in software.

Even with use of manual controls on DSLR and SLR cameras alike, the cameras rely on the user input to adjust settings... There is no physical way what-so-ever for a mechanical item to adjust settings to compensate for changing conditions. Therefor, there MUST be something within the app/api itself which controls it.
You're going on the assumption that the ability to manipulate those things is brought all the way out to a place where Android applications can get at it. Take a look at the Android camera API (look at the Camera.Parameters class) and you'll see that there's a limited list of things an application can tweak, none of which cover exposure control.

My Eris runs HTC Sense, and its camera application has a "brightness" adjustment that does appear to adjust the pre-A/D gain up and down. What it doesn't do is adjust the metering according to where the focus point is, which is something the iPhone's camera does and is what makes applications like TrueHDR possible. At any rate, this exposure adjustment is a Sense-specific feature, and even if you could write a program to use it (HTC isn't forthcoming with documentation or source code), it would only run on phones with Sense.

I'd be interested to know if the metering on phones running stock Android behaves the same way.

The second thing I feel the need to point out is that ... it is just a phone camera.
You have a very good point. You're right; I was overcomplicating it. People do expect a lot more out of lower-end gadgets than they used to, and sometimes they deliver. I got the impression the OP (and you, actually) were after something where you could press the button, have the camera blast off three exposure-compensated frames and process them into a high-quality HDR image, all without any additional interaction.

Come to think of it, I'd love that myself.

--Mark
 
Upvote 0
You're going on the assumption that the ability to manipulate those things is brought all the way out to a place where Android applications can get at it. Take a look at the Android camera API (look at the Camera.Parameters class) and you'll see that there's a limited list of things an application can tweak, none of which cover exposure control.

google could add api support for stuff like exposure control in froyo or gingerbread or whatever comes after
 
Upvote 0
I'm currently shooting a D300, which can do 8 FPS if you have an EN-EL4a battery installed in the vertical grip. The D700 can do the same, the D3 can do 9 and the D3s can do 11 if you turn off AF tracking. I'm pretty sure the current iteration of the Canon 1D is in the same neighborhood, but being a Nikon guy I don't keep that close track of what Canon is up to.

I don't like you. Get out... and leave your camera here when you leave. :D I want a new Nikon sooooooo bad. I am still shooting with an Olympus E510 which is, in its own right, a great camera... but I am a Nikon guy too. I went with the Oly because for the price range ($500), it had better manual controls, better build quality, more features, and much better glass than anything in its range. My next camera will be a Nikon, though. (Back to Nikons. :))


I brought frame rate into it because you'd have to hold the camera still from the start of the first exposure until the end of the last or the images won't line up. On the other hand, the short focal length will keep blur down and misalignment can be corrected for in software as at least one of the iPhone apps in the video you posted appears to do.

Right... all I was pointing out, though, is that this wouldn't really come into play with an HDR app for these cameras. The sensors are so insanely small anyway that they really don't work well in dimmer light, and extended exposure times wouldn't be needed in daylight. It would actually happen very quickly, so I think the risk of ruining the shot by moving would be absolutely minimal.

The pitfall with me is that I started shooting pictures as a kid in the mid 1970s, learned to do developing and printing not long afterward and will have been doing digital for a decade later this spring. I still do some things the old-fashioned way, even when they're more work and I know somewhere in the back of my head that we can do lots more stuff of this stuff in software.

You have me beat by a decade (give or take a few years). I started photography around '86ish (I was 5).


You're going on the assumption that the ability to manipulate those things is brought all the way out to a place where Android applications can get at it. Take a look at the Android camera API (look at the Camera.Parameters class) and you'll see that there's a limited list of things an application can tweak, none of which cover exposure control.
You are right... that is what I am assuming. Or at least, I am assuming there is something somewhere that you can tweak to make this work. Looking at the API won't help me, though. It would be like trying to read a Russian forum; I wouldn't have the first clue what I was looking at. I have a lot of ideas, but unfortunately I am not a developer. That's why the best I can do is to post my ideas on the forum like I have this one. And to be fair... this wasn't even really MY idea. It was stolen from the iCrap collection.

My Eris runs HTC Sense, and its camera application has a "brightness" adjustment that does appear to adjust the pre-A/D gain up and down. What it doesn't do is adjust the metering according to where the focus point is, which is something the iPhone's camera does and is what makes applications like TrueHDR possible. At any rate, this exposure adjustment is a Sense-specific feature, and even if you could write a program to use it (HTC isn't forthcoming with documentation or source code), it would only run on phones with Sense.

I'd be interested to know if the metering on phones running stock Android behaves the same way.

I still maintain that there HAS to be a control for metering. Otherwise there would only be one setting; you would end up with black pictures on overcast days and overexposed pictures on sunny days. And if there is a way for the phone to adjust metering... which there has to be... then I would think there also has to be a way to gain control of it through an app or a rewrite of some part of the source code.

You have a very good point. You're right; I was overcomplicating it. People do expect a lot more out of lower-end gadgets than they used to, and sometimes they deliver. I got the impression the OP (and you, actually) were after something where you could press the button, have the camera blast off three exposure-compensated frames and process them into a high-quality HDR image, all without any additional interaction.

Come to think of it, I'd love that myself.

--Mark

lol yeah, I figured that was where you were coming from. Like you said, though... I doubt 3 pictures would be possible. It surely wouldn't be high quality (as we know it) in any event. But if it improves picture quality, then it would still be a much needed app for my collection. :D
 
Upvote 0
I don't like you. Get out... and leave your camera here when you leave. :D

Okay, but it might be a D1. Or an N50. Yeah, how'd you like an N50? :D <Slams Door>

I still maintain that there HAS to be a control for metering.
Let me put on my software and hardware guy hat for a minute:

Mobile phone cameras don't have irises or shutters, so exposure is managed through a combination of changing how long the sensor is allowed to accumulate light after it's been reset and the amount of gain between the signal coming off the sensor and the analog-to-digital converter that digitizes it. (Think of it as an electronic volume knob.) DSLRs work much the same way, but the iris and shutter precisely meter how much light falls on the sensor before it's read.
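That "electronic volume knob" model is easy to put into numbers: total exposure scales with integration time multiplied by gain, and each doubling of either buys one stop. A sketch (the `PhoneExposure` helper is my own illustration of the model described above, not any real camera API):

```java
// Toy exposure model for a shutterless sensor: exposure is proportional
// to (integration time * analog gain); the log2 of the ratio between
// two settings is the difference in stops (EV).
public class PhoneExposure {
    /** Relative exposure in stops versus a reference setting. */
    public static double stopsVersus(double timeSec, double gain,
                                     double refTimeSec, double refGain) {
        double ratio = (timeSec * gain) / (refTimeSec * refGain);
        return Math.log(ratio) / Math.log(2.0); // log base 2
    }
}
```

So doubling the gain at a fixed integration time gives +1 EV, which is exactly the kind of per-shot adjustment an HDR bracket needs; the question in this thread is whether any of it is exposed to apps.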

The phones themselves are built around an integrated circuit (IC, or a "chip") that has the CPU and many of the peripherals built right in. These include everything from the USB port to the keypad interface to the display controller, camera, GPS, audio hardware, etc. One such chip is the Qualcomm MSM7600, which you'll find in my Eris and a bunch of HTC's other products. Part of the chip includes the camera interface. If what's built into the MSM7600 is anything like what's available in standalone camera-on-a-chip ICs (which I surveyed this evening), you connect the sensor and the camera section of the chip takes care of pretty much everything. In other words, the camera is functionally separate: you tell it to send preview to the LCD or to take a picture, and a fully-encoded JPEG gets sent back to the CPU.

We do know that the gain on the MSM's camera can be twiddled by the CPU because the Sense implementation does it. I wouldn't be surprised if the metering point can be changed, too. Android as it stands now has been kind of dumbed down to the lowest common denominator, and that's what we're stuck with. The best we can do is to pester Google to expand the interface a bit and make sure the phone manufacturers know we want the camera controls brought out into Android.

--Mark
 
Upvote 0
I saw one of these on the iPhone. It's a pretty cool app. Did a search in the Market and I did not see anything.

You take a high-contrast and a low-contrast picture and it merges the two together for the HDR photo...

I've just released an app which does something similar. It's not as pronounced as full-on Photomatix-style HDR tone mapping, but it uses similar techniques to bring out detail in single exposures. For most images it improves the picture without it looking processed.

Brand new -- released on Friday -- search for Photo Enhance in Android Marketplace.

Cheers,

Mark
 
Upvote 0
I just tried the Vignette version; it doesn't have any options and you can't zoom in to look at the pic details. I downloaded the free Photo Enhance version and will see how it develops; maybe I'll buy it too. I also own PicSay Pro, and together all these apps give a lot of power to the mobile user :)
 
Upvote 0
