iOS's auto HDR feature seems really good to me.
I'd love to be able to reliably replicate an effect that good using Nikon raw files.
When I've done HDRs from my camera in Photoshop in the past, I've been less than impressed. But it must be possible.
I can't see how an iPhone sensor could physically have more dynamic range than a dSLR's, so it must be doing some clever manipulation in software. Is it?
So what is it doing?
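My rough understanding is that it captures a burst of frames at different exposures and fuses the best-exposed parts of each, with heavy local tone mapping on top, but I'd welcome a proper explanation. For a point of comparison, a bare-bones exposure fusion on the desktop looks something like this (just a sketch using OpenCV's Mertens merge; the filenames are placeholders for three bracketed frames exported from raw):

```python
import cv2

# Placeholder filenames: three frames of the same scene,
# e.g. exported from bracketed raws
frames = [cv2.imread(p) for p in ["under.tif", "normal.tif", "over.tif"]]

# Mertens exposure fusion: weights every pixel by contrast, saturation
# and well-exposedness, then blends the frames directly.
# No radiance map and no separate tone-mapping step.
fused = cv2.createMergeMertens().process(frames)

# process() returns float32 roughly in [0, 1]; rescale for saving
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

I gather fusion like this tends to look more natural than a full HDR merge plus tone mapping, and is closer in spirit to what the phones do.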
For instance, this picture:
https://www.flickr.com/photos/147214244@N02/50613197171/in/album-72157716922662538/ is taken into the sunset, but the grass in the foreground is green.
If you were taking that picture with a dSLR:
How wide a bracket would you use: -2/0/+2, -3/0/+3, or maybe even more than 3 stops either side?
What software would you use for the HDR merge? Any recommendations for something that handles it better than Photoshop? (There's a sketch of what the merge step computes just after these questions.)
What other post-processing would you do?
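For context on what that merge step actually computes, here's a sketch of the classic radiance-map pipeline in OpenCV, assuming a -3/0/+3 bracket shot around 1/125s (the shutter speeds and filenames are invented for the example):

```python
import cv2
import numpy as np

# Assumed: a -3/0/+3 EV bracket around 1/125s, exported to TIFF first
frames = [cv2.imread(p) for p in ["minus3.tif", "zero.tif", "plus3.tif"]]
times = np.array([1/1000, 1/125, 1/15], dtype=np.float32)  # shutter speeds, seconds

# Estimate the camera response curve, then merge into a float radiance map
response = cv2.createCalibrateDebevec().process(frames, times)
hdr = cv2.createMergeDebevec().process(frames, times, response)

# Tone-map the radiance map back into a displayable 8-bit image (gamma 2.2)
ldr = cv2.createTonemapReinhard(2.2).process(hdr)
cv2.imwrite("merged.jpg", (ldr * 255).clip(0, 255).astype("uint8"))
```

The tone-mapping step is where most of the "HDR look" gets decided, which is presumably why results vary so much between tools.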
This one
https://www.flickr.com/photos/147214244@N02/50613306807/in/album-72157716922662538/ exhibits a similar effect, but not quite to the same degree,
and this one
https://www.flickr.com/photos/147214244@N02/50613198381/in/album-72157716922662538/ pretty much looked that dramatic to the naked eye (heavy but broken cloud allowing the ground to be lit)
A valid point is "why not just use the iPhone if you want photos like that?", to which there are several answers:
1. I do, for some.
2. I like tweaking the settings.
3. I like using different lenses with different zoom ranges.
4. The dSLR image has more pixels, so it prints better (I think).
5. I want the learning experience.