Is it Real or Is It High Dynamic Range? How Software Is Changing the Way We Look at Photographs

You know how listening to music on a friend’s pricey Bose headphones makes it harder to tolerate your tinny little speakers at home, or watching your favorite show on a high-definition screen spoils you for regular TV? I’m at a moment like that in the way I look at photographs. For the last few weeks, I’ve been playing around with a new computerized technique called high dynamic range (HDR) photography, which can lend a stunning level of brightness, contrast, and detail to digital images. And now every traditional non-HDR image that I see looks flat and dull by comparison.

It’s a dilemma, actually, because the HDR “look” can be peculiar, artificial, even surreal. If you lived in a world where every photograph was made this way, you’d have a constant migraine. But for now, I’m a little bit addicted to HDR. And at the risk of getting you addicted, too, I want to talk this week about how the technique works, what you can do with it, and how it can help all of us question some of the conventions and expectations we’ve built up around the art of photography, and around the related art of looking at photographs.

HDR images are unusual because they don't represent a single moment in time, like most photos, but rather are digital fusions of several images of the same scene, taken at different exposure levels. (In photography, the longer the exposure time, the more light gets captured by a camera's film or digital sensor, and the brighter the resulting image.) To collect raw material for an HDR image, photographers generally take at least three pictures: one that's underexposed, one that's overexposed, and one at a normal exposure. This is called exposure bracketing. The easiest way to explain it is to do a bit of show-and-tell:

1. Normal Exposure

2. Underexposed

3. Overexposed

Digital cameras have come a long way in the last 10 years, but the sensors inside them are still nowhere near as good as the human eye at handling the huge variations in luminance that occur in the natural world. (Photographers call this variation dynamic range.) As you can see from Photo 1 above—the one taken at the standard exposure level that my camera chose automatically—the trees look okay, but the sky is pretty washed out. That’s because the camera, in choosing an exposure that would capture some detail in the hills and leaves, wound up gathering too much light from the much brighter sky above.

The HDR process offers a way to compensate for this technological limitation. If you examine the underexposed image (Photo 2) above, you’ll notice that the landscape is pretty dark, but there’s a lot more detail in the clouds—you can actually see how shapely they are. Conversely, in the overexposed image (Photo 3), the sky is a featureless white blur, but you can see a lot more stuff happening in the trees—detail that was largely lost in the shadows in the normal exposure.
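The fusion logic described above can be sketched in a few lines of code. This is a toy illustration only, with made-up pixel luminance values and a simple Gaussian weighting scheme of my own choosing; real HDR software works on full RGB rasters and applies tone mapping afterward. The core idea is the same, though: for each pixel, trust the exposure where that pixel landed near mid-gray, and discount blown-out or crushed values.

```python
import math

def weight(v, sigma=0.2):
    """Confidence in a pixel value: highest for mid-tones (v near 0.5),
    lowest for blown-out highlights (near 1.0) or crushed shadows (near 0.0)."""
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse(exposures):
    """Blend bracketed exposures pixel-by-pixel, favoring well-exposed values."""
    fused = []
    for pixels in zip(*exposures):  # the same pixel across all exposures
        weights = [weight(v) for v in pixels]
        total = sum(weights)
        fused.append(sum(v * w for v, w in zip(pixels, weights)) / total)
    return fused

# Three bracketed "shots" of the same four pixels (luminance in 0..1):
normal = [0.95, 0.50, 0.10, 0.98]  # sky washed out, trees okay
under  = [0.60, 0.20, 0.02, 0.65]  # cloud detail survives, shadows crushed
over   = [1.00, 0.80, 0.40, 1.00]  # shadow detail survives, sky a white blur

result = fuse([normal, under, over])
```

In `result`, the bright sky pixels get pulled toward the underexposed shot (where the cloud shapes survived), and the dark tree pixels get pulled toward the overexposed shot, which is exactly the compensation the HDR process provides.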


Wade Roush is a freelance science and technology journalist and the producer and host of the podcast Soonish. Follow @soonishpodcast


3 responses to “Is it Real or Is It High Dynamic Range? How Software Is Changing the Way We Look at Photographs”

  1. harmonicat says:

My wife thinks that some of the example photos look like colorized Hollywood movies. But I love the clarity and the experience. I'm just a dad who takes many photos of my kids in all kinds of activities. I'd love to be able to preserve those memories with this kind of crispness — then the photo more closely matches the memory of what I saw through my own eyes.