# High Dynamic Range: What Is It and Why You Should Care



## Todd Anderson (Jul 24, 2009)

One of the more interesting display-related technologies highlighted during CES 2015 is High Dynamic Range (otherwise known as HDR). We’ve heard the term tossed around in the past, but this year’s show marks the first time it has been in the spotlight. Last year, the industry decided to go all-in on a 4K future, leaving us to wonder: Do we really need all of those extra pixels? Can we actually see a difference from our couches? The answer to both questions is a solid “maybe.” After all, the human eye has innate limitations, and boatloads of pixels do not automatically equate to better picture quality. That has left manufacturers scrambling for ways to differentiate their UHD pixels from those found on HD televisions. High Dynamic Range is something that might just tip the scale.










On the simplest of levels, HDR is contrast on steroids. Highlights and whites are significantly brighter and blacks are, well, blacker. Details are more visible within the brightest and darkest scenes. High Dynamic Range also allows for more colors to appear on a screen. Colors that were previously washed out by whites and blacks are given a chance to be expressed within HDR images; it’s shades galore.

A recent article in Variety says that Hollywood has been ho-hum about 4K but is genuinely excited about the image enhancements HDR offers. The publication interviewed Richard Welsh (CEO of Sundog Media Toolkit), who said moviemakers believe HDR can impact the creative process well beyond what extra pixels deliver. The difficulty, Welsh points out, is selling the concept to consumers. In many regards, Welsh is right on target. It’s easy for retailers to tag and identify televisions by no-brainer differentiators such as pixel counts. Identifiers like “4K” or “UHD” are simple indicators that consumers can quickly grasp and apply to a purchase decision. But talk with average consumers about differentiators in current display technologies (local dimming, for example) and their heads spin; they want simple. Is HDR something that consumers will readily grasp…or even care about?

Multiple manufacturers arrived at CES 2015 with HDR loaded on a few models, including Sony, Samsung, Panasonic, and TCL, while LG showed off a prototype HDR OLED display. Not surprisingly, some manufacturers have morphed “High Dynamic Range” into fancy trade names. For example, Samsung refers to its HDR televisions as “SUHDs,” Panasonic calls the tech “Dynamic Range Remaster,” and Sony is labeling it “X-tended Dynamic Range.”

So we have the tech, but what about content? According to Gigaom, Dolby has once again emerged on the content side with its licensed Dolby Vision technology. The Dolby Vision suite of post-production tools allows movie houses to encode HDR video and display it on Dolby Vision certified televisions. The catch, Gigaom says, is that leaning on Dolby adds a licensing fee to the loop. So far Philips, Hisense, TCL, and Toshiba have agreed to terms with Dolby, along with Warner Bros. (movies) and Amazon, Netflix, and Vudu (streaming content). Look for “Edge of Tomorrow,” “The Lego Movie,” and “Into the Storm” to stream in HDR soon. Netflix has promised to produce some content in HDR during 2015 and, according to CNET, has also partnered directly with LG and Sony.

Much like our preview of Panasonic’s UHD Blu-ray player concluded, we’re once again seeing evidence that the arrival of UHD might extend well beyond the inclusion of extra pixels. Striking image improvements might just be reason enough to consider a UHD purchase sooner than most had assumed.

_Image Credit: Samsung_


----------



## JimShaw (Apr 30, 2012)

*I do not have anything to say. I am drooling too much for what is coming*


----------



## rambocommando (Aug 28, 2014)

So does this have any implications for projectors? Is this being accomplished with an increase in the bit depth for color?


----------



## Talley (Dec 8, 2010)

Intel did the same thing.... there was a GHz war on speed, and then they started producing multi-core processors to keep increasing performance.

This is essentially the same. More pixels, more pixels, and now they've gotta bring color and contrast into the picture.

I do a lot of photography, and cameras went through the same thing: megapixel wars, and then you bring in multi-layer Foveon-type sensors that capture light color individually instead of filtering it.

More pixels can end up bringing a better picture. More colors will have a subtle effect. I think more contrast would help about as much as more pixels do.

In the end... you combine it all together and you have a real top-notch product. I hope that's what comes of all of this.

Quantum dots and HDR... BRING IT BABY!


----------



## Todd Anderson (Jul 24, 2009)

rambocommando said:


> So does this have any implications for projectors? Is this being accomplished with an increase in the bit depth for color?


Two good questions.

First off, HDR is really an improvement across brightness, contrast, and color. It's achieved through a brighter picture (in the case of LCD TVs, that means adding more LEDs... for OLED it simply means driving the panel with more energy). To compensate for the elevated brightness, you need to apply enhanced contrast through better local dimming (for LCD TVs). Then there's color... which is being accomplished (as you pointed out) through an increased bit depth and a larger color gamut.

The above is a generalized summary of how it works...but you should get the picture.:T
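To put a rough number on the bit-depth part of that summary, here's a quick back-of-envelope calculation (an illustration of the math only, not any manufacturer's spec): each extra bit per channel doubles the number of shades, and the total color count is the per-channel count cubed for an RGB pixel.

```python
# Shades per channel at a given bit depth, and total RGB colors.
# Illustrative arithmetic only -- real HDR color handling also
# depends on the transfer function and gamut, not just bit depth.
def shades(bits: int) -> int:
    return 2 ** bits

for bits in (8, 10, 12):
    per_channel = shades(bits)
    total = per_channel ** 3
    print(f"{bits}-bit: {per_channel} shades/channel, {total:,} total colors")
```

Going from 8-bit to 10-bit is the jump from 256 to 1,024 shades per channel (about 16.7 million to over a billion colors), which is why banding in gradients is so much less visible.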

Second, yes, I do believe (and I could be off mark, here) that we will eventually see HDR trickle to home projector applications. However, brightness capabilities of current home projectors likely will limit its widespread feasibility for some time. Dolby is currently preparing to deploy HDR in a few select commercial theaters that are running high light output dual Christie 4K Laser Projectors. How that eventually translates to the home environment remains to be seen.


----------



## Todd Anderson (Jul 24, 2009)

Talley said:


> Intel did the same thing.... there was a GHZ war on speed and then they started to produce multi-core processors to further increase.
> 
> This is essentially the same. more pixel more pixel and now they gotta bring the color and contrast into the picture.
> 
> ...


I'm not well versed in the photography world, but I believe HDR in photography and HDR in display technologies are two very different animals. If you run a quick Google search, you'll probably find something describing why.


----------



## Talley (Dec 8, 2010)

Todd Anderson said:


> I'm not well versed in the photography world, but I believe HDR in photography and HDR in display technologies are two very different animals. If you run a quick Google search, you'll probably find something describing why.


Without googling anything, I will say that there is some aftermarket firmware capability on Canon (which is what I shoot with). Under the normal process, the vertical lines and horizontal lines are both processed at the native ISO you request; with the aftermarket firmware, you can call for a separate ISO for vertical and another for horizontal, thus creating a one-shot HDR-style format. So essentially you can boost shadow detail while maintaining your highlights.

Canon cameras at base ISO come with around 12.5 stops of dynamic range; the human eye can detect about 18 stops. I'm not exactly sure how many stops of dynamic range a TV has to offer, but I do know it's not the same as the human eye. With this trick you can bump the dynamic range up to around 14 stops (still not as good as Sony's 15-stop-capable sensor).

The kicker?...

You can use this trick during video too. It's literally called HDR video, since it effectively increases the dynamic range.

Now how this relates to video playback and display technologies, I have no idea. But I'm sure it's essentially the same concept.
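For anyone wondering what those stop counts actually mean: each photographic stop is a doubling of light, so n stops of dynamic range corresponds to a contrast ratio of roughly 2^n : 1. A quick sketch using the (approximate) figures quoted above:

```python
# Each "stop" doubles the light, so dynamic range in stops maps to
# a contrast ratio of 2**stops : 1. Figures below are the rough
# numbers quoted in this thread, for illustration only.
def contrast_ratio(stops: float) -> float:
    return 2 ** stops

figures = [
    ("Canon at base ISO", 12.5),
    ("Dual-ISO firmware trick", 14),
    ("Sony sensor", 15),
    ("Human eye", 18),
]
for label, stops in figures:
    print(f"{label}: ~{stops} stops = about {contrast_ratio(stops):,.0f}:1")
```

So the jump from 12.5 to 14 stops nearly triples the darkest-to-brightest ratio the sensor can hold in one shot, which is why the shadow recovery is so noticeable.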


----------

