If you don’t know much about the technology, HDR can sound like a boondoggle, abstractly promising “better picture quality” in much the same way 3D TVs once promised immersion. HDR is actually an incredible technology that can accomplish visually spectacular things, but it is also far more complicated than it needs to be. There are a number of confusing HDR certifications, and many more products simply list HDR as a feature without making any further information readily available.
We’re here to tell you what you need to know about HDR, not just so you understand a cool piece of tech, but so you can buy with confidence even when products don’t make it clear what their HDR support actually means.
What Is HDR?
HDR, or high dynamic range, is quite possibly the most significant development in video quality since the jump from SD to HD. The technology allows both video and still images to achieve far greater brightness, contrast, and color accuracy. When you look at HDR content, what draws the eye is the internal contrast: the ratio of white to black onscreen. In effect, darker parts of an image look incredibly dim while brighter regions appear brilliantly lit.
As you may expect, HDR is largely a function of brightness, contrast, and color depth, though the technology gets more complicated with dimming zones and OLED panels.
HDR specifications like HDR10, HDR10+, and Dolby Vision don’t mean much on their own: real-world HDR quality largely comes down to maximum brightness, the labels don’t guarantee any minimum level of performance, and manufacturers don’t always list them anyway.
HDR10 vs. HDR10+ vs. Dolby Vision
Right now the primary version of HDR available on most TV sets is called HDR10, with the 10 referring to the 10 bits per color channel that HDR is capable of displaying. But much like the format wars that have raged since the days of VHS vs. Betamax and Blu-ray vs. HD DVD, this time around there are competing technologies doing everything they can to win your entertainment dollars.
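To put that bit depth in perspective, here’s a quick back-of-the-envelope calculation (a minimal Python sketch; the numbers follow directly from the bit depths themselves, not from any particular TV): 10 bits per channel works out to just over a billion possible colors, versus roughly 16.7 million for the 8-bit video most SDR content uses.

```python
# Shades per color channel and total displayable colors for 8-bit vs. 10-bit video.
for name, bits_per_channel in [("8-bit (typical SDR)", 8), ("10-bit (HDR10)", 10)]:
    shades = 2 ** bits_per_channel     # levels per red/green/blue channel
    total_colors = shades ** 3         # combinations across all three channels
    print(f"{name}: {shades} shades per channel, {total_colors:,} total colors")

# 8-bit (typical SDR): 256 shades per channel, 16,777,216 total colors
# 10-bit (HDR10): 1024 shades per channel, 1,073,741,824 total colors
```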
The main competitor right now is Dolby Vision, which delivers many of the same perks as HDR10 while adding dynamic, scene-by-scene color shaping. While the two technologies are fundamentally similar (providing better contrast between light and dark while enhancing color), Dolby Vision lets cinematographers and colorists in the film industry adjust the level of HDR applied to each scene, rather than having a single static setting carried through the entire show or movie.
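To make that static-versus-dynamic distinction concrete, here’s a deliberately oversimplified sketch. The scene names, brightness values, and the crude proportional-dimming function are all hypothetical illustrations, not how any real decoder tone-maps video, but they show the idea: with one static reference for the whole movie, dimmer scenes get squashed down along with the bright ones, while per-scene values leave them untouched.

```python
# Hypothetical scene peaks in nits (illustrative numbers, not from any real movie).
scenes = {"dark cellar": 120, "overcast street": 600, "desert at noon": 4000}

DISPLAY_PEAK = 1000        # what this particular TV can actually output, in nits
STATIC_MASTER_PEAK = 4000  # single brightness figure carried for the entire film

def tone_map_peak(scene_peak, reference_peak, display_peak=DISPLAY_PEAK):
    """Crude stand-in for tone mapping: if the reference peak exceeds what the display
    can show, dim everything proportionally; otherwise pass the scene through untouched."""
    if reference_peak <= display_peak:
        return scene_peak
    return scene_peak * display_peak / reference_peak

for name, peak in scenes.items():
    static = tone_map_peak(peak, STATIC_MASTER_PEAK)  # one reference value for the whole movie
    dynamic = tone_map_peak(peak, peak)               # per-scene reference, Dolby Vision-style
    print(f"{name}: static -> {static:.0f} nits, dynamic -> {dynamic:.0f} nits")

# dark cellar: static -> 30 nits, dynamic -> 120 nits
# overcast street: static -> 150 nits, dynamic -> 600 nits
# desert at noon: static -> 1000 nits, dynamic -> 1000 nits
```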
Another important distinction is that while sets equipped with Dolby Vision are backwards compatible with HDR10 (and therefore are capable of displaying both types of content), HDR10 sets can only handle HDR10.
That said, a new open standard called HDR10+ is in the works. It brings the same headline benefit as Dolby Vision (dynamic, scene-by-scene coloring) while being a cheaper technology for TV manufacturers to add to their sets.
How Do You View HDR?
Right now, the two primary ways to watch HDR-enabled content are through an Ultra HD 4K Blu-ray player (4K Blu-rays include HDR support by default) or via streaming services like Netflix, Amazon, and Vudu on a compatible device.
Netflix has already found itself embroiled in the middle of the format war, with popular shows like Glow (a very colorful title in its own right) only working with Dolby Vision sets, while other Netflix originals are only displayed in HDR10.
What to Look for in HDR Support
When you’re shopping for a monitor or a TV, many products will simply advertise themselves as supporting HDR, or will maybe mention HDR10, but you’ll have to do a little digging to understand how HDR actually works on a specific screen.
An Amazon search for “hdr monitor” will bring up BenQ’s monitor lineup, and scrolling down to look at the comparison chart is confusing: one monitor supports HDRi, one supports HDR10, and one supports DisplayHDR 400. What do these specs mean? In short, not much.
A trip to BenQ’s website only shows the following:
- the monitor with HDRi support has a maximum brightness of 350 nits
- the monitor with HDR10 support has a maximum brightness of 300 nits with worse dynamic contrast
- and the monitor with DisplayHDR 400 has a maximum brightness of 400 nits with the same dynamic contrast as the HDRi monitor.
As you can see, which HDR specification a manufacturer decides to support doesn’t matter very much, and monitors carrying what seems to be a higher HDR spec don’t necessarily offer better HDR performance.
A good rule of thumb for determining whether a screen will give you a good HDR experience is brightness: look for a maximum brightness of 1,000 nits.
With 1,000 nits, your screen can get bright enough to truly pull off the HDR effect, delivering deep, dark blacks alongside stupendously bright whites. The easiest way to cut corners on an HDR display is to limit the panel’s maximum brightness, which is why panels that manage to reach 1,000 nits will often also carry a meaningful specification like HDR10+ or Dolby Vision.
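As a rough illustration of why peak brightness matters so much, the “HDR effect” is essentially the ratio between the brightest and darkest things a screen can show at once. The panel names and black-level figures below are hypothetical examples, not measurements of any specific product, but the arithmetic shows how quickly a brighter panel pulls ahead:

```python
# Hypothetical panels: (peak brightness, black level), both in nits. Illustrative values only.
panels = {
    "dim 'HDR' monitor": (350, 0.35),
    "1,000-nit HDR TV": (1000, 0.05),
}

for name, (peak_nits, black_nits) in panels.items():
    contrast = peak_nits / black_nits   # brightest vs. darkest the panel can show
    print(f"{name}: roughly {contrast:,.0f}:1 contrast")

# dim 'HDR' monitor: roughly 1,000:1 contrast
# 1,000-nit HDR TV: roughly 20,000:1 contrast
```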
Once you have your HDR needs attended to, make sure to actually enable the effect in your operating system’s display settings if you’re on a computer!
Are you disappointed with how HDR is being marketed? Have you been confused by screens promising HDR but not delivering a noticeable improvement? Let us know in the comments down below!