HDR is a term you come across whenever you purchase a TV, a smartphone, a video streaming device, or a gaming console.

As if HDR were not confusing enough, terms like HDR10, HDR10+, Dolby Vision, and HLG are thrown at you with amazing regularity.

What exactly is HDR, and how does it affect your viewing? To understand the HDR concept and the differences between the formats mentioned above, one has to go back to the basics.

The Concept

The common school of thought says that the more pixels you have, the better the viewing experience. HDR goes beyond this idea: it aims at creating better pixels instead of simply having more pixels.

The focus is on allowing a wider variation in light output levels while also delivering a broader range of colors.

To understand what HDR is, you should have an idea of the Dynamic Range concept. Though this part can get quite technical, it is essential for comparing the various HDR formats.

Dynamic Range

In simple terms, dynamic range is the difference between the lowest and highest values something can produce. In our case, we are discussing light. Thus, dynamic range is the difference between the brightest and the dimmest light your device can produce at a given setting.

The dynamic range for luminance is measured not in absolute but in relative values. It is expressed in stops, with each stop representing a doubling of the light output of the previous stop.

The concept will become clear as we understand Standard Dynamic Range (SDR) and High Dynamic Range (HDR).

What is Standard Dynamic Range?

The typical digital video that you see on your TV and other devices is SDR video. A simple example is the standard HD Blu-ray disc. Several parameters define SDR. Let us discuss them briefly.

Video Format

Generally, HD Blu-ray discs use the H.264 codec, the most commonly used video format globally. SDR uses 8-bit color, with each of the RGB primaries allocated 8 bits of color information. Thus, you have a total of 24 bits of color information, which leads to 16.77 million color combinations (number of colors = 2^n, where n = 24).
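
As a quick illustration, here is a minimal Python calculation of how bit depth translates into the number of displayable colors; the 10-bit and 12-bit figures come up again when we compare the HDR formats below.

```python
# Number of colors a display can represent for a given bit depth per RGB channel.
# Total bits n = 3 * bits per channel, number of colors = 2^n.
for bits_per_channel in (8, 10, 12):
    total_bits = 3 * bits_per_channel
    colors = 2 ** total_bits
    print(f"{bits_per_channel}-bit: 2^{total_bits} = {colors:,} colors")

# 8-bit:  2^24 = 16,777,216     (~16.77 million, SDR)
# 10-bit: 2^30 = 1,073,741,824  (~1.07 billion, HDR10 and HDR10+)
# 12-bit: 2^36 = 68,719,476,736 (~68.7 billion, Dolby Vision)
```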

Color Space

A color space defines the range of colors a system can reproduce out of all the colors visible to the human eye. SDR content is mastered in the Rec.709 color space, which uses a gamma of 2.4.

In contrast, sRGB uses a gamma of 2.2. Gamma (the Electro-Optical Transfer Function, or EOTF, used for SDR) is a mathematical function that converts the incoming electrical signal into displayed light levels.

In simple terms, a change in the gamma value affects how bright the image appears. For instance, lowering the gamma value from 2.4 to 2.2 makes the image brighter.

Similarly, raising the gamma value to 2.8 makes the image appear darker.
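
A minimal sketch of this in Python, treating gamma as a simple power-law transfer function on a normalized signal (real displays apply more processing than this, so the numbers are purely illustrative):

```python
# Simple power-law gamma: relative light output = signal ** gamma.
# Decoding the same mid-grey signal with different gamma values shows why
# a lower gamma looks brighter and a higher gamma looks darker.
signal = 0.5  # normalized electrical signal (0 = black, 1 = white)

for gamma in (2.2, 2.4, 2.8):
    light_output = signal ** gamma
    print(f"gamma {gamma}: relative light output = {light_output:.3f}")

# gamma 2.2 -> 0.218 (brighter image)
# gamma 2.4 -> 0.189
# gamma 2.8 -> 0.144 (darker image)
```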

Maximum Luminance

SDR content is mastered at 100 nits of peak brightness, which ensures that brightness levels do not exceed 100 nits; anything brighter is clipped. SDR's most significant limitation is this capping of luminance at 100 nits.

We have discussed that luminance range is measured in stops. Taking 1 nit as a reference level, there are only about six to seven stops between that level and 100 nits (100 lies between 2^6 and 2^7). Thus, you have low contrast in SDR, as the brightest values can only be around 64 times (2^6) brighter than the reference level; the quick calculation below makes this concrete.
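
A quick way to check the stop arithmetic is to express the ratio of the brightest to the dimmest level as a number of doublings (the 1-nit floor here is just an assumed reference point, and the HDR figures are the ones discussed later in this article):

```python
import math

def stops(brightest_nits: float, dimmest_nits: float) -> float:
    """Number of stops between two luminance levels (each stop is a doubling)."""
    return math.log2(brightest_nits / dimmest_nits)

print(stops(100, 1))          # ~6.6 stops: SDR highlights vs. a nominal 1-nit reference
print(stops(100, 0.01))       # ~13.3 stops: full SDR range down to its 0.01-nit black point
print(stops(10_000, 0.0001))  # ~26.6 stops: the full range the HDR PQ curve can describe
```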

This limited range means you cannot produce lifelike images in SDR. When you step out on a sunny day, the objects around you reflect light at levels of thousands of nits or more.

The sun itself has a luminance of more than a billion nits. Even the human eye cannot take in the entire dynamic range of sunlight, which is why we squint in bright sunlight, while the iris opens up wide in darkness (say, in shadows).

As an SDR display cannot be brighter than 100 nits, it is impossible to display truly lifelike colors in SDR output. This brings us to the HDR concept.

What is High Dynamic Range?

Instead of using gamma, HDR uses a new electro-optical transfer function known as the Perceptual Quantizer (PQ).

This function raises the peak luminance value to as high as 10,000 nits, whereas SDR caps it at 100 nits. That is 100 times the SDR limit.

At the other end of the scale, HDR has a lower black point of 0.0001 nits, compared to 0.01 nits for SDR. Therefore, HDR can distinguish shades up to 100 times darker than the darkest SDR content, giving a contrast range of 100,000,000:1 versus SDR's 10,000:1.

In simple terms, we can describe HDR as the ability to distinguish comfortably between the darkest blacks and the brightest whites. HDR thus has a massive dynamic range compared to SDR, which makes images more realistic.
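
For the curious, here is a compact Python sketch of the PQ curve (the SMPTE ST 2084 EOTF that HDR10, HDR10+, and Dolby Vision build on), mapping a normalized signal value to absolute luminance in nits; the constants are the ones defined by that standard.

```python
# PQ (SMPTE ST 2084) EOTF: normalized signal value (0-1) -> absolute luminance in nits.
M1 = 2610 / 16384         # 0.1593017578125
M2 = 2523 / 4096 * 128    # 78.84375
C1 = 3424 / 4096          # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ code value (0.0-1.0) into luminance in cd/m2 (nits)."""
    e = signal ** (1 / M2)
    return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(0.0))  # 0 nits: absolute black
print(pq_eotf(0.5))  # ~92 nits: roughly SDR reference-white territory
print(pq_eotf(1.0))  # 10,000 nits: the ceiling of the PQ curve
```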

What do the higher limits of HDR signify?

One should also understand that 10,000 nits of brightness is still out of reach of existing display technology. The most expensive mastering monitors today top out at around 4,000 nits, and most TVs available today cannot exceed 1,000 nits.

SDR vs HDR

Despite these hardware limitations, the higher HDR ceilings give content creators more freedom when mastering content.

Many people assume that HDR simply makes the picture appear much brighter. That is not the case. HDR allows more flexibility when it comes to creating content.

For instance, when shooting a scene in a dark environment, the photographer has the freedom to place a lamp at brightness levels of 800 nits or even 2,500 nits if necessary. This creates better contrast and produces lifelike colors.

The content creator is no longer forced to cap brightness levels at 100 nits. HDR also goes hand in hand with Wide Color Gamut (WCG): while WCG is optional for SDR, it is very much an essential ingredient of HDR content.

Though most of the colors in real-world content still fall within the Rec.709 color space, HDR offers the option to venture into the wider DCI-P3 and Rec.2020 color spaces (DCI-P3 sits between Rec.709 and Rec.2020 in coverage).

While SDR offers 8-bit color, HDR provides a higher bit depth of 10 bits. The additional bits for each of the RGB primaries result in less color banding and smoother gradation. Thus, HDR can display up to 1.07 billion color combinations (2^30).

Now that we have understood the SDR and HDR concepts, let us discuss the different HDR formats and the differences between them.

Different HDR Formats

HDR has several formats, of which four are the most relevant: HDR10, HDR10+, Dolby Vision, and Hybrid Log-Gamma (HLG).

HDR10

HDR10 is the baseline HDR format today: unless stated otherwise, HDR content or an HDR device generally means HDR10, and because of its widespread adoption, almost every HDR-enabled device supports it.

Just as HD Blu-rays carry SDR, UHD Blu-rays carry HDR10. HDR10 usually serves as the base layer that provides backward compatibility. Almost all streaming services, such as Netflix, Amazon, YouTube, and Disney+, offer HDR10 as the default format.

The most significant advantage of HDR10 is that it is royalty-free. It is a strong format supporting luminance values up to 1,000 nits with Rec.2020 WCG support. It offers 10-bit color depth, which is where the name HDR10 comes from.

HDR10 has two limitations: the 1,000-nit luminance ceiling and the use of static metadata.

Metadata is like an instruction manual that includes information such as the Maximum Frame-Average Light Level (MaxFALL) and the Maximum Content Light Level (MaxCLL). Your TV or receiving display uses this information to adjust brightness levels for the content.

As HDR10 uses static metadata, these values remain constant for the entire runtime. This can produce inconsistent images, where some scenes are not as bright as they could be, while others are brighter than they should be.
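
As a rough sketch of what static metadata looks like (the field names here are simplified for illustration, loosely following the mastering-display values plus MaxCLL/MaxFALL that HDR10 carries):

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Simplified sketch: one set of values describes the entire title."""
    max_content_light_level: int        # MaxCLL, nits - brightest single pixel in the title
    max_frame_average_light_level: int  # MaxFALL, nits - brightest frame average in the title
    mastering_display_peak: float       # nits - peak luminance of the mastering display
    mastering_display_black: float      # nits - black level of the mastering display

# Because the block is static, the TV applies the same tone-mapping assumptions
# to a dim night scene and a bright daylight scene alike.
movie = HDR10StaticMetadata(
    max_content_light_level=1000,
    max_frame_average_light_level=400,
    mastering_display_peak=1000.0,
    mastering_display_black=0.005,
)
print(movie)
```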

HDR10+

While HDR10+ is similar to HDR10 in many respects, it resolves HDR10's limitations concerning the luminance ceiling and the metadata. HDR10+ increases the maximum luminance value to 10,000 nits and supports dynamic metadata.

Samsung pioneered the HDR10+ technology, with Panasonic and 20th Century Fox joining it later.

Dynamic metadata allows content creators to include brightness information on a frame-by-frame or scene-by-scene basis. Thus, your television can adjust its output to match the content and reproduce images as close to reality as possible.

There is no question of clipping the luminance value at 1,000 nits, because HDR10+ allows the freedom to go further, even up to 10,000 nits.
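
A toy comparison of the two approaches (the structures and numbers below are hypothetical, purely to show why per-scene values help):

```python
# Static metadata: one worst-case brightness value covers the whole film.
static_metadata = {"max_luminance": 4000}  # nits, applied to every scene

# Dynamic metadata (HDR10+ / Dolby Vision): each scene carries its own values.
dynamic_metadata = [
    {"scene": "night interior", "max_luminance": 350},
    {"scene": "sunlit desert",  "max_luminance": 2500},
    {"scene": "campfire",       "max_luminance": 800},
]

TV_PEAK_NITS = 1000  # assumed peak brightness of the display

for scene in dynamic_metadata:
    # With per-scene values, the TV only compresses highlights when a scene
    # actually exceeds its capabilities, instead of always compressing against
    # the single worst-case value in the static block.
    needs_tone_mapping = scene["max_luminance"] > TV_PEAK_NITS
    print(f"{scene['scene']}: tone mapping needed -> {needs_tone_mapping}")
```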

The advantage of HDR10+ content is that it is backward compatible with HDR10 devices. Performance-wise, HDR10+ should be better than HDR10. Devices that do not support HDR10+ simply ignore the dynamic portions of the metadata, so watching HDR10+ content on an HDR10-only device is more or less like watching an HDR10 video.

While HDR10+ itself is royalty-free, manufacturers of HDR10+-compatible devices have to pay an annual administration fee ranging from $2,500 to around $10,000, depending on the product.

Compared to HDR10, adoption of HDR10+ by content creators, service providers, and OEMs has been limited. The primary reason is that HDR10+ is largely a proprietary effort, with the founding companies and the members of the HDR10+ alliance (such as Amazon) pushing for its inclusion. Even Panasonic has started offering Dolby Vision on its latest TVs, and 20th Century Fox is releasing its latest UHD Blu-ray discs in Dolby Vision.

This brings our discussion to another HDR format, Dolby Vision.

Dolby Vision

Dolby Vision is similar to HDR10 and HDR10+ in that it is based on the Perceptual Quantizer EOTF and is capable of describing luminance up to 10,000 nits. It differs from HDR10 and HDR10+ in supporting 12-bit color on top of the Rec.2020 color space.

Dolby Vision existed before HDR10+ and was the first HDR format to support dynamic metadata.

Hence, it allows frame-by-frame and scene-by-scene tone mapping. As it supports 12-bit color, it can display a maximum of 68.7 billion colors (2^36).

 

[Image: the same scene shown without Dolby Vision and with Dolby Vision]

OEMs and studios adopt Dolby Vision by paying licensing fees to Dolby. The royalty is in the range of $3 per TV manufactured and varies with the product.

While the amount looks small, you have to consider the millions of units a company manufactures. Hence, the licensing fees can run into millions of dollars.

While HDR10 remains the default HDR format, UHD Blu-ray discs with Dolby Vision support carry an HDR10 base layer with a distinct Dolby Vision metadata layer.

Almost all the top television manufacturers except Samsung offer their high-end TVs with Dolby Vision support. Even Apple devices come with Dolby Vision support.

While streaming services have no issues with Dolby Vision, film studios face specific constraints when including Dolby Vision on their UHD releases.

The UHD Alliance requires film studios to include an HDR10 base layer alongside the Dolby Vision metadata layer to ensure backward compatibility with older TVs that do not support Dolby Vision.

Some studios like Disney have come up with a unique strategy by releasing standard HDR10 content for physical media and reserving Dolby Vision for their online streaming.

Dolby Vision comes in two modes: the standard mode, also known as TV-led Dolby Vision, and the low-latency mode, referred to as player-led Dolby Vision. The difference between these modes is subtle.

In TV-led mode, the processing and tone mapping happen on the television itself. In contrast, player-led mode does the mapping on the player. As a result, TV-led mode delivers better images at the cost of increased latency, while player-led mode produces images with lower latency.

Almost all TV manufacturers except Sony support the standard TV-led mode, whereas Sony TVs support the low-latency mode. Gaming consoles like the Xbox One X/S and Xbox Series X/S also support the low-latency mode.

Hybrid Log-Gamma (HLG)

HLG is a standard HDR format used by broadcasters to deliver HDR pictures without closing the door on SDR technology. HLG was introduced by the BBC and NHK, the Japanese public broadcaster, for broadcasting HDR content in a backward-compatible format that can also be viewed on non-HDR devices.

Some big-budget nature programmes, such as the BBC's Planet Earth II, are produced in HLG.

HLG uses a unique non-linear transfer function in which the lower half of the signal range follows a gamma curve and the upper portion follows a logarithmic curve. This ensures that the video can be displayed on any television, whether or not it supports HDR.

When you play such content on a standard television, it interprets the gamma portion of the curve and displays an SDR picture. If you play the same content on an HDR TV, it also reads the logarithmic part of the curve and produces the full HDR effect.
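
A minimal Python sketch of the HLG curve (the OETF published in ARIB STD-B67 / ITU-R BT.2100, with scene light and signal both normalized to 0–1) makes the two halves visible:

```python
import math

# HLG constants from ARIB STD-B67 / ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # ~0.55991073

def hlg_oetf(scene_light: float) -> float:
    """Map normalized scene light (0-1) to a normalized HLG signal value (0-1)."""
    if scene_light <= 1 / 12:
        # Lower portion: square-root (gamma-like) segment an SDR display can interpret.
        return math.sqrt(3 * scene_light)
    # Upper portion: logarithmic segment carrying the extra HDR highlight detail.
    return A * math.log(12 * scene_light - B) + C

print(hlg_oetf(1 / 12))  # 0.5  - where the curve hands over to the log segment
print(hlg_oetf(1.0))     # ~1.0 - peak scene light maps to the top of the signal range
```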

The advantage of HLG is that it is royalty-free. However, the format is restricted to selected TV broadcasting channels alone.

The table below summarizes the differences between HDR10, HDR10+, and Dolby Vision.

| Parameters | HDR10 | HDR10+ | Dolby Vision |
| --- | --- | --- | --- |
| Format / Standard | Open standard for HDR | Royalty-free standard for HDR | Proprietary standard for HDR by Dolby |
| Bit Depth | 10-bit (1.07 billion colors) | 10-bit (1.07 billion colors) | 12-bit (68.7 billion colors) |
| Peak Luminance | Mastered up to 1,000 nits; supports up to 10,000 nits | Mastered up to 4,000 nits; supports up to 10,000 nits | Mastered up to 4,000 nits; supports up to 10,000 nits |
| Tone Mapping | Tones beyond the TV's range mapped using the PQ transfer function | Tones beyond the TV's range mapped using the PQ transfer function | Tones beyond the TV's range mapped by a Dolby chip using the PQ transfer function |
| Metadata | Static metadata | Dynamic metadata | Dynamic metadata |
| TV Support | Excellent | Limited to Samsung and some Panasonic models | Limited because of high licensing fees |
| Content Availability | Excellent | Limited | Limited at present, but growing |
| Fees | Nil | Administration fee of $2,500 to $10,000 annually | Charged per device manufactured, depending on the product |

Here is the device support comparison for HDR10, HDR10+, and Dolby Vision.

| Devices | HDR10 | HDR10+ | Dolby Vision |
| --- | --- | --- | --- |
| UHD Blu-ray | Yes | Yes | Yes |
| Amazon Prime Video | Yes | Yes | Yes |
| Amazon Fire TV Stick 4K | Yes | Yes | Yes |
| Netflix | Yes | No | Yes |
| Vudu | Yes | No | Yes |
| Apple TV 4K | Yes | No | Yes |
| Chromecast Ultra | Yes | No | Yes |
| Nvidia GTX 900 series and upwards | Yes | No | Yes |
| AMD Radeon RX and upwards | Yes | No | Yes |
| Nvidia Shield | Yes | No | No |

Let us look at some of the top smartphones and gaming consoles and compare the three HDR formats.

| Devices | HDR10 | HDR10+ | Dolby Vision |
| --- | --- | --- | --- |
| iPhone 8 | Yes | No | Yes |
| iPhone X | Yes | No | Yes |
| iPhone XS | Yes | No | Yes |
| LG G6 | Yes | No | Yes |
| Samsung Galaxy S10 Plus | Yes | Yes | No |
| Huawei P20 | Yes | No | No |
| PS4 and PS4 Pro | Yes | No | No |
| Xbox One | Yes | No | Yes |
| Nintendo Switch | No | No | No |
| PC | Yes | No | Yes |

Concluding Thoughts

From this discussion, we can see that Dolby Vision is the most advanced HDR format from a technical viewpoint. However, HDR10 remains the standard format when it comes to content availability and device compatibility.

HDR10+ is a restricted format available mainly on Samsung devices and a few others. HLG is also limited, as it is available only on selected broadcast channels.

Though HDR technology is capable of delivering superb viewing experiences, TV manufacturers and content creators have not yet exploited its full potential. There are limitations, but they can be overcome as the technology improves.

However, HDR content is clearly better than SDR content, and it is only a matter of time before we experience the full benefits of HDR technology.
