What is HDR and why is it used in TVs and monitors?


The number of technologies that affect the quality of the content we watch keeps growing. When you open yet another video, it is not always clear what you are actually seeing: exactly what the author intended to show, or only what your hardware is capable of displaying?

One of the most unpleasant situations is when a user downloads a movie in a new format that is supposed to look better than usual and instead gets a dull, fuzzy picture. At first it is easy to ignore unfamiliar abbreviations such as HEVC, HDR or 10bit. But when the video on the screen turns out dim and tinted in shades of yellow (this color usually dominates), confusion sets in. Most often this happens when trying to play a movie or run a video game in HDR mode on a device that either does not support the technology at all, or supports it but has not been configured properly.

Theory

Thanks to new technologies, video quality today is improved not only by raising the resolution (and thus the pixel count), but also by expanding the color gamut (WCG, Wide Color Gamut) and the dynamic range (HDR, High Dynamic Range). Image quality can also be improved by increasing the frame rate, but the most noticeable gain still comes from high dynamic range (HDR). And, contrary to popular belief, it works equally well on ultra-high-resolution displays (4K and 8K) and on simpler devices (Full HD). At the same time, HDR is in most cases the cause of the problems described in the introduction. Incidentally, on an ordinary monitor they can sometimes be partially fixed simply by updating codecs or switching player software (if we are talking about a PC).

HDR technology is designed to make bright scenes brighter and dark scenes deeper, adding realism to high-contrast scenes, for example when the protagonist steps out of a dark cave onto a forest clearing flooded with bright sunlight. Because most of today's monitors, TVs and gadget screens use 8-bit encoding of colors and brightness levels, they have a relatively narrow dynamic range that cannot render such a scene correctly. Add to this the ability of our eyes to adapt to changing light (recall how, after a few minutes in a room where the lights have suddenly gone out, the eyes gradually begin to see), and the gap between reality and what can be shown on screen becomes even more significant.

Used properly, HDR can increase the saturation and detail of a scene several times over, bringing it closer to what we see in real life. That is, when very dark and very bright objects appear in the frame at the same time, both should look right. Achieving this requires higher peak brightness and greater color depth than are typically used. In many cases a map of the minimum, average and maximum brightness values throughout the film is also needed, allowing the content to be adapted to devices that are physically unable to reproduce the maximum brightness level used in the recording. Although the classical definition of HDR refers to the ratio between maximum and minimum luminance, in the context of video content it also implies a wider color gamut (WCG).

It is important to understand that extended dynamic range (HDR) and high maximum brightness are not the same thing. For example, in a dark cinema a high dynamic range can be achieved even at the 48 nits (cd/m2) of brightness that is standard for many theaters. At home, you can get a merely standard dynamic range on very bright screens of hundreds or even thousands of nits. High brightness, however, makes it possible to work with HDR without having to darken the viewing room significantly. But simply increasing a display's maximum brightness does not automatically extend its dynamic range: without additional processing, such an approach only produces more noise.

In short, HDR video implies greater color depth, higher peak brightness and a wider color gamut than SDR video.

Let's move on to color characteristics. Most modern displays use 8-bit color. The color of each pixel is a mixture of shades of the three primary colors (red, green and blue), which tint the corresponding subpixels. Each primary is encoded with 8 bits, so each subpixel can take one of 256 shades of its primary color (2⁸). As a result, each pixel can display one of 16.7 million colors (256×256×256). If 10 bits are used for encoding, we get 1024 shades of each of the three colors and more than 1.07 billion colors in total. Moving to 12 bits, in turn, gives 4096 shades (2¹²) per primary and a palette of more than 68.7 billion colors. The larger number of bits makes it possible to work with the wider color spaces of the DCI-P3 and BT.2020 standards. The BT.2020 color gamut is recommended as the primary one for most HDR formats.
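To make the arithmetic concrete, here is a minimal illustrative snippet in Python that reproduces the shade and color counts quoted above for 8-, 10- and 12-bit encoding.

# Shades per channel and total colors per pixel for common bit depths.
for bits in (8, 10, 12):
    shades = 2 ** bits      # shades of each primary color
    colors = shades ** 3    # total RGB combinations per pixel
    print(f"{bits}-bit: {shades} shades per channel, "
          f"{colors:,} colors (~{colors / 1e9:.2f} billion)")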

In 2016 the UHD Alliance published basic specifications for certifying a display as HDR-compatible. For an LCD device, peak brightness must be 1000 nits (cd/m2) or more and the black level 0.05 nits or less, which gives a contrast ratio of at least 20,000:1. For OLED displays the figures are 540 and 0.0005 nits respectively, with a contrast ratio of 1,080,000:1. Note that the maximum brightness of most monitors on the market today is only 250-350 nits. Moreover, a peak brightness of 1000 nits usually means that the monitor can sustain that level only for a short time and over a small area of the screen, and even such devices are not very common on the market.
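The contrast figures follow directly from the brightness numbers: dividing peak brightness by black level gives the certified ratios, as this tiny illustrative check shows.

# Contrast ratio implied by the UHD Alliance figures:
# peak brightness divided by black level (both in nits / cd/m2).
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(f"LCD:  {contrast_ratio(1000, 0.05):,.0f}:1")   # 20,000:1
print(f"OLED: {contrast_ratio(540, 0.0005):,.0f}:1")  # 1,080,000:1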

The HDR standards known today are based on two core video-coding technologies: perceptual quantization (perceptual quantizer, PQ) and the hybrid log-gamma transfer function (Hybrid Log-Gamma, HLG). The technical recommendation ITU-R BT.2100, intended for producers and distributors of HDR content, proposes using Full HD 1080p or UHD 4K resolution, a color depth of 10 or 12 bits, the HLG or PQ transfer function, and a wide color gamut (WCG) with support for the BT.2020 color space. The document also describes a reference viewing environment for HDR programs: in particular, ambient brightness should not exceed 5 cd/m2 (nits) and no direct light should fall on the screen. In other words, HDR content is best watched in a dark room.

For displays that cannot reach the required peak brightness or reproduce the desired color gamut, the HDR standards provide metadata.


PQ and PQ10

PQ, or perceptual quantization, is one of the main methods of creating an HDR signal, developed by Dolby. The technology is described by the Society of Motion Picture and Television Engineers (SMPTE) as SMPTE ST 2084: it supports HDR video with a peak brightness of up to 10,000 nits (in theory) and the BT.2020 color space. PQ is present in all major standards and recommendations related to Ultra HD devices. The PQ10 variant additionally implies support for a 10-bit color depth.
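For readers who want to see what the PQ curve actually does, here is a short Python sketch of the ST 2084 encoding function (constants as published in the standard): it maps absolute luminance in nits to a normalized signal value and shows that most of the signal range is spent on the darker part of the scale.

# Sketch of the SMPTE ST 2084 (PQ) encoding: absolute luminance in
# nits (0..10,000) -> normalized 0..1 signal value.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_encode(luminance_nits: float) -> float:
    y = max(luminance_nits, 0.0) / 10000.0   # normalize to the 10,000-nit peak
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

for nits in (0.1, 1, 10, 100, 1000, 10000):
    print(f"{nits:>7} nits -> signal {pq_encode(nits):.3f}")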

HDR10

This is the most popular HDR format, supported by default in almost all modern HDR-capable TVs and the most widespread among content providers of all kinds. In essence it is a copy of PQ10, but with optional static metadata (SMPTE ST 2086). There are two main parameters: MaxFALL (Maximum Frame Average Light Level), the highest average brightness of any frame, and MaxCLL (Maximum Content Light Level), the maximum brightness of the content; both are set once and applied to the entire video stream. Since the metadata is optional, some producers do not use it at all. HDR10 is an open and free technology that, among other things, is implemented within the popular HEVC video compression format.
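As an illustration of what the two parameters describe, here is a hypothetical sketch that derives MaxCLL and MaxFALL from per-frame pixel luminances; real values are produced by mastering tools from the full decoded video, not from toy lists like this.

# Toy example: each frame is a list of pixel luminances in nits.
def hdr10_static_metadata(frames):
    # MaxCLL: the brightest single pixel anywhere in the content.
    max_cll = max(max(frame) for frame in frames)
    # MaxFALL: the highest per-frame average luminance.
    max_fall = max(sum(frame) / len(frame) for frame in frames)
    return max_cll, max_fall

frames = [
    [80, 90, 100, 120],   # an ordinary indoor scene
    [50, 60, 900, 70],    # a dark scene with one bright highlight
]
max_cll, max_fall = hdr10_static_metadata(frames)
print(f"MaxCLL = {max_cll} nits, MaxFALL = {max_fall} nits")
# -> MaxCLL = 900 nits, MaxFALL = 270.0 nits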

HDR10+

The main difference from the previous format is the presence of dynamic metadata, which lets each frame carry its own unique HDR description. Based on this data, the device decides whether its hardware is capable of displaying a particular scene and, if it is not, applies tone mapping: algorithms that try to bring the scene down to a level the display can handle. With static metadata (HDR10), even a single scene with too wide a dynamic range can reduce the quality of the entire stream, because the scene itself may last, say, 10 minutes, while the tone mapping is applied to the whole stream.

Dynamic metadata makes it possible to apply tone compression in some scenes and to use the full capabilities of the technology in others, which benefits image quality. Displays that do not support HDR10+ simply ignore the additional data and play the video as HDR10. The main HDR10+ parameters are a color depth from 10 bits, brightness up to 10,000 nits and resolution up to 8K (in theory, any). The standard was developed by Samsung, Panasonic and 20th Century Fox and is also known as SMPTE ST 2094-40. Although HDR10+ is positioned as a free technology, commercial use requires an annual fixed fee of $2,500 to $10,000, depending on the product category.
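To illustrate the idea of tone mapping described above, here is a deliberately simplified, hypothetical roll-off curve in Python; the curves actually used by HDR10+ and Dolby Vision devices are standardized and considerably more sophisticated.

# Simplified tone-mapping roll-off: values below a "knee" pass through
# unchanged, brighter values are compressed so that the content peak
# lands exactly at the display peak.
def tone_map(nits, content_peak, display_peak, knee=0.75):
    if content_peak <= display_peak or nits <= knee * display_peak:
        return min(nits, display_peak)
    start = knee * display_peak
    span_in = content_peak - start
    span_out = display_peak - start
    return start + (nits - start) / span_in * span_out

# With static metadata the same content_peak (MaxCLL) is used for the
# whole film; with dynamic metadata it can be set per scene, so dim
# scenes are not compressed because of one very bright scene elsewhere.
print(tone_map(4000, content_peak=4000, display_peak=1000))  # 1000.0
print(tone_map(500,  content_peak=4000, display_peak=1000))  # 500.0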

Dolby Vision

This is another HDR variant from Dolby, also based on PQ (ST 2084); it supports brightness up to 10,000 nits, resolutions up to 8K and a color depth of up to 12 bits. Devices that support Dolby Vision can decode both static metadata (ST 2086) and Dolby's own dynamic metadata described in the SMPTE ST 2094-10 specification. It is a paid standard: royalties to Dolby amount to about $3 per device. Dolby Vision offers extra capabilities to content producers, and the format is also used in cinemas.

Hybrid Log-Gamma (HLG)

This is the only existing HDR format based not on PQ but on HLG, a method developed by the broadcasters BBC (UK) and NHK (Japan). The HLG standard is backward compatible with SDR: the same signal can be understood by both SDR and HDR TVs. An SDR device simply discards the part of the HLG stream it does not need, while a set that supports the format decodes the remaining part and significantly extends the dynamic range of the picture.

As with the original PQ, no metadata is used. The only general requirement for a TV to work with HLG is support for the BT.2020 color space. It is an open and free standard, developed with mass broadcasting in mind. For now it is not widely used, although some modern TVs support it.
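To show where the "hybrid" in the name comes from, here is a small Python sketch of the HLG opto-electrical transfer function as specified in BT.2100: the dark part of the range follows a square-root (gamma-like) curve that SDR sets can interpret, while the bright part switches to a logarithmic curve carrying the extra highlight range.

import math

# BT.2100 HLG OETF constants.
A = 0.17883277
B = 1 - 4 * A                     # ~0.28466892
C = 0.5 - A * math.log(4 * A)     # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Normalized scene light e in [0, 1] -> normalized signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # SDR-compatible "gamma" segment
    return A * math.log(12 * e - B) + C      # logarithmic HDR segment

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")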

Advanced HDR

Here we are talking about a family of standards developed with the participation of several large companies, including Philips, STMicroelectronics and Technicolor. The core standard of the series, SL-HDR1, supports static metadata (ST 2086) and dynamic metadata based on ST 2094-20 from Philips and ST 2094-30 from Technicolor. The format provides HDR workflow capabilities for content producers; for the end user it is of little interest.


Marketing confusion

It would seem: can things possibly get more confusing? It turns out they can. In addition to the official names used in technical documents, each company employs at least a few purely marketing terms for its HDR technology. We will not dig into the nuances of this zoo, but it is worth pointing out the general trend.

Mostly these names denote proprietary methods of enhancing SDR content and converting it to HDR. In addition, manufacturers sometimes use such labels to offer the user a choice between entry-level, mid-range and premium equipment. In the context of HDR, the label often indicates the maximum peak brightness of a particular monitor or TV, which can be useful. If you do not want to go into the details, look for the ULTRA HD PREMIUM logo approved by the UHD Alliance for certifying high-quality 4K HDR devices.

Conclusions

The specifications of the HDR standards leave a substantial reserve for the future. The eyes of most people cannot perceive more than about a million colors, so even the 16.7 million color variations of 8-bit encoding are more than enough, let alone the billions available with 10 or 12 bits. Of course, color perception is individual: some people distinguish certain colors better, others different ones. Moreover, some people, owing to a genetic mutation, are able to see up to 100 million colors. But even for them, 10-bit color encoding will be more than sufficient.

As for the theoretical brightness of 10,000 nits, modern TVs are still very far from reaching it in practice. Even professional monitors costing tens of thousands of dollars deliver up to 4,000 nits, while most consumer devices top out at around 1,000 nits. Content mastering for the major HDR formats is likewise done on equipment with a peak brightness of 1,000 to 4,000 nits.

Although peak brightness matters, HDR video cannot be obtained simply by pushing the brightness slider all the way up. Many people are surprised to learn that the brightness of most modern HDR content does not exceed 100-200 nits and reaches relatively high values only in a few scenes. Extended range means that a single scene can contain both fairly dark areas and very bright objects. For example, in a scene where the main character walks through a dark cave with a bright torch, the brightness of the surroundings may not exceed 100-150 nits, while the torch can shine at 1,000 nits or even more, depending on the decisions made by the director during mastering. This is the main difference between SDR and HDR: if you turn up the brightness on an SDR monitor, the whole cave becomes brighter, not the individual objects in it.

It is generally believed that an HDR format with dynamic metadata is the better choice, but today, on a properly calibrated TV, it is very hard to notice a critical difference between HDR10, HDR10+ and Dolby Vision. This is partly due to the way content is produced, with an attempt to please everyone. So even HDR10 with static metadata is a good choice, especially since most HDR content today is released in this format or one compatible with it. If the TV also supports one of the formats with dynamic metadata, even better. In any case, the presence of metadata is highly desirable: most modern films will probably contain HDR scenes that exceed the capabilities of your output device, and the additional information makes processing them easier. As for color, 10 bits is a good compromise; so far there is no objective data in favor of 12-bit encoding.

A high-quality HDR implementation will leave no one indifferent, but such a device will also cost accordingly. Then again, if your wallet allows it, that is a minor concern.


Journal: IT-Expert, No. 11/2020
