Why do you need HDR with Ultra HD?

HDR gets more out of Ultra HD - and places higher demands on the hardware

The difference between the image resolutions Ultra HD (4K, 2160p) and Full HD (1080p) is hardly noticeable to many viewers at a normal viewing distance. But Ultra HD offers another quality improvement: HDR (High Dynamic Range), an increased contrast and color range compared to SDR (Standard Dynamic Range). This technique can be used in addition to the high resolution and produces vivid images. The effect is most obvious in recordings with backlighting or strong differences in brightness. Example: when a summer football match is broadcast on TV, part of the pitch is sometimes in the shade. With SDR, either the sunlit part of the image appears too bright or the shaded part too dark. With HDR, this impression improves significantly, and the colors also look much more vivid. While SDR uses only 8 bits per RGB color channel (i.e. 256 gradations per color), HDR uses at least 10 bits (i.e. 1024 gradations). This allows stronger differences between light and dark as well as finer transitions between them. In a direct comparison of two pictures, it looks as if you had taken off your sunglasses.
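To make the 256 versus 1024 gradations tangible, here is a minimal Python sketch (purely illustrative, not tied to any particular device or signal standard) that quantizes a smooth brightness ramp at 8 and 10 bits and counts the distinct steps that remain:

```python
# Minimal sketch: how many distinct brightness steps 8-bit (SDR) and
# 10-bit (HDR) quantization leave in a smooth grey ramp.
# Pure illustration of the 256-vs-1024 figures from the text.

def quantize(value: float, bits: int) -> int:
    """Map a brightness in [0.0, 1.0] to an integer code value."""
    levels = 2 ** bits            # 256 for 8 bit, 1024 for 10 bit
    return round(value * (levels - 1))

ramp = [i / 9999 for i in range(10000)]   # smooth ramp, 10,000 samples

for bits in (8, 10):
    codes = {quantize(v, bits) for v in ramp}
    print(f"{bits}-bit: {len(codes)} distinct steps, "
          f"step size ~{1 / (2 ** bits - 1):.5f} of full brightness")
```

The finer step size at 10 bits is what makes the smoother gradients and stronger light/dark differences possible without visible banding.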

left without / right with HDR (simulation)

With high dynamic range and 10-bit color depth, it is theoretically possible to use the increased color range according to ITU Recommendation 2020 (also called Rec. 2020 or BT.2020). This covers around 75% of the colors visible to the human eye; the conventional color space used with 8-bit material reaches only about 35%. In practice, displays still lack the necessary brightness combined with an acceptable black level to show this full range.
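The rough 75% versus 35% figures can be made plausible with a small calculation. The sketch below compares the areas of the Rec. 709 and Rec. 2020 color triangles using their standardized primaries; this is only an approximation in CIE 1931 xy coordinates, and published coverage figures are usually computed in other chromaticity spaces, so the exact percentages differ slightly:

```python
# Rough comparison of the Rec. 709 (SDR/8-bit) and Rec. 2020 (HDR) gamuts.
# Uses the published xy chromaticity coordinates of the primaries and the
# shoelace formula for the triangle area; an approximation only.

REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709, a2020 = triangle_area(REC_709), triangle_area(REC_2020)
print(f"Rec. 709 triangle area : {a709:.4f}")
print(f"Rec. 2020 triangle area: {a2020:.4f}")
print(f"Rec. 2020 spans about {a2020 / a709:.1f}x the area of Rec. 709")
```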

Where can you find HDR?

The main sources for HDR material are streaming apps such as Netflix and Amazon Video, as well as computer games. Some smartphones and various cameras also record in Ultra HD with HDR, and there is a corresponding amount of footage on YouTube. More and more Ultra HD Blu-ray discs also come with HDR. 3D, on the other hand, no longer plays a role here.

Not every TV with HDR support offers better picture quality than Ultra HD devices without HDR capability. This is because there are still few HDR TV sets that actually use a 10-bit panel. The maximum brightness for which HDR was developed, and at which the films are mastered, cannot yet be reproduced by consumer devices. As a result, the television has to convert the signal down, and this is not always successful. On cheap HDR televisions, for example, the picture can appear very dark because the screen lacks the necessary peak brightness.
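What this "converting down" means in practice can be sketched with a simple tone-mapping function. The example below is not the algorithm any particular TV uses (manufacturers keep those curves to themselves); it only shows the basic idea of compressing highlights mastered for an assumed 1000-nit reference monitor onto a display that reaches only 400 nits, instead of simply clipping them:

```python
# Simplified tone mapping: squeeze brightness values mastered for a bright
# reference monitor onto a dimmer consumer display. Illustrative only;
# real TVs use far more sophisticated (and proprietary) curves.

MASTER_PEAK_NITS  = 1000.0   # assumed mastering peak brightness
DISPLAY_PEAK_NITS = 400.0    # assumed peak brightness of a cheap HDR TV

def tone_map(nits: float, knee: float = 0.75) -> float:
    """Keep dark and mid tones unchanged, roll off highlights above the knee."""
    knee_nits = knee * DISPLAY_PEAK_NITS
    if nits <= knee_nits:
        return nits                              # shadows and midtones pass through
    # Compress everything between the knee and the mastering peak
    # into the remaining headroom of the display.
    excess = (nits - knee_nits) / (MASTER_PEAK_NITS - knee_nits)
    return knee_nits + (DISPLAY_PEAK_NITS - knee_nits) * min(excess, 1.0) ** 0.5

for level in (50, 200, 300, 500, 1000):
    print(f"{level:5d} nits in the master -> {tone_map(level):6.1f} nits on the display")
```

If a cheap TV instead darkens the whole picture to preserve the highlights, the result is the dim image described above.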

In the simplest case, you get HDR with an Ultra HD TV that supports HDR10 and a streaming app. If the feed comes from an external source, all components up to the display must support the higher HDR bandwidth. In addition, Ultra HD uses the copy protection HDCP 2.2, which the connected devices must also support.

What else is changing with HDR?

Since HDR with its 10-bit color information requires around a quarter more bandwidth than Ultra HD in SDR (10 instead of 8 bits per sample), the data rate to be transmitted also increases. Gross data rates of up to 18 GBit/s are possible via the HDMI interface in the current version 2.0. For computer games, which are often output at a refresh rate of 60 Hz, this is already too little for HDR. The amount of data is therefore reduced by transmitting part of the image information at lower resolution. To do this, the color and brightness information of the image signal is separated and re-encoded in the YUV color format. The image brightness (Y channel) remains unchanged, while the color channels are transmitted at half (YUV 4:2:2) or a quarter of the resolution (YUV 4:2:0). Since the human eye resolves fine color differences less well than differences in brightness, the reduction goes largely unnoticed.
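As a rough plausibility check, the following sketch estimates the video payload for Ultra HD at 60 Hz and 10 bits with the three YUV sampling modes. The numbers deliberately ignore blanking intervals and HDMI line-coding overhead, so the real link rates are somewhat higher, but they show why full 4:4:4 does not fit comfortably into HDMI 2.0 while the subsampled modes do:

```python
# Rough estimate of the active-video data rate for Ultra HD (3840x2160)
# at 60 Hz with 10-bit color and different chroma subsampling modes.
# Blanking intervals and HDMI line-coding overhead are ignored here.

WIDTH, HEIGHT, FPS, BITS = 3840, 2160, 60, 10

# Average number of samples per pixel: luma plus the two chroma channels
SAMPLES_PER_PIXEL = {
    "YUV 4:4:4": 3.0,   # full chroma resolution
    "YUV 4:2:2": 2.0,   # chroma at half horizontal resolution
    "YUV 4:2:0": 1.5,   # chroma at quarter resolution
}

pixels_per_second = WIDTH * HEIGHT * FPS
for mode, samples in SAMPLES_PER_PIXEL.items():
    gbit_per_s = pixels_per_second * samples * BITS / 1e9
    print(f"{mode}: ~{gbit_per_s:.1f} GBit/s video payload")
```

The 4:2:2 result of roughly 10 GBit/s also explains the minimum data rate demanded of intermediate devices below.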

For the transmission of films in HDR, an AV receiver, HDMI splitter or HDMI switch in the signal path must support a data rate of at least 9 GBit/s as well as HDCP 2.2. An HDR-capable television or projector should have an HDMI 2.0 interface.

SOURCE | HDR FORMAT | SAMPLING, FREQUENCY | REMARK
Ultra HD Blu-ray player | HDR10, Dolby Vision | YUV 4:2:0, 24 Hz (up to 60 Hz) | soon also 12-bit
PlayStation 4 Pro | HDR10 | YUV 4:2:2, 60 Hz | no 4K Blu-ray
Xbox One S (Blu-ray) | HDR10, Dolby Vision | YUV 4:2:0, 24 Hz | no 4K games
Sky+ Pro satellite receiver | HDR10, HLG | YUV 4:2:2, 60 Hz |
Amazon Fire TV 4K (until 2017) | no HDR | YUV 4:2:0, 30 Hz |
Amazon Fire TV 4K Ultra HD (from 2017) | HDR10, Dolby Vision | YUV 4:2:2, 60 Hz | also available as a stick
Amazon Video app | HDR10, Dolby Vision | dependent on TV software |
Netflix app | HDR10, Dolby Vision | dependent on TV software |
PC + graphics card | HDR10 | up to 120 Hz | depending on hardware/software

HDR10

HDR10 is the most common HDR signal, both in end devices and in films and games. Every HDR-capable display supports HDR10. It uses a 10-bit resolution for each of the RGB colors, which enables the enlarged color gamut according to ITU Recommendation 2020 (also called Rec. 2020 or BT.2020). HDR10 is an open standard; support is defined in HDMI version 2.0b.

Dolby Vision

Dolby Vision is an extended HDR standard that uses up to 12-bit color depth and adjusts this range dynamically. While with HDR10 (to put it simply) the brightest and darkest values are defined once for the entire film, with Dolby Vision they can be set for each individual scene or frame. The TV must explicitly support Dolby Vision; a license is required for this. Such TV sets are currently only available from LG. Corresponding films are mainly offered via streaming; there are no offers for computer or console games. Since no HDMI transmission standard has yet been defined for such dynamic data, the additional information can so far only be embedded in the HDMI metadata. AV receivers in the signal chain must also be adapted for Dolby Vision, as they do not pass the HDMI signal through unprocessed. External devices with Dolby Vision are still very rare. Dolby Vision is possible on Blu-ray discs; corresponding films are additionally mastered in HDR10, which is pressed onto the Ultra HD Blu-ray disc as a separate layer.
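The difference between static and dynamic metadata can be sketched with a few simple data structures. The field names below are only loosely inspired by the real standards (HDR10 carries static values such as the brightest pixel and the brightest average frame of the whole film, while Dolby Vision and HDR10+ carry per-scene or per-frame data); this is a conceptual sketch, not an implementation of either format:

```python
# Conceptual sketch: static HDR metadata (one set of values for the whole
# film, as in HDR10) versus dynamic metadata (values per scene or frame,
# as in Dolby Vision or HDR10+). Field names are simplified, not from the specs.
from dataclasses import dataclass

@dataclass
class StaticHdrMetadata:
    max_content_light_nits: int        # brightest pixel anywhere in the film
    max_frame_average_nits: int        # brightest average frame in the film

@dataclass
class SceneMetadata:
    scene_start_frame: int
    max_scene_light_nits: int          # brightest pixel within this scene only

# HDR10-style: the TV gets one set of numbers and must tone-map every scene with it.
hdr10_film = StaticHdrMetadata(max_content_light_nits=1000, max_frame_average_nits=400)

# Dolby-Vision-style: each scene tells the TV how bright it actually gets,
# so a dark scene is not compressed as if a 1000-nit highlight could appear.
dynamic_film = [
    SceneMetadata(scene_start_frame=0,    max_scene_light_nits=120),   # night scene
    SceneMetadata(scene_start_frame=2400, max_scene_light_nits=1000),  # sunlit scene
]

print(hdr10_film)
print(dynamic_film)
```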

HLG

HLG (Hybrid Log Gamma) is another HDR standard, used for TV broadcasts. It is used, for example, on the Astra Ultra HD demo channel via satellite. With HLG, the HDR image is transmitted in a way that is backwards compatible with SDR. The advantage: only a single data stream needs to be broadcast to also supply devices without HDR.
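The backwards compatibility comes from the transfer curve that gives HLG its name: the lower part of the signal range follows a conventional gamma-like (square-root) curve that SDR sets interpret sensibly, while highlights are compressed logarithmically. Below is a small sketch of the HLG opto-electrical transfer function as published in ITU-R BT.2100, with the constants copied from the recommendation:

```python
# Sketch of the Hybrid Log-Gamma OETF (ITU-R BT.2100): scene light E in
# [0, 1] is mapped to a signal value E'. Below 1/12 the curve is a plain
# square root (gamma-like, SDR-compatible); above that it is logarithmic.
import math

A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)            # SDR-like part of the curve
    return A * math.log(12.0 * e - B) + C    # logarithmic highlight part

for e in (0.0, 0.01, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
```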

Are new HDMI cables required?

For HDMI cables, on the other hand, there is no new standard. Cables previously certified as "HDMI High Speed with Ethernet" should also handle Ultra HD and HDR. However, these cables have to cope with considerably higher data rates than could be tested at the time of their development. With long cable runs, or if the quality is not optimal, small white dots in the picture or a completely black picture can occur. This can be remedied by higher-quality cables with a larger conductor diameter and better shielding. An HDMI 2.0 repeater (a signal amplifier) can also help with such problems.

Repeater for amplifying long HDMI cable runs


Addendum:

HDR10+

Since 2018 there has been another dynamic HDR format called HDR10+ (also written HDR10 Plus). Like Dolby Vision, it uses variable color and brightness settings, so an optimal contrast can be set for each scene during production. The color depth is limited to 10 bits. On some HDR10-capable playback devices, support for this format can be added via a software update. HDR10+ was developed as an open standard by Amazon Video, 20th Century Fox, Samsung and Panasonic. There is a list of manufacturers that support HDR10+ here. Newly produced 4K films often support all three formats (HDR10, Dolby Vision, HDR10+).