Why do you need HDR with Ultra HD?

Henrik Ailland

HDR offers more in Ultra HD – and places higher demands on your equipment

The difference between Ultra HD (4K, 2160p) and Full HD (1080p) is barely noticeable for many viewers at a normal seating distance. But Ultra HD brings an additional increase in quality: HDR (High Dynamic Range). This technology enables an extended contrast and color range compared to SDR (Standard Dynamic Range) and produces more vivid images. HDR shows its advantages particularly in shots with backlighting or strong differences in brightness. For example: when a summer soccer game is broadcast, part of the pitch may be in the shade. While with SDR the sunny areas appear overexposed or the shady areas too dark, HDR ensures a balanced display.

The colors also appear more vivid. While SDR uses only 8 bits per RGB color channel (i.e. 256 brightness levels per color), HDR uses at least 10 bits (i.e. 1,024 gradations). This enables greater differences between light and dark as well as finer transitions. In direct comparison, it looks as if you had just taken off sunglasses.

left without / right with HDR (simulation)
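
To put the figures above in perspective, here is a small Python sketch (names and structure are purely illustrative) that computes the number of gradations per color channel and the total number of displayable colors for 8-bit SDR and 10-bit HDR:

    # Gradations per channel and total colors for SDR (8 bit) vs. HDR (10 bit)
    for name, bits in [("SDR, 8 bit", 8), ("HDR, 10 bit", 10)]:
        levels = 2 ** bits      # brightness levels per RGB channel
        colors = levels ** 3    # combinations of red, green and blue
        print(f"{name}: {levels} levels per channel, {colors:,} displayable colors")

The jump from roughly 16.8 million to about 1.07 billion possible colors is what makes the finer transitions between light and dark visible.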

With high dynamic range and 10-bit color depth, it is theoretically possible to use the enlarged color gamut according to ITU Recommendation 2020 (also called Rec. 2020 or BT.2020). This means that around 75% of the visible colors can be displayed; with the conventional 8-bit color space (Rec. 709) it was only about 35%. In practice, displays still lack the brightness required for this while maintaining an acceptable black level.
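
To make the gamut comparison a little more tangible, the following sketch compares the areas of the Rec. 709 and Rec. 2020 primary triangles in the CIE 1931 xy chromaticity diagram using the shoelace formula. Note that the coverage percentages quoted above refer to the full range of visible colors and depend on the chromaticity diagram used; this simplified sketch only shows the relative size of the two triangles:

    # Chromaticity coordinates (x, y) of the red, green and blue primaries
    REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

    def triangle_area(points):
        # Shoelace formula for the area of a triangle from three (x, y) points
        (x1, y1), (x2, y2), (x3, y3) = points
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

    a709, a2020 = triangle_area(REC_709), triangle_area(REC_2020)
    print(f"Rec. 709 triangle area:  {a709:.4f}")
    print(f"Rec. 2020 triangle area: {a2020:.4f}")
    print(f"Rec. 2020 covers roughly {a2020 / a709:.1f} times the area")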

Where can I find HDR?

The main sources of HDR material are streaming apps such as Netflix and Amazon Video, as well as computer games. Some smartphones and various cameras can also record in Ultra HD with HDR, and there is correspondingly plenty of such footage on YouTube. An increasing number of Ultra HD Blu-ray discs are also equipped with HDR. 3D, on the other hand, no longer plays a role here.

Not every TV with HDR support offers better picture quality than an Ultra HD device without HDR capability. There are still very few HDR TVs that actually use a 10-bit panel. The maximum brightness for which HDR was developed and at which the films are mastered cannot be achieved by most consumer devices. As a result, the TV cannot always convert the image optimally; especially on cheaper models, the picture can appear too dark because the necessary brightness is lacking.

In the simplest case, you use an Ultra HD TV with HDR support and one of its streaming apps for HDR. If the signal comes from an external source, every component up to the display must support the required HDR bandwidth. In addition, Ultra HD uses HDCP 2.2 copy protection, which the connected devices must also support.

What else changes with HDR?

Since HDR carries more color information than Ultra HD in SDR, the data rate to be transmitted also increases. Gross data rates of up to 18 GBit/s are possible via the HDMI interface in the current version 2.0. This is already too little for HDR in 4K computer games, which are often output at a refresh rate of 60 Hz. Therefore, the amount of data is reduced by recalculating part of the image information: the color and brightness information of the image signal is separated and re-encoded into the YUV color format. The image brightness (Y channel) remains unchanged, while the color channels are transmitted at half (YUV 4:2:2) or quarter resolution (YUV 4:2:0). Since the human eye resolves color differences less sharply than brightness differences, this reduction goes largely unnoticed.
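
The effect of bit depth and chroma subsampling on the data rate can be estimated with a few lines of Python. The sketch below only counts the raw video payload for the visible 3840 x 2160 pixels; the real HDMI signal additionally carries blanking intervals and TMDS coding overhead, which is why 10-bit RGB/4:4:4 at 60 Hz already exceeds the 18 GBit/s limit of HDMI 2.0 in practice:

    # Approximate raw video payload for 4K at 60 Hz (active pixels only,
    # ignoring blanking and coding overhead - an illustration, not a spec)
    WIDTH, HEIGHT, FPS = 3840, 2160, 60

    # samples per pixel after chroma subsampling
    SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

    def gbit_per_second(bit_depth, subsampling):
        bits_per_pixel = bit_depth * SAMPLES_PER_PIXEL[subsampling]
        return WIDTH * HEIGHT * FPS * bits_per_pixel / 1e9

    for depth, sub in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2"), (10, "4:2:0")]:
        print(f"{depth} bit {sub}: {gbit_per_second(depth, sub):.1f} GBit/s payload")

At 10 bits with full 4:4:4 sampling the payload alone approaches 15 GBit/s, so once the overhead is added the signal no longer fits into 18 GBit/s - which is exactly why 4:2:2 or 4:2:0 is used.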

To transmit HDR films, make sure that your AV receiver, HDMI splitter or HDMI switch supports a data rate of at least 9 GBit/s - preferably 18 GBit/s - and HDCP 2.2. An HDR-compatible television or projector should have an HDMI 2.0 interface.

Source | Standard | Sampling, frequency | Remark
Ultra-HD Blu-ray player | HDR10, Dolby Vision | YUV 4:2:0, 24 Hz (up to 60 Hz) | –
PlayStation 4 Pro | HDR10 | YUV 4:2:2, 60 Hz | no 4K Blu-ray
PlayStation 5 | HDR10 | RGB/YUV 4:4:4, 120 Hz | –
Xbox One S (Blu-ray) | HDR10, Dolby Vision | YUV 4:2:0, 24 Hz | no 4K games
Xbox Series X | HDR10, Dolby Vision | RGB/YUV 4:4:4, 120 Hz | –
Sky+ Pro satellite receiver | HDR10, HLG | YUV 4:2:2, 60 Hz | –
Amazon Fire TV 4K (until 2017) | no HDR | YUV 4:2:0, 30 Hz | –
Amazon Fire TV 4K Ultra HD (from 2017) | HDR10, Dolby Vision | YUV 4:2:2, 60 Hz | also available as a stick
Amazon Video app | HDR10, Dolby Vision | depending on TV software | –
Netflix app | HDR10, Dolby Vision | depending on TV software | –
PC + graphics card | HDR10 | up to 120 Hz | depending on hardware/software

HDR10

HDR10 is the most common HDR signal - both in end devices and in films and games. Every HDR-capable display supports HDR10. It uses a color depth of 10 bits for each of the RGB channels, which enables the enlarged color gamut according to Rec. 2020 (BT.2020) described above. HDR10 is an open standard; support is specified in HDMI version 2.0b.

Dolby Vision

Dolby Vision is an extended HDR standard that uses up to 12-bit color depth and adjusts this range dynamically. While HDR10 (to put it simply) sets the brightest and darkest values once for the entire film, Dolby Vision allows them to be set for each individual frame. The television must explicitly support Dolby Vision, for which a license is required, and not all manufacturers currently offer such TV sets. Corresponding films are mainly offered via streaming. There is hardly anything on offer for computer games; only the Xbox Series X supports Dolby Vision gaming. AV receivers connected in between must also be adapted for Dolby Vision, as they do not pass the HDMI signal through unprocessed. Dolby Vision support is possible on Blu-ray discs: corresponding films also contain an HDR10 version, and the Dolby Vision data is pressed onto the Ultra HD Blu-ray in an additional layer.
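
The practical difference between static and dynamic metadata can be shown with a deliberately simplified toy example. This is not the actual Dolby Vision algorithm, just an illustration: with one static peak value for the whole film (as with HDR10), every scene is compressed towards the film's overall maximum, while per-scene metadata lets a dark scene keep its full brightness:

    # Toy tone-mapping comparison (illustration only, not the Dolby Vision algorithm)
    DISPLAY_PEAK = 1000.0                        # nits the TV can actually show

    scenes = {"night scene": 400.0, "indoor scene": 900.0, "sunlit scene": 4000.0}
    film_peak = max(scenes.values())             # one value for the entire film

    def tone_map(nits, source_peak):
        # Linearly compress the source range onto the display range (toy model)
        return nits * min(1.0, DISPLAY_PEAK / source_peak)

    for name, scene_peak in scenes.items():
        static  = tone_map(scene_peak, film_peak)   # whole-film (static) metadata
        dynamic = tone_map(scene_peak, scene_peak)  # per-scene (dynamic) metadata
        print(f"{name}: static -> {static:.0f} nits, dynamic -> {dynamic:.0f} nits")

With the single film-wide peak, the night scene ends up at only 100 nits instead of the 400 nits it could actually use - exactly the kind of unnecessarily dark picture that dynamic metadata avoids.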

HLG

HLG (Hybrid Log Gamma) is another HDR standard, used for TV broadcasts. It is used, for example, on the Astra Ultra HD demo channel via satellite. With HLG, the HDR image is transmitted in a way that is backwards compatible with SDR. The advantage: only a single data stream needs to be broadcast, and devices without HDR can still display it.
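
What makes HLG backwards compatible is its hybrid transfer curve: below roughly one twelfth of the signal range it behaves like a conventional, SDR-like gamma curve, above that it switches to a logarithmic curve for the HDR highlights. The sketch below implements the HLG opto-electrical transfer function with the constants published in ITU-R BT.2100, purely as an illustration:

    import math

    # HLG OETF constants from ITU-R BT.2100
    A = 0.17883277
    B = 1 - 4 * A                  # 0.28466892
    C = 0.5 - A * math.log(4 * A)  # 0.55991073

    def hlg_oetf(e):
        # Map normalized scene light e (0..1) to the HLG signal value
        if e <= 1 / 12:
            return math.sqrt(3 * e)          # square-root part, close to SDR gamma
        return A * math.log(12 * e - B) + C  # logarithmic part for highlights

    for e in (0.0, 0.01, 1 / 12, 0.25, 0.5, 1.0):
        print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")

Because the lower part of the curve is so close to a conventional gamma, an SDR television can display the same broadcast signal and merely loses the extra headroom in the highlights.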

HDR10+

Since 2018, there has been another dynamic HDR format called HDR10+ (or HDR10 Plus). Like Dolby Vision, it uses variable color and brightness settings: an optimal contrast value can be set for each scene during production. The color depth is limited to 10 bits. On some HDR10-capable playback devices, support for this format can be added via a software update. HDR10+ was developed as an open standard by Amazon Video, 20th Century Fox, Samsung and Panasonic. A list of manufacturers that support HDR10+ can be found here. Almost all 4K films that support HDR10+ are also offered in Dolby Vision and HDR10. Conversely, there are many films in Dolby Vision for which no HDR10+ version exists.

Are new HDMI cables required?

There is no new standard for HDMI cables. Cables that have previously been certified as "HDMI High Speed with Ethernet" should also be able to handle Ultra HD and HDR. However, these cables have to cope with a considerably higher data rate than could be tested during their development. With long cable runs or less-than-optimal quality, small white dots in the image or a black screen can appear. Higher-quality cables with a larger diameter and better shielding can help. An HDMI repeater, a signal amplifier, can also resolve such problems.

HDMI fiber optic hybrid cables are ideal for longer distances and video resolutions of 4K HDR and higher. The signal is transmitted via optical fibers with very low losses. This means distances of up to 100 m are possible.
