You won't get real HDR on a cheap TV - here's why

Every few years a new TV technology arrives, bringing new buzzwords for TV buyers to learn. Over the past year or so we've seen VRR and HLG enter the commercial TV lexicon, each with its own guide you can dive into. In this article, however, we take a step back to talk about HDR, or high dynamic range: a TV technology that's been around for several years, and one that almost every TV (except the very cheapest) claims to support. Let's look at why HDR isn't so easy to get right, and why your budget TV might not deliver the great HDR picture you bought it for.

The basics of backlight control

The main goal of HDR content is to increase the dynamic range of the on-screen image. This happens in two key areas: shadow detail and specular highlight detail. That means more image information in the darkest and brightest parts of a picture: darker corners in dungeons and more realistic, stunning sunsets, perhaps even in the same frame. In an ideal scenario a movie is shot with HDR in mind, although many HDR movies are graded from a non-HDR source. To deliver a great HDR picture, a TV must be able to widen the gap between the brightest and darkest points on screen. This is HDR's playground. Without that headroom, HDR can actually flatten the image, making it look worse than before. That's why local dimming is a staple of any really great HDR TV, and usually missing from any budget set. Affordable LCD TVs use a global backlight: an array of LEDs along the edge of the screen, with a diffuser layer that spreads their light across the entire panel. Any dimming that occurs during a movie's dark scenes therefore affects the whole screen at once.

HDR

SDR vs HDR (Image credit: Dolby) Imagine a scene where half of the image is bright, say a cloudy sky with the sun breaking through, while the other half is dark, with the protagonist hiding behind a rock. The brightness needed by the bright areas means the dark areas can only get so dark. LCD screens use a layer of polarizers to block unwanted backlight, but some light always leaks through, and that's what bursts your dynamic range bubble. That's why you want local dimming, and there are plenty of tiers of it, as Samsung's QLED lineup shows. Samsung's Q70T and below don't feature local dimming, which may be surprising given that the Q70T isn't exactly a cheap TV, coming in at $999 / £799 for the 55-inch size. You need to go for the Q80T and above to get a full-array LED backlight. This divides the backlight into zones of LEDs that can be dimmed independently, greatly improving contrast in scenes that contain both bright and dark areas. The 'compare' tool on Samsung's website suggests that every TV from the mid-range Q80T to the top-end Q950TS matches evenly here, with 'Direct Full Array' backlight control. But there's more to it. Larger and more expensive televisions tend to include more of these backlight zones. Samsung's 55-inch Q80T has 50 local dimming zones, while the 75-inch Q950TS has 488. More, smaller dimming zones are better, because large backlight zones produce a distracting (and contrast-reducing) 'halo' of light around bright objects. OLED TVs are the masters of local dimming, because each pixel is its own light source. The 55-inch LG CX 4K OLED effectively has 8,294,400 zones, although OLEDs don't use the 'zone' terminology.
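To put those zone counts in perspective, here's a quick back-of-the-envelope sketch. The Q80T and Q950TS zone counts are from this article; the 3840 × 2160 panel resolution is the standard 4K UHD figure:

```python
# Per-pixel "dimming zones" of a 4K OLED vs typical full-array LCD zone counts.
width, height = 3840, 2160      # 4K UHD panel resolution
oled_zones = width * height     # every OLED pixel is its own light source
print(oled_zones)               # 8294400

lcd_zones = {"Samsung Q80T (55-inch)": 50, "Samsung Q950TS (75-inch)": 488}
for model, zones in lcd_zones.items():
    # Pixels that share a single backlight zone: the larger this number,
    # the bigger the potential 'halo' around a bright object.
    print(model, zones, "zones,", oled_zones // zones, "pixels per zone")
```

Even 488 zones means tens of thousands of pixels share each patch of backlight, which is why halos shrink but never vanish on full-array LCDs.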

Fundamental limitations of cheap TVs

Toshiba 4K TV

Toshiba UA2B 4K TV (Image credit: Toshiba) So are all the models below the Q80T in Samsung's lineup poor for HDR because they lack local dimming? Not quite. While there are technical limits to a TV like the Samsung Q70T, its HDR credibility is propped up by an excellent native ANSI contrast ratio of around 7000:1 and a decent peak brightness of around 600 nits. Its overall contrast without local dimming is better than that of some TVs with local dimming. Buy a really cheap TV, though, and you'll get neither deep blacks nor high peak brightness. Let's take a closer look at one of the best budget TVs as an example: the Samsung TU7100. We'd recommend this set to just about anyone looking for a low-cost TV. It offers great contrast and rich images of a kind you don't often see at this price. However, with a peak brightness of around 300 nits, it just doesn't have the power to come close to the punch of a true HDR experience. The Samsung TU7100 achieves its high contrast thanks to its VA panel. Many alternative TVs at the price use IPS LCD panels instead, and there you tend to see the opposite problem. While these panels can often scrape past what you might call the baseline brightness required for HDR, when the backlight is at its peak for scenes with ultra-bright sections, darker areas will look somewhat washed out. You end up with an 'HDR' image that has relatively low dynamic range, which should be a contradiction in terms. It shows how HDR as a concept has been watered down to bring it to the mass market. We see the same effect in computer monitors. Lower-end models may carry the VESA DisplayHDR 400 seal of approval, but that only means they can reach 400 nits of peak brightness. That alone isn't enough for a legitimate HDR experience.
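The relationship between contrast and black level is simple arithmetic. This minimal sketch uses the Q70T figures quoted above; the IPS numbers are a hypothetical illustration, not measurements of any specific set:

```python
# A panel's native contrast ratio ties its black level to its peak white:
# black luminance = peak luminance / contrast ratio.
def black_level(peak_nits, contrast_ratio):
    """Black luminance (in nits) implied by a panel's native contrast."""
    return peak_nits / contrast_ratio

# Figures from the article: Q70T, ~600 nits at ~7000:1 native contrast (VA panel).
print(round(black_level(600, 7000), 3))   # 0.086 -- near-black shadows

# A hypothetical budget IPS set: ~300 nits at ~1000:1 native contrast.
print(round(black_level(300, 1000), 3))   # 0.3 -- visibly washed-out blacks
```

A VA panel's blacks sit several times closer to true black, which is why the Q70T can look convincing in HDR despite having no local dimming at all.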

HDR10, HDR10+ and Dolby Vision

Dolby Vision HDR

Dolby Vision HDR (Image credit: Dolby) If you're buying a budget TV, you can also largely ignore the HDR formats listed on a manufacturer's website. If your TV can't do justice to the base HDR10 format, there won't be a noticeable visual difference between it and its dynamic counterparts, HDR10+ and Dolby Vision. It's only on higher-end TVs that you'll really appreciate them. HDR10 has become the baseline HDR standard. This is your HDR meat and potatoes. HDR10+ and Dolby Vision are rivals, both more advanced than HDR10 because they use dynamic metadata. This feeds the TV's processor information about the brightness of each scene so it can optimize the picture more precisely. HDR10 videos use a single, static set of metadata for an entire movie or TV episode. Dynamic metadata often yields slightly better specular highlight and shadow detail in Dolby Vision and HDR10+. There are a few other differences too, but most head-to-head comparisons note that Dolby Vision encodes tend to look a little darker than HDR10+.
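Why does per-scene metadata help? A deliberately simplified sketch: real HDR tone mapping uses nonlinear PQ-based curves and richer metadata than a single peak value, but the linear scaling below shows the core idea of matching the mapping to each scene rather than to the whole film:

```python
# Naive tone mapping: scale scene luminance into a display's peak brightness.
# Static metadata (HDR10-style) maps against one mastering peak for the whole
# film; dynamic metadata (HDR10+/Dolby Vision-style) can map against the
# actual peak of each scene, preserving more usable range in dim scenes.
def tone_map(scene_nits, source_peak, display_peak=600):
    # Linear scaling for illustration; real curves roll off highlights smoothly.
    return [min(n * display_peak / source_peak, display_peak) for n in scene_nits]

dim_scene = [5, 20, 80]                            # a dark scene, in nits
static = tone_map(dim_scene, source_peak=4000)     # whole-film mastering peak
dynamic = tone_map(dim_scene, source_peak=80)      # this scene's own peak
print(static)    # [0.75, 3.0, 12.0] -- detail crushed into near-black
print(dynamic)   # [37.5, 150.0, 600.0] -- scene spread across the full range
```

With one static peak, a dark scene is squeezed into the bottom of the display's range; with a per-scene peak, the same shadows keep their gradations.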

And then there's color

HDR fans may have noticed that we haven't mentioned color yet. Yes, it's an important part of HDR, and it's another area where your results will be heavily influenced by your TV's capabilities and calibration. A good standard dynamic range stream is encoded in 8-bit color. That includes HD streams from Netflix, plus standard Blu-rays and DVDs. HDR10 and HDR10+ use 10-bit color, while Dolby Vision uses 12-bit color. This bit depth determines the number of gradations possible between color tones, but not what you might mean when you casually say "color depth". People generally use that term for the bolder richness of tones. That element of color is determined by the color space, or "gamut". Standard dynamic range broadcasts use Rec. 709, a color space extremely similar to sRGB, the long-standing standard for computer screens and printers.
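The jump in gradations between those bit depths is easy to quantify. A quick sketch, using only the bit depths named above:

```python
# Gradations per color channel, and total RGB combinations, for each bit depth.
for name, bits in [("SDR (8-bit)", 8),
                   ("HDR10 / HDR10+ (10-bit)", 10),
                   ("Dolby Vision (12-bit)", 12)]:
    levels = 2 ** bits       # distinct tones per channel (R, G or B)
    total = levels ** 3      # distinct colors across all three channels
    print(f"{name}: {levels} levels/channel, {total:,} colors")
```

Going from 8-bit to 10-bit quadruples the steps per channel (256 to 1,024), which is what smooths out banding in gradients like skies and sunsets.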


Spider-Man: Into the Spider-Verse (Netflix) (Image credit: Netflix) HDR formats use Rec. 2020. This encompasses a far wider range of color tones than Rec. 709, particularly in the green and red wavelengths. Even our favorite TVs don't come close to filling that gamut. The flagship LG CX OLED covers around 70-76% of Rec. 2020, and a cheap TV can hardly compete. The Samsung TU7100, the set we mentioned above with decent picture quality but too little brightness for HDR, only targets full sRGB/Rec. 709 coverage; Rec. 2020 doesn't even enter the picture. Although it may accept HDR signals, the Samsung TU7100 cannot display the additional color tones HDR unlocks. This can lead to poor results if a TV isn't particularly adept at handling HDR colors beyond what it can actually reproduce. Some sets exhibit HDR color clipping, where you see a flat expanse of a color tone at its most saturated point. You might spot it in an image of a red flower, for example: the subtle textures of natural color become flat patches of a single shade of red. It's similar to blown-out highlights in photography, but with color.
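Color clipping is just saturation values hitting a ceiling. A minimal sketch, where `panel_max` is a hypothetical fraction of the Rec. 2020 red primary a set can actually reproduce (not a measured figure):

```python
# Hard clipping of out-of-gamut saturation: every tone beyond the panel's
# limit collapses to the same value, turning subtle texture into a flat patch.
def clip_channel(values, panel_max=0.8):
    # panel_max: hypothetical ceiling on the red channel, as a fraction of
    # the Rec. 2020 primary. Real sets may roll off rather than hard-clip.
    return [min(v, panel_max) for v in values]

flower_reds = [0.70, 0.78, 0.85, 0.92, 1.00]   # gradations across a red petal
print(clip_channel(flower_reds))  # [0.7, 0.78, 0.8, 0.8, 0.8]
```

The top three gradations, all distinct in the source, land on the same output value: that is the flat patch of red the article describes. Better processors remap (rather than clip) out-of-gamut tones to preserve some of that texture.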

Headroom to new heights

The best HDR formats outstrip even the world's best consumer TVs, like the Panasonic HZ2000, LG CX OLED and Samsung Q95T. But that's largely because these formats were created not just for today's televisions, but for those of the future. Split HDR10+ and Dolby Vision into their basic differences and you get the following:

Dynamic HDR formats

HDR10+
10-bit color
Mastering up to 4,000 nits

Dolby Vision
12-bit color
Mastering up to 10,000 nits

The LG CX OLED manages around 700 to 750 nits of peak brightness, and only when a small part of the screen is lit; that drops below 150 nits when displaying a full screen of white. It supports 10-bit color, not 12-bit. Samsung's Q950TS offers much higher brightness, at 2,000 nits in dynamic mode and 1,300 using the movie preset. But its panel isn't even "real" 10-bit. It's an 8-bit panel that uses a trick called Frame Rate Control, or FRC, to approach 10-bit performance. This rapidly alternates pixels between adjacent tones across frames to emulate the missing tonal gradations. What does this tell us? While home theater fans are quick to cry "fake" HDR in its various flavors, the reality isn't so clear cut. And the HDR we revere today may itself be considered fake HDR in just a few years. For a great HDR experience, look for a TV that supports HDR10+ or Dolby Vision, plus an OLED panel or an LCD with good local dimming. Anything else will disappoint you.
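How does FRC squeeze 10-bit tones out of an 8-bit panel? A simplified sketch of the idea (real FRC uses spatial as well as temporal dithering, and the frame patterns are panel-specific):

```python
# Frame Rate Control (FRC): an 8-bit panel alternates between two adjacent
# 8-bit levels over successive frames, so their time-average approximates an
# intermediate 10-bit tone the panel cannot show natively.
def frc_frames(target_10bit, num_frames=4):
    """8-bit levels to display over num_frames to fake a 10-bit target."""
    low = target_10bit // 4    # nearest 8-bit level at or below the target
    frac = target_10bit % 4    # remainder sets how many frames use level+1
    return [low + 1] * frac + [low] * (num_frames - frac)

frames = frc_frames(514)           # 10-bit 514 sits between 8-bit 128 and 129
print(frames)                      # [129, 129, 128, 128]
print(sum(frames) / len(frames))   # 128.5 -- averages to the in-between tone
```

Your eye integrates the rapid alternation into a single intermediate shade, which is why a well-implemented 8-bit+FRC panel can be hard to tell apart from native 10-bit.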