4K HDR videos too dark on non-HDR TV

HDR data is designed for backwards compatibility. TV sets’ video processors do not blindly read and process everything: they look into the data stream and extract what they know. What they do not know about, they don’t process.

Now with HDR there is “auxiliary data” for the “expanded color”. A TV that can handle HDR will see a flag that HDR content will arrive, so it will look for it in the data stream and process it. A TV set that cannot handle HDR simply doesn’t know about the flag. For it the bit is “undefined”, hence the “auxiliary data” is not used at all. It never even looks at it. For the non-HDR TV that is just noise.
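To make that concrete, here is a little Python sketch of the skip-what-you-don’t-know behaviour. The type-length-value format here is made up for illustration, not a real HEVC or HDMI parser:

```python
# Made-up type-length-value stream, just to illustrate the backwards
# compatibility idea: walk the blocks, handle the types you know,
# silently skip the rest.

KNOWN_HANDLERS = {
    0x01: "core picture",   # every TV understands this
    0x02: "audio",
}
# An HDR-capable set would additionally register something like:
# KNOWN_HANDLERS[0x10] = "HDR metadata"

def process_stream(blocks):
    for block_type, payload in blocks:
        name = KNOWN_HANDLERS.get(block_type)
        if name is None:
            continue  # "undefined" type: this set never even looks at it
        print(f"processing {name}: {len(payload)} bytes")

# 0x10 carries the HDR auxiliary data; to this decoder it is just noise.
process_stream([(0x01, b"\x00" * 16), (0x10, b"\xff" * 8)])
```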

Now the problem comes in: HDR content is mastered differently. It is not as if the additional data were just an extension and the normal data just like a normal 1080p Blu-ray. The data is different: the “core picture” is usually darker overall (sometimes generally washed out), so that the additional bright color information can pop later. It has to do with the way your TV turns signal values into brightness, without going into too much technical detail here.
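For those who do want a bit of the detail: HDR10 encodes brightness with the SMPTE ST 2084 “PQ” curve, while an SDR set decodes the very same numbers with a plain gamma curve. A small sketch (assuming a 100-nit SDR panel with gamma 2.4) shows how far apart the two land:

```python
# SMPTE ST 2084 (PQ) constants, as published in the spec
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    """PQ EOTF: normalized signal value [0,1] -> intended luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def sdr_gamma_to_nits(signal, peak=100.0, gamma=2.4):
    """What an SDR set makes of the same signal (100-nit peak is an assumption)."""
    return peak * signal ** gamma

for s in (0.25, 0.5, 0.75):
    print(f"signal {s:.2f}: mastered for {pq_to_nits(s):7.1f} nits, "
          f"SDR set shows {sdr_gamma_to_nits(s):5.1f} nits")
```

At signal value 0.75 the content was mastered for roughly 1000 nits, but the SDR set shows about 50: that is the “too dark” picture in a nutshell.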

As a result you usually have to pump the backlight way up on HDR displays. That is one side effect of HDR: for the additional colors the backlight needs to be high, and HDR TVs usually set it to maximum almost automatically when entering HDR mode, especially edge-lit TVs.

HDR Blu-ray players initially had the same problem you are seeing: the picture was very dark, almost unwatchable.

Part of the reason is that there is no real standard yet for HDR on TVs. Every manufacturer does it differently, even within their own product lines. For example, in the 700 USD range you get way less from HDR than in the 1200 USD range. In fact, on the low end I find HDR quite disappointing; it often even looks worse than a good 2K panel.

As said, there is no HDR standard yet for what range a TV is able to display. The HDR label just says “Hey, we can process HDR data on our panel and you get more than 16-235 out of the source”. Think of the HDR label as a “we can do more colorful pictures” badge. This is slowly changing: manufacturers have recently started advertising with color standards more, as consumers are starting to understand that an HDR label on its own is pretty meaningless.

The problem with the too-dark picture changed with the 2nd generation of UHD players. Once more HDR discs arrived, manufacturers saw the need to implement HDR->SDR conversion (tone mapping/“dithering”) in their players, as HDR TVs basically followed no standard. For example, some Sony players have 8+ modes to choose from. The 905x (?) of the Vero is more limited here.
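All those player modes boil down to some tone-mapping curve that squeezes HDR luminance into the SDR range. Here is a sketch with the simplest classic operator (Reinhard); real players use fancier curves (Hable, BT.2390 style) and handle color too, not just brightness:

```python
def reinhard_tonemap(nits, sdr_peak=100.0):
    """Classic Reinhard: compress any luminance into [0, sdr_peak) nits."""
    l = nits / sdr_peak
    return sdr_peak * l / (1.0 + l)

# Highlights get compressed hard, shadows stay almost untouched.
for nits in (10, 100, 500, 1000, 4000):
    print(f"{nits:5d} nits HDR -> {reinhard_tonemap(nits):5.1f} nits SDR")
```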

Now, SDR panels with 10 bit: well, those panels could display more colors already, but there is no information on how to use them. Just having a 10-bit SDR panel is absolutely useless; the TV does not know how to make use of the additional colors.
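For perspective, here is what the extra bits buy in raw numbers: finer steps per channel (less banding), not a wider range by themselves:

```python
# 8 bit vs 10 bit per channel: same signal range, just finer gradations
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits:2d} bit: {levels:4d} steps per channel, "
          f"{levels ** 3:,} colors total")
```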

Yes, such a TV has some post-processing modes you can enable, where it “expands” the normal “core colors”. That usually looks very crappy and distorts the colors too much (it might work for sports or animation though).

If you are getting a generally too-dark picture on a non-HDR set when feeding it from a source that has no advanced HDR->SDR conversion algorithm, your best bet is to play with your TV settings.

Usually you need to pump the backlight way up, and there is often some “Dynamic Color” setting in the menus where the TV does post-processing that will help a bit. The colors will still be off, but the content becomes watchable.

What might also help is to enable/disable/change the “Black Level” option of your TV. This controls how it processes the 16-235 vs. 0-255 color ranges. It might help as well, but it can also result in a very washed-out picture.
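For reference, that option essentially just toggles this conversion (sketched here for the 8-bit luma channel). A mismatch in one direction crushes the blacks, in the other it washes the picture out:

```python
def limited_to_full(y):
    """Expand studio-range luma (16-235) to full range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def full_to_limited(y):
    """Squeeze full-range luma (0-255) into studio range (16-235)."""
    return round(16 + y * 219 / 255)

print(limited_to_full(16), limited_to_full(235))  # -> 0 255 (correct match)
# Expanding a signal that was already full range clips all shadows below 16:
print(limited_to_full(8))                         # -> 0, detail crushed
# Showing limited-range video unexpanded leaves black at code 16,
# i.e. about 6% gray -> the washed-out look:
print(16 / 255)
```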

But in general: don’t play HDR content on a non-HDR display unless you have a source that does more advanced HDR->SDR conversion.

Also, you are not losing much. 4K on its own is pretty pointless given the average viewing distance; you can barely see the difference. People usually think 4K out of the box is way better because of the better display technology of their new 4K TV. Or the new TV is bigger than the old one and they see details they haven’t seen before on the smaller display. And newer displays are better factory-calibrated than their old ones. It usually has nothing to do with the higher resolution itself; the new TV is just better than the old one in general.

But compare top-shelf 4K and 2K displays fed with the same non-HDR source and there is barely a difference. You can see some, usually on very, very slow pans. But that is a matter of motion handling, where especially LED displays are worse than OLED (OLED has other problems though). It’s not a matter of the higher resolution, unless you get really close like in a store.

What you want in general is not more pixels on your TV. For those to matter you either have to move closer to the picture or make the TV bigger. Below 65" 4K is pointless; in fact only from about 85" on will you usually start to see a difference. Sure, better pixels in general are always good, and that is what you get from newer displays. The total resolution itself is not that important.
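You can sanity-check those size claims with simple geometry: the eye resolves roughly one arcminute, and beyond the distance where a single pixel subtends that angle the extra pixels are invisible. A quick sketch (16:9 screens, small-angle approximation):

```python
import math

def max_useful_distance_m(diagonal_inch, horizontal_pixels):
    """Distance at which one pixel subtends 1 arcminute on a 16:9 screen."""
    width_m = diagonal_inch * 0.0254 * 16 / math.hypot(16, 9)
    pixel_m = width_m / horizontal_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_m / one_arcmin

for size in (55, 65, 85):
    d4k = max_useful_distance_m(size, 3840)
    d2k = max_useful_distance_m(size, 1920)
    print(f'{size}": full 4K detail within {d4k:.1f} m, 1080p within {d2k:.1f} m')
```

On a 65" set you would have to sit within roughly 1.3 m to resolve full 4K detail; at a typical couch distance of 2.5-3 m even 1080p is at the limit.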

What people want and need are BETTER pixels, NOT MORE. And this is where 4K with HDR on OLED displays comes in. The OLED technology removes some problems of LEDs, but brings a few new ones.

But naturally, 4K markets easily. Higher numbers sell, just like megapixels on cameras. Since people saw a difference at their viewing distance at home between SD and HD, they now assume it is the same story again. Nah, it isn’t.

4K on its own is pretty pointless for the home; the market only got traction as HDR arrived. But HDR still has issues in general, just like 10 years ago when HD entered the mainstream. Back then we had similar issues with up- and downscaling, various “black level” modes and so on. Same deal this time, but now it’s about colors. Those problems basically went away over the following few years; the same will happen with HDR.

If your non-HDR TV isn’t that old (many are not), there is also a chance the manufacturer will add better processing of HDR data with a firmware update. That means the TV would start to look at the HDR bit, see it, still not process the HDR data, but process the “core color data” differently. Both Samsung and LG have done this for a few top-shelf lines already; in some they even added HDR->SDR conversion. On one set LG even added full HDR support: the video processor and the display were already capable of HDR, so it only required firmware to process the data (but that is a rare exception).

So check for a firmware update for your TV; you might be lucky. And play with your TV settings a bit. Your TV should remember settings for the 4K input independently of the 2K one.

In general, feeding a non-HDR display with HDR data is hit or miss, and there is only so much you can try. And don’t worry about missing out on something if you end up feeding your 4K TV with non-4K material. At your viewing distance you can barely see the difference anyway.

Just wait 1-2 years and get an HDR display once some of the technology has trickled down from the top-shelf sets to the more affordable ones. If you buy BDs, get a new player with HDR->SDR conversion now and you can enjoy the 4K. And since you are using a Vero and probably watch rips, you can store them already for the time when you get an HDR display, or simply re-encode them to non-HDR.
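For the re-encode route, a commonly cited ffmpeg recipe tone-maps HDR10 down to SDR. Take this as a starting point, not gospel: it assumes an ffmpeg build with the zscale filter (libzimg), and the file names, the CRF value and the “hable” operator here are placeholders to taste:

```python
import subprocess

# Commonly cited HDR10 -> SDR tone-mapping filter chain for ffmpeg.
VF = (
    "zscale=t=linear:npl=100,format=gbrpf32le,"   # PQ -> linear light
    "zscale=p=bt709,"                             # map primaries to BT.709
    "tonemap=tonemap=hable:desat=0,"              # compress HDR into SDR range
    "zscale=t=bt709:m=bt709:r=tv,format=yuv420p"  # back to limited-range SDR
)

subprocess.run([
    "ffmpeg", "-i", "input_hdr.mkv",
    "-vf", VF,
    "-c:v", "libx264", "-crf", "18",  # re-encode the video as SDR H.264
    "-c:a", "copy",                   # pass the audio through untouched
    "output_sdr.mkv",
], check=True)
```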
