4K HDR videos too dark on non-HDR TV

I have an LG 4K TV with no HDR, and I recently tried out some 4K HDR videos; they are too dark compared to my normal 1080p videos. I am wondering if this is because my TV doesn’t support HDR.

Is there any way I can play 4K videos and not use HDR so that the brightness is normal like my 1080p videos? I would like to enjoy the 4K quality without the darker brightness.

HDR metadata should just be ignored by your TV.

I’ll see if I can add an option to explicitly disable HDR soon and let you know when it’s available to test. However, when I test on non-HDR-compatible equipment I’m not noticing this issue, so I’m not sure it will resolve things for you.

Sam

Thanks for the reply, Sam. I’ve got an LG 4K with no HDR; here’s what the normal 1080p Blu-ray video looks like: Screenshot - fed8081ba2003b7bffb5372dad4816ee - Gyazo

And here is the 4K Blu-ray version: Screenshot - ac9d9cb7b6cd1ddf0858abf631471a8c - Gyazo

I don’t see how I can change any settings to make the HDR version look as bright as the normal 1080p Blu-ray videos.

Hi @Neymar4ever

Some thoughts (but not a guarantee of improving things). Log in via SSH, and run:

sudo -s
echo 1 > /sys/module/am_vecm/parameters/hdr_mode

This will force HDR -> SDR conversion. When playing a video, hdr_mode should remain as 1.
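
You can read the value back to confirm it actually took, e.g.:

cat /sys/module/am_vecm/parameters/hdr_mode    # should report 1 while a 4K HDR clip plays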

You can also force the colour space conversion type and the colour range, where 0 is limited and 1 is full range.

echo 1 > /sys/module/am_vecm/parameters/force_csc_type
echo 1 > /sys/module/am_vecm/parameters/range_control

There are a lot of saturation options too. Not sure if we need to revisit some of these defaults.
These changes are not persistent across a reboot.
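
If you want them re-applied automatically while you experiment, one rough approach (just a sketch, not an official recommendation, and it assumes your install has an executable /etc/rc.local) is to add the echo lines there:

#!/bin/sh
# /etc/rc.local - re-apply the HDR/colour test values at boot (sketch only)
echo 1 > /sys/module/am_vecm/parameters/hdr_mode
echo 1 > /sys/module/am_vecm/parameters/force_csc_type
echo 1 > /sys/module/am_vecm/parameters/range_control
exit 0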

Sam

I’m sorry for the late reply Sam; I haven’t had access to my Kodi box for the last couple of days.

I’ve now checked the wiki on how to SSH into OSMC, and after testing those commands I have to say I’m confused.

The command echo 1 > /sys/module/am_vecm/parameters/hdr_mode didn’t change anything. But when I tried 0 instead, as in echo 0 > /sys/module/am_vecm/parameters/hdr_mode, the picture got brighter. But now the colors seem washed out; it’s not as good as the 1080p files.

After entering only those lines, the 1080p files seem to have the same colors as before, so that’s good, nothing changed there. But the 4K files, as I said, look a little washed out when I entered echo 0. It’s almost perfect, but not quite. With echo 1 it’s dark like before.

The other two commands didn’t change anything when I entered them after the hdr_mode command.

So these commands didn’t change anything:
echo 1 > /sys/module/am_vecm/parameters/force_csc_type
echo 1 > /sys/module/am_vecm/parameters/range_control

I took a few more pictures now with the new commands applied.

New HDR command line enabled: Screenshot - f5a52f9e31c2dbb24eae9ab7f42ff6cc - Gyazo
Normal 1080p: Screenshot - ff5bdd81c078a1e70b71815de0f6013d - Gyazo

New HDR command line enabled: Screenshot - 003496ea642de33828dc41d6713d2045 - Gyazo
Normal 1080p: Screenshot - 26baf17bfddd694e8e9df4db11e8e434 - Gyazo

I actually like the 4K picture in the first pair, but as you can see in the second example the HDR one looks washed out.

Bump.

Hi,

Sorry for missing this.

There’s no official specification for dithering HDR (BT2020) to SDR (Rec 709).

So most devices offer a few ‘curves’ (effectively what we have now) which allow you to cycle through some presets.

Make sure your display doesn’t have different contrast / brightness settings at different resolutions. My TV does this, which has caused confusion before. The easiest way to test is to disable Adjust Refresh Rate so that the 4K clip is played at 1080p.

Sam

Thanks for replying Sam.

I tested on an LG 4K non-HDR TV and on a Sony 4K non-HDR TV. Both have the same issue I listed above.

I’m not sure what you mean by disabling the refresh rate option. How would that make my 4K video play at 1080p, since my TV is 4K? I can do a test later and see if it helps with the color issue, but I really need that option turned on, or else the Blu-ray videos look bad because they play at 60 fps when the video itself is 24 fps. I’m sure you know what I mean: playing a 24 fps file on a 60 Hz TV looks bad. The Adjust Refresh Rate option makes the video play smoothly.

I tested with Adjust Refresh Rate disabled and it’s still the same issue. Colors look brighter but washed out when I type in “echo 0 > /sys/module/am_vecm/parameters/hdr_mode”.

I also tested turning the HDMI Ultra HD Deep Colour setting on my LG TV on and off. It didn’t make a difference.

Isn’t it strange that the command “echo 1 > /sys/module/am_vecm/parameters/hdr_mode” doesn’t change anything for me, but when I replace 1 with 0, as in “echo 0 > /sys/module/am_vecm/parameters/hdr_mode”, the video gets brighter? Of course this also makes the colors washed out.

I think there’s been some misunderstanding here.

This was simply for testing purposes. It was in no way suggested as a permanent solution. Some displays will keep different brightness and contrast settings for different video modes. I wanted to verify this wasn’t the case.

Not particularly.

When HDR is activated, Vero 4K will send HDR metadata to your display. If your display doesn’t handle HDR, it’s ignored, which is why you see darker colours. Your display isn’t processing the information Vero 4K is sending.

When you disable HDR output, Vero 4K will do its best to convert BT2020 to Rec 709 accordingly. Unfortunately, as mentioned, this can be hit and miss. You also need to keep in mind that some HDR content is not quite on the mark in how it was mastered. Is everything washed out, or only some HDR films? Is the level of ‘wash out’ consistent?
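
If you want to see exactly what your display is advertising over HDMI, you can inspect the EDID capability nodes the kernel exposes. A rough sketch only - these paths are from memory for the AMLogic HDMI driver and may differ on your kernel:

cat /sys/class/amhdmitx/amhdmitx0/hdr_cap    # HDR support the TV reports in its EDID
cat /sys/class/amhdmitx/amhdmitx0/dc_cap     # deep colour (10/12-bit) modes the TV reports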

We’re improving this, and while there’s no official specification on how to convert, we’re keeping an eye on other brands like Panasonic and LG. An AML guy said yesterday:

There is no real guideline for this, so we are doing our best, but it will improve long term.
Watch 4K HDR on an HDR display if possible; otherwise send the HDR signal and ignore metadata, or choose a preset.
Not much can be improved beyond that, but it will be a problem for other devices too, so either this improves things for everyone or it fixes expectations.

All a bit of a mess.

Sam

That sums it up pretty nicely, I think. Not even talking about different HDR formats here :expressionless::see_no_evil:

Just a question from my side: if HDR output is enabled with an SDR TV connected, would there always be an HDR flag that the TV then ignores? So wouldn’t it be better to always leave HDR output on and let the equipment show either HDR or SDR while ignoring the HDR flag (instead of dithering)?

And that leads me to another question: is there any plan to offer two options in the settings menu: one to manually enable/disable HDR output (and maybe an option to tweak dithering with some profiles), and one to manually enable/disable 10-bit output (overriding any related EDID information)? The second option could be useful for those who have an SDR panel that supports 10-bit input.

HDR data is designed for backwards compatibility. TV sets’ video processors do not blindly read data and process it; they look into the data stream and extract what they know. What they do not know about, they don’t process.

Now, with HDR there is “auxiliary data” for the “expanded color”. A TV that can handle HDR will see a flag that HDR content will arrive, so it will look for that data in the stream and process it. A TV set that cannot handle HDR simply doesn’t know about the flag; for it the bit is “undefined”, hence the “auxiliary data” is not used at all. It never even looks at it. For the non-HDR TV it is just noise.

Now here is where the problem comes in: HDR content is mastered differently. It is not as if the additional data is just an extension and the base data is just like a normal 1080p Blu-ray. The data is different: the “core picture” is usually darker overall (sometimes generally washed out), so that the additional bright color information can pop later. It has to do with the way color works on your TV, without going into technical details here.

As a result you usually have to pump the backlight way up on HDR displays. That is one aspect of HDR: for the additional colors the backlight needs to be high, and on HDR TVs it is usually set almost automatically to maximum when entering HDR mode, especially on edge-lit TVs.

HDR Blu-ray players initially had the same problem you are seeing: the picture was very dark, almost unwatchable.

Part of the reason is that there is no real standard yet for HDR on TVs. Every manufacturer, and even different lines from the same manufacturer, does it differently. For example, in the 700 USD range you get far less out of HDR than in the 1200 USD range. In fact, at the low end I find HDR quite disappointing; it often looks worse than a good 2K panel.

As said, there is no HDR standard yet for what range a TV must be able to display. The HDR label just says “Hey, we can process HDR data on our panel and you get more than 16-235 out of the source”. Think of the HDR label as “We can do more colorful pictures”. This is slowly changing: manufacturers have recently started advertising color standards more, as consumers begin to understand that an HDR label on its own is pretty meaningless.

The problem of an overly dark picture changed with the second generation of UHD players. Once more HDR discs arrived, manufacturers saw the need to implement dithering in their players, as HDR TVs basically followed no standard. For example, some Sony players have 8+ modes to choose from. The 905x (?) in the Vero is more limited here.

Now, SDR panels with 10 bit: well, those panels could already display more colors, but there is no information on how to use that. Just having a 10-bit SDR panel is absolutely useless; the TV does not know how to make use of the additional colors.

Yes, such a TV has some post-processing modes, when you enable them, where it “expands” the normal “core colors”. It usually looks very crappy and distorts the colors too much (it might work for sports or animation though).

If you are getting a generally too-dark picture on a non-HDR set when feeding it from a source that has no advanced HDR -> SDR dithering algorithm, your best bet is to play with your TV settings.

Usually you need to pump the backlight way up, and there is often some “Dynamic Color” setting in the menus where the TV does post-processing that will help a bit. Colors will be off anyway, but the content becomes watchable.

What might also help is to enable/disable/change the “Black Level” option on your TV. This controls how it handles the 16-235 vs. 0-255 color range (for example, if the TV expects full range but receives a limited-range signal, black sits at code 16 instead of 0 and the picture looks lifted and grey). This might help as well, but it can also result in a very washed-out picture.

But in general: don’t play HDR content on a non-HDR display unless you have a source that does more advanced HDR -> SDR conversion. On one LG set (I forget the model) they even added HDR later, as the video processor and the panel were already somewhat capable of it and it only required a firmware update to process the data.

Also, you are not losing much. 4K by itself is pretty pointless given the average viewing distance; you can barely see the difference. People usually think 4K is way better out of the box because of the better display technology in their new 4K TV, or because the new TV is bigger than the old one and they see things they haven’t seen before on smaller displays, or because newer displays are better factory-calibrated than their old ones. Usually it has nothing to do with the higher resolution itself; the new TV is just better than the old one in general.

But compare top-shelf 4K and 2K displays fed with the same non-HDR source and there is barely a difference. You can see some, usually on very, very slow pans, but that is a matter of motion handling, where LED displays in particular are worse than OLED (OLED has other problems, though). It is not a matter of the higher resolution, unless you get really close, like in a store.

What you want in general is not more pixels on your TV; to benefit from more pixels you either have to move closer to the picture or make the TV bigger. Below 65" 4K is pointless; in fact, only from about 85" will you usually start to see a difference. Sure, better pixels are always good in general, and that is what you get from newer displays. The total resolution itself is not that important.

What people want and need is BETTER pixels, NOT MORE. And this is where 4K with HDR on OLED displays comes in. OLED technology removes some problems of LEDs, but brings a few new ones.

But naturally 4K is easy marketing. Higher numbers sell easily, just like megapixels on cameras. Because people saw a difference between SD and HD at their viewing distance at home, they now assume it is the same again. Nah, it isn’t.

4K on its own is pretty pointless for the home; the market only got traction once HDR arrived. But HDR still has issues in general, just like 10 years ago when HD entered the mainstream. Back then we had similar issues with up- and downscaling, various “black level” modes and so on. Same deal this time, but now it’s about colors. Those problems basically went away over the following years; the same will happen with HDR.

If your non-HDR TV isn’t that old - many are not - there is also a chance the manufacturer will add better processing of HDR data via a firmware update. That means the TV would start to look at the HDR bit, see it, still not process the HDR data, but process the “core color data” differently. Both Samsung and LG have already done this for a few top-shelf lines, and in some cases even added HDR->SDR dithering. On one set LG even added HDR support outright, since the video processor and the display were already capable of HDR and it only required firmware to process the data (but that is a rare exception).

So also check for a firmware update for your TV; you might be lucky. And play with your TV settings a bit; your TV should remember settings for 4K resolution independently of 2K resolution.

In general, feeding a non-HDR display with HDR data is hit or miss, and there is only so much you can try. And don’t worry about missing out on anything by feeding your 4K TV non-4K material; at your viewing distance you can barely see the difference anyway.

Just wait 1-2 years and get an HDR display once some of the technology has trickled down from the top-shelf sets to the more affordable ones. If you buy BDs, get a new player with HDR->SDR dithering now and you can enjoy the 4K. As you are using a Vero and probably watch rips, you can store them already for the time when you get an HDR display, or simply re-encode them to non-HDR.


Wow :see_no_evil::joy::+1:t2:

So, different profiles for dithering won’t be possible on the Vero 4K?
And switching 10-bit support on or off won’t be useful, either?

Toggling 10-bit support should not really help, though it’s always worth a shot trying various settings and combinations to mitigate a problem. It never hurts to try.

As said, the “core component” of the picture on a UHD HDR disc is not identical to a BD; it depends a lot on the disc, though. And I think you had the wrong impression that 4K HDR discs are just like normal discs, only with more colors if your TV supports them - you know, similar to DTS-HD, where you can always fall back to the core component.

There’s no magic switch you can just flip.

You cannot add new algorithms to the hardware. You can only leverage what the hardware offers, and there might be enhancements in software to leverage it better. I am not that familiar with what the 905x can do here exactly. Also, it is not as if there is another SoC (System on a Chip) available that already has such stuff implemented.

Over time it will become standard, usually when no one cares anymore. As usual.

As said, experiment a bit with the settings of your TV and you might get something decent out of the HDR content. But you are probably better off remuxing the 4K HDR content and then re-encoding it with the normal color range.
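
If you go the re-encoding route, something along these lines is a reasonable starting point. Just a sketch - it assumes an ffmpeg build with the zscale (libzimg) and tonemap filters, and the file names, CRF and tone-mapping curve are placeholders to adjust to taste:

ffmpeg -i input_uhd_hdr.mkv \
  -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
  -c:v libx265 -crf 18 -preset slow -c:a copy output_sdr.mkv

That tone-maps the BT2020/PQ picture down to Rec 709 SDR, so a non-HDR display gets normal levels again.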

Just play a bit with the settings on your set and see if you can get a decent picture out of 4K HDR. Neither the Vero 4K nor OSMC is to blame here; in fact, OSMC’s HDR playback has improved a lot. This stuff is also quite new, and who knows, there might be a workaround for your LG set. Sam will do his best, of that I am sure.

I’m asking because there are more and more 1080p 10-bit HEVC rips turning up that I might be able to play with 10-bit enabled. It’s slightly off topic here, but related to newer, alternative picture modes that weren’t around before…

Would’ve been too awesome, if it were true.

With sound formats the industry at least got a lot more right than they did with 4K/HDR… that’s my impression. Backwards compatibility would’ve been nice, but well… what do you expect? :see_no_evil::joy:

That’s what I mean… What can the 905x do? Is there anything more than the dithering currently implemented (maybe other profiles, for example), @sam_nazarko?

When it comes to 1080p, HEVC is pointless; in fact, encodes are blurrier. At 1080p, x265 is subpar compared to a decent x264 - well, except mostly for anime. It will get better over time, but as with x264, the x265 implementation of HEVC needs a couple more years before it becomes the new standard for private encodes (no point in re-encoding everything again this time). By that time we’ll all be talking 4K HDR anyway.

At 4K, though, it’s different, as the bitrate is usually high enough that HEVC plays to some of its strengths - but that is relative to the source material.

For 4K, HEVC remuxes will probably be the standard, as there is not much point in re-encoding the discs. A 4K UHD HDR remux is about 4 times the size of a decent 1080p x264 encode (e.g. 40-50GB compared to 10-15GB). Basically a 4K HDR picture at the size of an AVC BD.

At 4K, x264 stinks - and of course for HDR too, as there is nearly no hardware that can handle 10-bit AVC. So HEVC it is.

For 1080p, if you re-encode a BD with x265 so it is smaller than an x264 encode, it is basically the same as using a higher CRF value, and the picture looks exactly like that. The picture gets way too blurry too fast with x265 on HD sources.

But that is not a failure of HEVC; the x265 implementation isn’t that good yet, and the improved algorithms only play to their strengths at much higher resolutions and bitrates. But again, by that time 4K HDR will be the new 1080p. So who cares…

Note on Audio:

When it comes to audio, the history is different. In theatrical projection there is a long history of sound processors taking the information encoded on the reel and bringing it to the speakers, and that evolved when everything went digital. With film, naturally, it was not possible to have separate color channels and process them (i.e. mixing colors with a processor during projection - that’s not how it works).

But there were also drawbacks over the years: you needed 7.1 or however many speakers, and still the sound didn’t come from everywhere, and mixing engineers had to work with that. Anything beyond 5.1 at home, unless it’s a larger room, was quite pointless. In fact, 3.1 is usually enough, with a big-ass center (the most important channel anyway) and a decent sub for the punch. The surround speakers or back center never really gave me anything, as they are used so sparingly, depend so much on the mix, and you rarely have the room for a proper setup.

With Dolby Atmos that finally changed. Now sound editors can say “this sound comes from the top left, goes to the bottom left and then to the front”. This is defined mathematically, and the sound processor maps the sound to the available speakers to generate the effect. Think of it as a “language” for describing panning.

128 tracks (well, technically a 7.1.2 bed plus 118 object tracks) and spatial audio description are a huge advancement.

Though at home it’s naturally not as good as in theaters, as home equipment lacks bandwidth (shitty HDMI, designed by AV engineers and not by network engineers) and processing power.

At home, a spatial substream is added to TrueHD or DD+ - again backwards compatible, naturally. This substream is kind of like a pre-rendered mix, but it is NOT matrix-encoded; it is spatially encoded, including panning metadata. Up to 24.1.10 is possible at home, and speakers form clusters. Best thing in a long time for audio at home - and production tools already derive the home mix from the theatrical mix, so there is no need for a separate sound engineering pass.

I recently listened to a setup at a local AV retailer. I was impressed, and I haven’t been impressed since the days I bought my first decent 5.1 set.

DTS:X is similar and basically adds Auro 3D (meaning sound layering) on top of the Atmos idea, though it is less flexible in the speaker setup.

In the end, the differences between them don’t really matter at home. For home use, the point is mostly the spatial encoding and panning that gets mapped to the available speakers, plus better volume control over individual objects (something matrix encoding always lacked).

So personally I am waiting a bit and then a big upgrade is due:

  • 65" 4K HDR TV - top shelf model (maybe 55" the HDR is what I am interested in).
  • New AV Receiver that acts as a preamp mostly - I route music through dedicated stereo amps.
  • Set of new Speakers, probably just a bunch of small ones adding to my current ones so I can take some advantage of Atmos. That means ceiling speakers and all the stuff, nicely hidden with full wife compatibility.
  • …and if available get a new Vero for sure.

Can’t stress it enough: regardless of some criticism, it’s the best box ever, and the premium price for the support is worth every penny, as I can concentrate on other things in my Kodi setup instead of getting basic features to work. The biggest drawback is still Kodi itself and how it renders stuff, but it’s the only thing that fits on a small box. I used big-ass HTPCs in the past; these days it’s more about storage for me anyway. Yes, it’s not the perfect picture, and a dedicated player will get a better result. But meh, swapping discs and all that - nah, I’ve been over that for almost a decade.

Anyway, sorry for so much off-topic talk. I hope you find a way to get a decent picture out of your non-HDR TV with HDR content.

It depends on the age of the encode. Early encodes were not very good.
However, the same can be said for x264. If you look at the progress from a 2007 encode (macroblocking like hell) to even a 2012 release, it is really astounding. Things are always improving, and it’s always good to stay on top of support for new formats.

The AMLogic S905x SoC features the AMLogic Video Enhancement Engine. This is responsible for HDR processing and it’s why you can get HDR output as well as dithering.

You have microcode updates and kernel updates which can bring a great deal of improvements. There don’t seem to be any defects in S905x silicon, but there were some in older S905 chips which found their way into other devices.

Dithering from HDR to SDR is effectively done by using a LUT.

In the next update, three commits will be included which are relevant:

This will allow you to select between 4 LUTs using:

echo X > /sys/module/am_vecm/parameters/video_lut_swtich
Value X can be 1 to 4.
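
A quick way to compare them is to cycle through the presets over SSH while a 4K HDR clip is playing - a small sketch:

sudo -s
for lut in 1 2 3 4; do
    echo $lut > /sys/module/am_vecm/parameters/video_lut_swtich
    echo "LUT $lut active - check the picture, then press Enter"
    read -r _
done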

It’s probably going to take a bit of time to improve this. We won’t get it “right” because it is not an exact science, but by offering a few options we should be able to satisfy most users in time. This is the approach that other vendors are taking.

Sam


Thanks for the info, Sam. Nice to see ongoing improvement and leveraging what the SoC has to offer. Some different LUTs might help the few people who insist on playing HDR on an SDR set.

Do you have a link to some documentation regarding AML’s VEE? Just curious.

On the HEVC/AVC note: yeah, 10 years ago x264 was horrible for 1080p content. In 2015 I started to re-encode all my DVDs (with a lot of post-processing) and Blu-rays (just re-encodes, sometimes with post-processing if the master has issues, or some simple regrading). Both were stored as remuxes before.

The last time I played with x265 was in late February. It is still not there, is all I’m saying - but sure, it will get there in the future. For 1080p it’s still a no-go from my personal tests and quality/size comparisons. But yeah, maybe in a year or two - and there’s no point in re-encoding HD stuff then anyway, just new stuff. For 4K/HDR it’s a no-brainer - but as said, remuxes from discs are the way to go in the long run there anyway. Time will tell.

I’ve also noticed that Amazon and Netflix screw up SDR more and more. You see a ton of really harsh tone mapping in dark scenes with bright highlights. It looks to me like the filmmakers mastered in HDR and then didn’t bother to master an SDR version for streaming, but instead just ran it through a down-converter. It’s getting more and more obvious; “Star Trek: Discovery” is really bad here.

That’s one reason I personally plan to upgrade to HDR; the current ongoing down-converting is really bad. Thankfully, most movies are still mastered for HD SDR or are available as a proper SDR master, but I guess this will die out soon as well, and what is sold as HD will just be down-conversions.

What I really hate is that UHD BDs do not also contain the SDR master when one exists; instead the industry goes for converting for SDR displays, which will always create some inconsistencies. If color and luminance are important to you, UHD BDs on SDR displays should generally be avoided - you know, blown-out highlights everywhere. And as pointed out, 4K vs. 2K at the average viewing distance doesn’t play much of a role anyway.

We can also argue about an HDR remaster done by an engineer vs. the filmmaker’s vision when it comes to older movies. It can be hit or miss, and of the maybe 25 HDR titles I’ve seen it was usually a miss for older stuff. Though I was impressed with “The Martian” on LG’s newest top-of-the-line set - that one really turned out nice, compared to older movies like “Goodfellas”, “Apollo 13” and “Unforgiven”, where at best you take the Atmos track and mux it onto a 2K BD. But some people do not care about color as long as it “pops”, regardless of how wrong it might look…