4K Metadata

Hi. I use my Vero 4K with a Lumagen processor, which has the ability to adjust its settings based on the nit level the source is mastered at, i.e. it will adjust the settings for a movie mastered at 1000 nits vs 2000 vs 10000. Can anyone advise if this metadata is passed from the Vero 4K as standard so my Lumagen can read it?

I am convinced the picture quality from the Vero 4K differs from that of my Panny UB900 when playing the same movies, and I wondered if this might be a contributing factor.

Thanks in advance

You will get luminance data, but not MaxCLL/MaxFALL at this time.

Thanks Sam. 'At this time' implying it is an inbound enhancement?

No - we have no firm ETA for any improvements at this time.

Hi Sam,

Just a question then please.

I have just been reading a very brief article that mentioned MaxCLL/MaxFALL.

What effect does not using these figures have on the picture?

Does it mean it doesn't get peaks of light and dark quite right?

As you can tell the article didn't have much detail on MaxCLL/MaxFALL but I was interested :slightly_smiling_face:

Thanks
-Mat

Well, depends on the display.

The static metadata of HDR (MaxCLL/MaxFALL are just parts of it) MAY be used to control color volume transformation to get closer to the intent of the mastering display.

There is also ST 2086 in the metadata, which defines the three primaries, white point and min/max luminance of the mastering display.
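To make that a bit more concrete, this is roughly what the HDR10 static metadata carries. A minimal Python sketch; the class layout and example numbers are purely illustrative, not taken from any real player or disc:

```python
from dataclasses import dataclass

@dataclass
class MasteringDisplayMetadata:
    """SMPTE ST 2086 mastering display color volume (illustrative layout)."""
    red_xy: tuple          # (x, y) chromaticity of the red primary
    green_xy: tuple
    blue_xy: tuple
    white_point_xy: tuple
    min_luminance: float   # cd/m2 (nits)
    max_luminance: float   # cd/m2 (nits)

@dataclass
class ContentLightLevel:
    """HDR10 content light level info, i.e. the MaxCLL/MaxFALL pair."""
    max_cll: int   # brightest single pixel anywhere in the stream, in nits
    max_fall: int  # highest frame-average light level, in nits

# Example values for a title mastered on a 1000-nit BT.2020 display
# (made-up numbers for illustration only):
st2086 = MasteringDisplayMetadata(
    red_xy=(0.708, 0.292),
    green_xy=(0.170, 0.797),
    blue_xy=(0.131, 0.046),
    white_point_xy=(0.3127, 0.3290),   # D65 white point
    min_luminance=0.0001,
    max_luminance=1000.0,
)
cll = ContentLightLevel(max_cll=1000, max_fall=400)
```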

As displays differ so much in what they can actually display, a lot of "black magic" (read: 3D LUTs) happens on the player and/or display side. Or none at all on the lower end...

In any case, if your display's capabilities are below a stream's capabilities, some color transformation needs to take place. If a display is beyond them, it just doesn't care. Most manufacturers do their own stream processing anyway, based on their own analysis of the picture, as the metadata is too generic. It's more of a "hey, use preset X" kind of thing.

Consumer HDR displays have varying peak brightness, black levels and color gamut. So it is up to the display or player to map the color volume of the stream to the display as well as possible.

If this is not happening for a stream, parts of a frame may be too dark or too bright compared to the mastering display. But usually you would barely notice it unless someone entered some extreme values for a stream.
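As a toy illustration of why that content peak matters, here is a minimal Python sketch of a tone-mapping curve: luminance below a knee passes through untouched, and everything above it is compressed so the content peak lands exactly on the display peak. Real players and displays use far more elaborate curves; the knee position and the linear compression here are just assumptions to show the idea:

```python
def tone_map_nits(luminance, content_peak, display_peak, knee=0.75):
    """Map a luminance value (nits) from the content's range into the display's range.

    Toy curve for illustration only, not what any particular player or display does.
    """
    if display_peak >= content_peak:
        return luminance  # the display can show everything the content contains

    knee_nits = knee * display_peak
    if luminance <= knee_nits:
        return luminance  # leave shadows and midtones alone

    # Compress [knee_nits, content_peak] into [knee_nits, display_peak]
    t = (luminance - knee_nits) / (content_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

# A title mastered with 4000-nit highlights on a 700-nit display:
print(tone_map_nits(300, 4000, 700))    # 300 - passes through unchanged
print(tone_map_nits(4000, 4000, 700))   # 700.0 - mapped onto the display peak
```

Without the metadata, the player or display has to guess the content peak, which is one way you end up with scenes that look too dark or too bright.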

Your display will display the colors wrong in any case compared to the mastering display. But that is expected; we are talking consumer devices here. Same as in the SDR world: some are good, some are worse, but no consumer device was ever capable of displaying a frame like a professional display.

Note that decoding this metadata is optional per the specs for Phase A devices. Similarly for rendering. Phase A devices need to be able to process HDR10 either with or without metadata.

As said, most displays don't care, and manufacturers have their own processing implemented across their lines depending on what displays they use.

Just don't get too hung up on it. The metadata is only important if you have a top-shelf display that you have calibrated and want to get as close to the mastering display as possible. It's just icing on the cake.

In reality all displays do a ton of post-processing, as the displays are not even close to the HDR10 specs - there is no standard here besides accepting the signal.

Projectors might be more sensitive to this information, as they require adjustments in the peaks due to the nature of being a projector, and may use that metadata for their own color transformation.

But sure, passing the data through in case a display uses it can never hurt, even though the specs don't even require it - and having it will be no magic pill. Your display has the same limitations as before. It might kick in some other processing, though, to cover up the weaknesses.

Good info. I can't comment on the specifics as I am not familiar with them, but Lumagen have introduced a feature on their processors called intensity mapping, which adjusts the picture based on the mastering level of the source, hence my query. For example, I found Fantastic Beasts and Pacific Rim to be so dark in places that it was difficult to see any detail at all in dark scenes, whereas other discs were fine. This was because the original calibration catered for the disc that was used at that time. My calibrator came back and calibrated against multiple discs, using the maximum nits the discs were mastered at (I think - he mentioned 2000 and 10000 nits) as the benchmark. So I now have 3 calibration settings, one of which will kick in based on the level supplied by the metadata. Without it, it will default to the lowest setting.
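If it helps to picture it, that profile switch boils down to something like the sketch below. The profile names and thresholds are invented for illustration (and the Lumagen's intensity mapping is obviously far more sophisticated than a three-way switch):

```python
def pick_profile(mastering_max_nits=None):
    """Choose a calibration profile from the mastering display's max luminance.

    Hypothetical names and thresholds; falls back to the lowest setting
    when no metadata is supplied, as described above.
    """
    if mastering_max_nits is None or mastering_max_nits <= 1000:
        return "profile_1000"
    if mastering_max_nits <= 2000:
        return "profile_2000"
    return "profile_10000"

print(pick_profile())        # profile_1000 (no metadata -> lowest setting)
print(pick_profile(4000))    # profile_10000
```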

Wow! Quite complicated then.

I won't worry as you suggest :slight_smile:

Thanks

-Mat

Well, we could throw Dolby Vision, HDR10+ or various implementations like LG's "Active HDR" into the mix. It's a lot of techno babble.

It all comes down to: processing HDR signals vs. actually displaying them.

And then LED vs. OLED of course and various other things.

As said, if you enjoy your picture, everything is fine. Get your display calibrated in any case. And if you have obvious issues, switch profiles manually if needed.

But one thing remains: if someone shows you the shortcomings of your HDR display, you will never be able to unsee them. Even worse, once you have got used to OLED displays, LED (especially edge-lit) will just be horrible for you (yes, even your trusty old top-shelf SDR one).

Anyway, I don't know what is needed to make the Vero pass through the metadata.

If @sam_nazarko could be a bit more specific, maybe some people would be willing to help get this enhancement started. Also with emphasis on Dolby Vision and HDR10+, which allow the metadata to change during a video stream.

Sadly I do not currently own an HDR display (I often screen at a friend's on his great LG OLED) - or else I would jump in and help out with some enhancement for the future.

In any case, some elaboration on the current limitations, and on what can be overcome in software, would be highly welcome for those a bit more interested in the technical side. You never know who might start to work on a pull request :smile:

We need to fix banding first, then we can develop things further


I have an LG B6 OLED, having come from an edge-lit Panasonic, and I could never go back to LED now :slight_smile:

It's not calibrated professionally but even just with my tweaking it's the best display I've ever owned by far.

Thanks for all the info - very interesting

-Mat

Hey Sam, I know banding is still being fixed, so there's no movement in this space yet. However, given the way you offered the ability to manually set the rounding value as a workaround to help test and resolve the banding problem, would it be possible to expose MaxCLL/MaxFALL in the same way, i.e. so we can manually send a value for each of them for the playing movie? Even if we have to turn their decimal values into something else first, hex or whatever.

It would help determine whether there is actually a benefit for some of us or not; i.e. I believe it will help my projector tone map, but being able to tweak the values on the fly would prove that one way or another! Even if not yet, maybe after banding is fully resolved? Just a thought :slight_smile:
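For anyone wondering which values they would actually punch in, the static metadata can at least be read off a file with ffprobe. A rough Python sketch; it assumes a reasonably recent ffprobe build, and the side-data field names used below (max_content, max_average, max_luminance) may differ between versions:

```python
import json
import subprocess

def read_hdr_metadata(path):
    """Pull HDR10 static metadata from the first video frame via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_frames", "-read_intervals", "%+#1",
         "-print_format", "json", path],
        capture_output=True, text=True, check=True,
    )
    result = {}
    for frame in json.loads(out.stdout).get("frames", []):
        for sd in frame.get("side_data_list", []):
            if sd.get("side_data_type") == "Content light level metadata":
                result["MaxCLL"] = sd.get("max_content")
                result["MaxFALL"] = sd.get("max_average")
            elif sd.get("side_data_type") == "Mastering display metadata":
                # Luminance comes back as a rational string, e.g. "10000000/10000"
                result["mastering_max_luminance"] = sd.get("max_luminance")
    return result

# print(read_hdr_metadata("/path/to/movie.mkv"))  # path is a placeholder
```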

Hi,

A solution for banding was presented a while ago and will be in the next automatic update. See HDR Banding issue - #182 by MikeDelta.

There's no ETA on this at this time unfortunately.

Sam

Yep, sure - I'm very aware of the HDR banding workaround, as I've been using it for some time. Glad that the auto fix is almost there.

My point was more that the approach of getting the community to try out the banding fix via manual settings while running a video led to a quick consensus on it working.

If ultimately the same approach could be taken here at some point, it might decide whether it's worth it and expedite an outcome.

Yes - it will take a bit of study to work out how to pass these values, however. We may also need some support added to Kodi, although I believe this is being worked on.