Vero 4k+ HDR Optimizer?

fauxK displays are surprisingly good

Using a UHD-51A here, but unfortunately it’ll be a little while before I can use it to its full potential again as I am having some work done in the house.


Hi Sam, are you talking about a tone-mapping feature? I’m also very interested in this - has any progress been made since your last comment?

We want to get our new video stack out, and then we can dive into this.

Sam


Excited for this!

Waiting with fingers firmly crossed 🙂

Very pumped for this; I’ve been patiently waiting, knowing you guys would get to it when you could. Thanks for keeping this on the roadmap!!


Was just curious: with the new kernel implemented, is this something that is being considered? Thanks

The latest update already has tone mapping and a number of HDR improvements.

I didn’t know how that compared to the specific way the HDR Optimizer works, as my understanding was that it was a bit different from straight tone mapping, but I am no expert so just wanted to check. Thanks for the quick reply, Sam! Great work - love my Vero 4K+.

Just re-read this thread - can’t believe it’s 2 years long!

Sitrep on HDR under Kodi Matrix and kernel 4.9 on the Vero 4K:

  • for HDR sources feeding HDR displays we don’t implement any adjustments**
  • the basic tonemap for HDR->SDR is the same shape as it was under 3.14, but
  • with 4.9, use is made of the metadata (mastering display maximum luminance and/or MaxCLL) - previously a fixed value of 1000 nits was used (there’s a rough sketch of this after the list), and
  • there’s a setting which can be adjusted to match your display’s maximum luminance. This was added mainly to help projector users, but we haven’t had much feedback on it and the effect is quite subtle.
  • with 4.9, you can adjust the contrast upwards without blowing out the highlights completely. The existing Contrast setting is used for this so it can be saved ‘per video’.

** This was done so that users with two displays, one HDR, one SDR on a shared Kodi library could set the best value for SDR without changing the look on their HDR display.
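
To make the second and third bullets a bit more concrete, here’s a rough Python sketch of the kind of mapping involved. It is not the code we actually run - the function name, the Reinhard-style curve shape and the 100-nit SDR target are purely illustrative assumptions.

```python
def hdr_to_sdr_nits(nits_in, source_peak=1000.0, sdr_peak=100.0):
    """Map one HDR luminance value (nits) to an SDR one (sketch only).

    Under 3.14 the source peak was effectively fixed at 1000 nits; under
    4.9 it comes from the stream metadata (mastering display maximum
    luminance and/or MaxCLL). The curve here is a simple Reinhard-style
    roll-off and only stands in for the real tonemap shape.
    """
    x = nits_in / source_peak          # normalise to the source peak
    y = (2.0 * x) / (1.0 + x)          # soft roll-off, reaches 1.0 at x = 1
    return min(y, 1.0) * sdr_peak      # clamp and rescale to the SDR target
```

The point of reading the real source peak rather than assuming 1000 nits is simply that the whole curve then scales to the content instead of to a guess.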

Introducing new ‘per video’ settings to be saved in the library is to be avoided - we don’t want to mess with Kodi’s database schema. Kodi does now have a tonemap type and ‘parameter’ setting, but these are used very differently on PC from how we would want to use them, so again you have the problem of multiple Kodi clients/displays using the same database.

I don’t know exactly what magic Panasonic have. If their tonemapping is dynamic, that’s something that’s hard to do with our current Vero hardware.

Thanks for the detailed reply @grahamh, much appreciated. You mention a setting which can be adjusted to match your display’s maximum luminance - what setting is this? I have a projector (JVC X570) and would love to play around with it, and I can let you know how it works out.

Thanks

The setting is at Settings->System->Display->Display maximum luminance. At present, it has little effect but that will change with the next OSMC release.


Are you allowed to tell us what it will do?

And are there any plans to add an HDR → HDR tone-mapping function? This is something I would find quite useful, along with most other people who own a 2016 vintage TV (and probably a number of others too). 2016 TVs (with the exception of Samsung models) nearly all have broken tone-mapping for material mastered as 0-4000 nits - everything above 1000 nits just clips. It would be lovely to have the player tone-map that. The process could be similar to HDR->SDR tone-mapping, with a specified max luminance and perhaps some control over the tone-mapping curve (the way the Contrast setting works with SDR tone-mapping). I guess you might want to tweak the output metadata too.

The default curve will be brighter, since most comments in here are that HDR->SDR is too dark.

It’s difficult enough mapping from HDR (which, despite the intentions of its proponents, depends on the whim of the colourist) to SDR that’s reasonably predictable on a calibrated display. To cater for displays with rogue and unknown tonemaps - well, I can’t immediately think how that could be done. But never say never.

It doesn’t seem tremendously complex to me - but perhaps that’s because my understanding of HDR is faulty. Maybe you can correct me?

Let’s suppose we have a film with a MaxCLL value of 1500nits. Suppose that the maximum brightness of the display (small window) is about 700nits. We tell the player in the settings that the display max brightness is 700nits. The first thing that does is alter the output metadata, so instead of telling the TV that the film is 0-4000nits with a MaxCLL of 1500, it tells it that it’s 0-1000nits with a MaxCLL of 700. (Hopefully the display now won’t do any additional tone-mapping, because it thinks that the signal is entirely within the range that is physically achievable - but if not we can tweak that in a minute. At the very least, by always passing the same metadata we ensure the display is always using the same tone-mapping, regardless of what that might be.)
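
Roughly, that metadata step might look something like the Python sketch below. This is purely my own illustration - the field names echo HDR10 static metadata, but the structure isn’t any real API in Kodi or OSMC.

```python
from dataclasses import dataclass, replace

@dataclass
class Hdr10StaticMetadata:
    # Illustrative HDR10 static metadata fields, not a real API
    max_mastering_luminance: float  # nits, e.g. 4000
    max_cll: float                  # maximum content light level, nits
    max_fall: float                 # maximum frame-average light level, nits

def rewrite_for_display(meta: Hdr10StaticMetadata,
                        display_peak: float) -> Hdr10StaticMetadata:
    """Re-signal the stream as if it already fits the display, so the TV
    (hopefully) applies no further tone mapping of its own."""
    return replace(
        meta,
        max_mastering_luminance=1000.0,           # advertise a 0-1000 nit master
        max_cll=min(meta.max_cll, display_peak),  # e.g. 1500 -> 700
        max_fall=min(meta.max_fall, display_peak),
    )
```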

Now we tone-map 0-1500 nits into the range 0-700. I imagine this calculation would work much the same way as the HDR->SDR tone map does, except with a bit more headroom. So the curve would probably roll off gradually, and you could tweak its shape in the same way as you can with the SDR tone-map, using the Contrast setting.

Additional check: if the real MaxCLL for the movie is less than the specified max display brightness, then we (probably) want to disable tone-mapping automatically.

Now suppose the tone-mapping on the display is messed up, and it actually always tone-maps a 1000nit signal to 700nits physical brightness, even when the MaxCLL is very low. We can compensate for that by telling the player that the max display brightness is 1000nits instead of the real physical value (and possibly tweaking the shape of the curve a little too, with a default “Contrast” value).
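
Putting those steps together, here’s a very rough Python sketch of the player-side mapping I have in mind. The knee point, curve shape and names are all my own guesses, not anything that exists in OSMC.

```python
def hdr_to_hdr_nits(nits_in, max_cll, display_peak, contrast=1.0):
    """Roll luminance the display can't reach back into its range (sketch).

    max_cll      - real peak of the content, e.g. 1500 nits
    display_peak - what we tell the player the display can do; set it
                   above the physical value (e.g. 1000 rather than 700)
                   to compensate for a display that always tone-maps a
                   1000-nit signal down regardless of metadata
    contrast     - crude control over the curve, as with SDR tone-mapping
    """
    if max_cll <= display_peak:
        return nits_in                     # content already fits: pass through

    knee = 0.75 * display_peak             # leave everything below the knee alone
    if nits_in <= knee:
        return nits_in

    # compress the [knee, max_cll] range into [knee, display_peak]
    t = min((nits_in - knee) / (max_cll - knee), 1.0)  # position above the knee
    t = t ** contrast                      # <1 lifts highlights, >1 pulls them down
    return knee + t * (display_peak - knee)
```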

Getting it working optimally will inevitably require some trial and error on tricky displays; and in cases where the display tone mapping actually does what it should, you probably wouldn’t want to activate player tone mapping at all. But I think a lot of people would find some use for it. (As discussed previously, Panasonic UHD Blu-ray players have this ability, and it’s one of their major selling points.)

So, what am I missing? Is it just that you think the display’s own tone mapping is too likely to screw up what the player is doing, or is there some other factor as well?

Well, I suggest anyone with such a problem display turns off its HDR capability, or forces SDR output with the switch we have added, and tries our HDR->SDR mapping. IIRC someone in here is already doing that.

Won’t that give you banding because you’re dithering a 10-bit signal down to 8-bit? On a projector, that might be viable, because the maximum brightness it’s capable of isn’t that high, so you have a sharply reduced dynamic range anyway; but on a TV that can hit 750nits, it seems like that would be a problem.

We always output 10bit, if the display supports it.

Just tried this out. It looks better than I expected - not much visible dithering - but somehow very “flat” compared to true HDR. It might help if the “display max luminance” had a much higher maximum value. I would need 800nits at least, and you might need even more for brighter LCD screens - 1500, maybe.

I reasoned that no SDR-only device would be more than 350 nits, but for this use case it should go higher. I just need to check it doesn’t blow up if max nits is more than MaxCLL.
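
The guard I mean could be as simple as skipping the map when the setting already covers the content’s peak - a sketch only, using the same illustrative names as the earlier examples:

```python
def needs_tonemap(max_cll: float, display_peak_setting: float) -> bool:
    # If the configured display peak already covers the content's MaxCLL
    # there is nothing to compress, so skip the curve rather than
    # extrapolate it past its intended range.
    return max_cll > display_peak_setting
```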

Are you on staging? If not, don’t switch now - there are some things to sort out.