Given the welcome HDR improvements over the last few updates, one thing remains that is perhaps slightly less than ideal: how I typically have to change picture settings between HDR and SDR content. In my case, owning a Sony 55” XE9305, when HDR content arrives on an HDMI port the set pushes brightness to maximum but leaves contrast as it is - mostly because it doesn’t maintain separate HDR and SDR settings for each picture type (custom, standard, movie, etc). When the December update rolled out, it was this behaviour that made me think HDR playback hadn’t improved - it was just my contrast remaining untouched.
Anyway, I was wondering whether it would be possible for OSMC to signal a different (configurable?) HDMI content type when playing back HDR content. On devices such as the rPi, this is achieved via the edid_content_type setting. I’m unaware as to whether the AMLOGIC libraries used by OSMC on the Vero 4K allow this to be changed, though.
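For reference, on the Pi it’s a one-line config.txt entry, something like this (values as I remember them from the Pi documentation, so treat as indicative):

```
# /boot/config.txt on a Raspberry Pi
# 0 = no data, 1 = Graphics, 2 = Photo, 3 = Cinema, 4 = Game
edid_content_type=3
```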
If I could specify the content type number to be used when playing HDR content, I could have my TV automatically change to a suitably adjusted picture preset when necessary.
Is there any mileage in this as a request for OSMC?
Indeed. But as OSMC knows when HDR content is playing, and it controls aspects of HDMI setup for playback, it’s in a position where it could change the content type attribute when configuring the output parameters.
For example: OSMC could leave the content type at 0 (none) in all cases apart from when HDR video is being played, in which case it would set it to a different content type (eg: 3 - indicating a mode of “Cinema”).
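Something like this, in rough C pseudocode (all the names here are made up for illustration - this isn’t actual OSMC code - and the numbering follows the Pi’s edid_content_type convention, where 0 = none and 3 = Cinema):

```c
#include <stdbool.h>

/* Sketch of the proposed behaviour - not actual OSMC code. The
 * set_hdmi_content_type() hook is made up; it stands in for whatever
 * the AMLOGIC driver would need to expose. */
enum content_type {
    CT_NONE   = 0,  /* as today, for SDR */
    CT_CINEMA = 3,  /* signalled only while HDR video plays */
};

void set_hdmi_content_type(enum content_type ct);  /* assumed hook */

static void on_playback_start(bool is_hdr)
{
    set_hdmi_content_type(is_hdr ? CT_CINEMA : CT_NONE);
}

static void on_playback_stop(void)
{
    set_hdmi_content_type(CT_NONE);  /* revert when playback ends */
}
```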
Yes. But allow the user to specify the content type to use for HDR rather than hard-wiring it. And leave SDR as a content type of None - as it is right now.
I’d expect that in the absence of edid_content_type in config.txt, it defaults to 0 - indicating a content type of NONE. But unfortunately I’m no longer in possession of my OSMC/Pi setup, so I can’t verify this.
There’s likely a register we can poke to do this. Documentation is key.
I upgraded my projector recently and sometimes have issues switching between HDR and SDR. I think it’s a bug in the projector itself, but this might contribute to a workaround.
It looks like the content type is just encoded as two bits in the AVI InfoFrame packet (in the same packet byte as the YCC quantization range bits used to select limited/full range).
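To make that concrete, here’s a rough C sketch of how that byte (Data Byte 5 in CEA-861 terms) is packed - the helper name is made up for illustration. Note that the raw CN field only means Graphics/Photo/Cinema/Game when the ITC flag elsewhere in the InfoFrame is set, which is presumably why the Pi’s edid_content_type values are offset by one (0 being reserved for “no data”):

```c
#include <stdint.h>

/* AVI InfoFrame Data Byte 5 layout (CEA-861):
 *   bits 7-6: YQ1 YQ0 - YCC quantization range (limited/full)
 *   bits 5-4: CN1 CN0 - content type (with ITC=1:
 *             0 = Graphics, 1 = Photo, 2 = Cinema, 3 = Game)
 *   bits 3-0: PR3-PR0 - pixel repetition factor
 * avi_data_byte5() is an illustrative name, not a real API. */
static uint8_t avi_data_byte5(uint8_t yq, uint8_t cn, uint8_t pr)
{
    return (uint8_t)(((yq & 0x3) << 6) | ((cn & 0x3) << 4) | (pr & 0xf));
}
```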
Actually, this tweak would be a poor band-aid for the real problem.
HDR10 was supposed to make it so that what you see on your screen is as close as possible to what the director saw on his. But (a) each director/grader/studio has different ideas about how to encode the content, and (b) each TV manufacturer has different ideas about how to display something which was mastered with a bigger gamut than domestic TVs can show. There is no standard for that, afaict, and many manufacturers seem to be getting it wrong.
Really, the right way to tackle (a) is to store settings for each title, as they do seem to vary. The only way to do this atm is via the brightness and contrast settings. With those, you should also be able to compensate for (b) if the TV manufacturer’s choice is not to your liking. Have you tried that?
Much as you now set the YQ bits in drivers/amlogic/hdmi/hdmi_tx_20/hw/hdmi_tx_hw.c as part of the limited/full-range configuration, I believe that the CN bits define the 2-bit content type (defined in the HDMI 1.4b specification, in Table 8.2 - as part of PB5) that’s sent as part of the AVI InfoFrame packet.
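In other words, the change could mirror the existing YQ handling. A purely hypothetical sketch - the accessor names below are invented, and I haven’t checked what hdmi_tx_hw.c actually exposes:

```c
/* Hypothetical sketch only - the read/write accessors are assumptions,
 * not taken from hdmi_tx_hw.c. This works at the level of the on-wire
 * InfoFrame byte: CN occupies bits 5:4 of Data Byte 5, alongside YQ in
 * bits 7:6. */
#define AVI_CN_SHIFT 4
#define AVI_CN_MASK  (0x3 << AVI_CN_SHIFT)

unsigned int read_avi_data_byte5(void);       /* assumed accessor */
void write_avi_data_byte5(unsigned int b5);   /* assumed accessor */

static void hdmitx_set_content_type(unsigned int cn)
{
    unsigned int b5 = read_avi_data_byte5();

    b5 &= ~AVI_CN_MASK;                  /* clear the current CN bits */
    b5 |= (cn & 0x3) << AVI_CN_SHIFT;    /* set the new content type */
    write_avi_data_byte5(b5);
}
```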
Apologies, but I’m not completely sure what you are suggesting… Are you saying that the brightness/contrast values used during decode (ie: on the Vero) should be stored/tuned on a per-title basis to deal with things correctly? Ignoring the practicalities of storing per-title settings somewhere, increasing contrast/brightness during decode isn’t going to change the actual contrast/brightness of my panel. So HDR content would still be clamped to the panel’s SDR contrast/brightness settings and, as a result, look dark.
I guess my suggestion was a mechanism whereby the panel properties could be calibrated (in a sense) by the user, via the picture mode settings. In my case, on an XE9305, both Netflix and Amazon Prime show HDR content using a specific picture mode (which is not available for manual selection by the user)… so having one of the picture modes hold equivalent settings that could be used - and automatically selected - for externally played HDR titles (ie: from the Vero 4K) seemed like it could be useful.
Kodi allows you to store contrast and brightness settings per title via the context menu while you are playing a video, and it keeps these values in its database. Those settings will affect what you see on screen, but your TV will still process the signal as HDR (ie: with a PQ EOTF). Some users have reported here the settings they find work for them (with different TVs from yours).
I understand what you are suggesting. Just mentioning what you could try with the existing options.