Whenever I play 2160p HDR files through Kodi, it's hit-or-miss as to whether playback is in HDR. I determine whether playback is HDR by the super-saturated contrast of the Kodi playback controls (I'm using the Unity skin for Kodi) - not to mention the high-contrast picture.
I’ve run the command `cat /sys/class/amhdmitx/amhdmitx0/config` to show output details for two HDR files I played in Kodi. Both are HDR, but only one (LOTR ROP, shown right in table below) shows on my TV in HDR. The file on the left does not display in HDR.
Is there any way to get all my HDR files to display in HDR, and not just some?
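For anyone trying the same check on a Vero/AMLogic box: that sysfs file reports what the HDMI transmitter is currently sending, so you can grep it mid-playback to confirm whether the device is actually outputting HDR. A quick sketch - the sample output below is made up for illustration, and the exact field names vary by kernel version, so check against your own device:

```shell
# Hypothetical capture of /sys/class/amhdmitx/amhdmitx0/config taken
# while an HDR10 file was playing (field names are illustrative only).
sample='VIC: 97 3840x2160p60hz
Colour depth: 10-bit
Colourspace: YUV444
EOTF: HDR10'

# On the device itself you would read the real file instead:
#   cat /sys/class/amhdmitx/amhdmitx0/config | grep -i eotf
echo "$sample" | grep -i 'eotf'
```

If the EOTF (or equivalent) line reports SDR while an HDR file is playing, the box isn’t switching modes at all - a different problem from the TV simply not showing an indication.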
With my Samsung I use the voice control on the Samsung remote and say “info”. It will then bring up the OSD banner indicating HDR or not. Works with my 3-year-old Samsung, but I don’t think it works on the 2-year-old one (the info display, that is).
No visual indication or message given on the TV. The only way I know it’s active is by the super-high contrast on the Kodi (Unity) skin, and the improved picture.
In the past few minutes I’ve also experimented by forcing SDR (as opposed to Auto select) on the HDR file (LOTR) that it normally works on, and can see that the Kodi UI now appears normal, with no HDR applied. The picture actually looks very dark in this case, which is different from the HDR files that don’t work - those appear with normal brightness, as per non-HDR files.
When I press the “info” button on the remote, the OSD shows (for an HDR-active file)
“HDR10+ DD+ UHD”.
For the other HDR file (which I describe as ‘not working’), the OSD shows “HDR DD+ UHD”.
Picture is fine, but not as high contrast, and the tell-tale sign of the Kodi on-screen play controls appearing normal, not super saturated, compared to files where HDR is active.
Not sure why I didn’t do this before, but I’ve reviewed 20 of my 4K files, and can now see that those files which show super-high contrast (causing the Kodi OSD to be super saturated) are all HDR10+ files.
My 4K HDR files don’t look as highly contrasted (the Kodi OSD looks normal), but peering into the TV picture settings shows contrast and brightness set to max (not settings I configured).
My 4K files (not HDR/HDR10+) look pretty much the same as my HDR files, except the picture settings are set to what I configured (not max).
Based on this I conclude that HDR is likely working, but the difference between SDR and HDR is just not noticeable (or at best very slight) on my TV.
The HDR10+ picture quality is very noticeable, and probably what I confused for HDR being active. Will do a bit more investigating, but I feel this is pretty much the case.
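If anyone wants to confirm which flavour a file is without playing it, ffprobe can show whether the first video frame carries HDR10+ dynamic metadata (it shows up as SMPTE ST 2094-40 side data). The sketch below parses a canned one-frame dump rather than a real file, and the exact side-data string is an assumption based on recent ffmpeg builds, so verify against your own output:

```shell
# Hypothetical one-frame dump, as produced by something like:
#   ffprobe -v error -select_streams v:0 -read_intervals "%+#1" \
#           -show_entries frame=side_data_list movie.mkv
# (the side_data_type string is assumed; check your ffmpeg version)
sample='[FRAME]
[SIDE_DATA]
side_data_type=HDR Dynamic Metadata SMPTE2094-40 (HDR10+)
[/SIDE_DATA]
[/FRAME]'

if echo "$sample" | grep -q 'SMPTE2094-40'; then
  echo 'HDR10+ (dynamic metadata present)'
else
  echo 'no HDR10+ dynamic metadata (static HDR10 at most)'
fi
```

Plain HDR10 files carry only static mastering metadata, so they won’t show this side data - which would line up with the observation that only the HDR10+ files trigger the obvious picture change.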
Thanks for reporting back. I checked out a few files here. With HDR10, the OSD on my Panasonic is marginally paler than when playing SDR content. My TV, like yours, winds up the backlight and contrast when playing HDR. Maximum nits is a poor 350 so they have to do that to make it look ‘better’ than SDR.
My TV doesn’t do HDR10+, but I’m aware the OSD can do funny things when playing HDR10+. Obviously, the TV can’t know what is an OSD signal and what is video, so it’s going to apply any corrections to both.
Sounds like your TV is using different picture settings based on input signal, and it is just a matter of setting them to your preference while playing different types of files. It also sounds like the HDR10+ default is set to be highly inaccurate, to make the picture pop when viewed in a store. My LG OLED came set this way and it was pretty gross looking IMO. I think most TVs nowadays have fairly decent calibrations when you switch them over to the movie/cinema colour mode.
Thanks for your responses guys. Appreciated!
It does seem that HDR is nothing more than preset picture settings, though with HDR10+ the contrast seems way beyond anything I can manually set up. Also, forcing SDR on an HDR10+ file alters the picture in a way that it doesn’t for plain HDR (i.e. the HDR10+ picture becomes very dark). It leads me to think there’s a lot more going on than just picture-setting config in the file.
For now though, I’ll stick with 1080p downloads, choosing 4K only where it has HDR10+.
Also make sure your TV’s HDMI settings are set to expanded for whatever port the Vero is connected to. My brother’s LG had to be changed to enable HDR properly from external HDMI sources (a Vero 4K+). I think Samsung calls this Input Signal Plus.
HDR10+ is also called HDR2. If you tag your filename with “.HDR2” Kodi will pick that up and show you the media flag for HDR2 vs HDR (if your skin allows).
No, it won’t (in general). Currently, Kodi doesn’t store HDR information in its library as skin-readable values, and thus there’s no “proper” way for skins to show media flags depending on this.
What you’re referring to here is an implementation based purely on the filename that has nothing to do with Kodi in general; it seems there’s one skin out there that uses the “.HDR2.” filename tag detection to show an HDR10+ label. An “HDR2” filename tag is in no way a standard, and tbh I’m glad it isn’t.
With the next Kodi version we’ll get a dedicated infolabel for skins to show HDR media flags properly (it won’t be able to differentiate between HDR and HDR10+ though as of the time of writing): https://github.com/xbmc/xbmc/pull/19983