I’ve actually checked this and can confirm that sending a 10-bit 444 signal when playing back SDR content should not cause any ill effects.
The device will send a 10-bit signal, but the additional bits will be zeros for SDR content. This will be ignored by the receiving device if it follows the spec. If things look different, then it’s possible that your display is performing some dithering.
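To make the padding concrete, here is a quick sketch of the arithmetic, assuming the 8-bit value sits in the top eight bits of the 10-bit sample with zeros in the low bits (i.e. a left shift by two); the values used are just the standard video black and white levels.

```bash
# Assumption: value_10bit = value_8bit << 2 (low two bits zero-padded).
echo $(( 16  << 2 ))   # 8-bit video black (16)  -> 64, the 10-bit black level
echo $(( 235 << 2 ))   # 8-bit video white (235) -> 940, the 10-bit white level
```

So the picture itself is unchanged; the same levels are simply carried in a wider container.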
This has been confirmed by some other users, who state that there is no problem playing SDR content when Vero 4K is sending a 10-bit signal. I believe that others would have noticed colour problems if this were a widespread issue.
However, I can see merit in outputting an 8-bit signal when Rec709 content is being played, if only because you would then be able to see how Vero 4K is interpreting the file and what the signal should be.
Hey Sam. You configured my box to send 10-bit for SDR as well. There is no problem when the box sends a 10-bit signal with SDR content, nor is there a problem when it sends a 10-bit signal with 4K HDR (BT.2020) content. The problem arises when it sends a 10-bit signal with 4K BT.709 content: in that situation the box flags the output as BT.2020 even though the content is BT.709. That's where the oversaturated colors and skintones appear. And again, bit depth has nothing to do with colorspace. You can play something that is in SDR upsampled to 12 bits, but the colorspace should remain BT.709 because that is how it's encoded. 4K files encoded in SDR show the problem I'm trying to illustrate.
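For anyone who wants to verify this at the box rather than at the TV, the HDMI transmitter can usually be queried for what it is currently sending. This is only a rough sketch: the sysfs node below is my assumption based on the Amlogic HDMI driver the Vero 4K uses, and the exact output format may differ between kernel versions.

```bash
# Dump the current HDMI output configuration (mode, colour format, colour depth).
# The path is an assumption; it may differ on other kernels/devices.
cat /sys/class/amhdmitx/amhdmitx0/config
```

If what is reported there doesn't match what the file should produce, that would line up with the oversaturation described above.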
This shouldn’t be happening.
Anyway – this is now being worked on and when I have some progress I will of course let you know. It definitely makes sense to add some GUI options to configure all of this.
This would be a great option, as I believe the file should always be output to the TV as it is encoded. Such content is pretty limited at the moment, but it will become more common as time goes on.
Thanks for looking into this Sam.
I’m a bit confused as a new owner… do I need to set the 444,10bit mode via the attr to get proper HDR playback, or should I leave the device configuration as is?
I can’t really say I see any difference when playing 4k HDR mkvs, and leaving the color space to auto on my TV seems to yield the same results as setting it to BT.2020 manually.
What I did notice was over-saturated colors when playing 8-bit 4K mkvs if I set 444,10bit, the same issue as the others. I could fix this by manually setting the color space on the TV, but using auto would throw it into BT.2020. Leaving the device as is seems to send the correct color space information and there’s no over-saturation.
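For reference, the "444,10bit" setting being discussed is written to the HDMI driver’s attr node. A minimal sketch of how that is done, assuming the usual Amlogic sysfs path; the setting does not survive a reboot unless it is re-applied, e.g. from rc.local:

```bash
# Force the HDMI output to YCbCr 4:4:4 with 10-bit colour depth.
# The sysfs path is an assumption based on the Amlogic HDMI TX driver.
echo '444,10bit' | sudo tee /sys/class/amhdmitx/amhdmitx0/attr
```

If you are not seeing oversaturation with the default configuration, there is probably no need to set this at all.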
Thanks for the help Sam. I’m looking at the output but don’t see any mention of color spaces. It only states “Traditional HDR: 0”, which I would presume would have to be 1 rather than 0, yet the TV switches to HDR mode when playing the appropriate mkv.
I must admit I don’t completely understand how the output of hdr_cap could be linked to outputting 10-bit or 8-bit color when in HDR mode.
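As far as I understand it, hdr_cap only reports what the attached display advertises in its EDID; it says nothing about whether the box is currently outputting 8-bit or 10-bit. A sketch of how to read it (the path is my assumption based on the Amlogic driver):

```bash
# hdr_cap lists the HDR support the display advertises in its EDID.
# It does not reflect the bit depth or colourspace currently being output.
cat /sys/class/amhdmitx/amhdmitx0/hdr_cap
```

“Traditional HDR” is just one of the EOTF flags from the EDID; HDR10 relies on the SMPTE ST 2084 flag, so a display can report Traditional HDR: 0 and still switch into HDR mode for HDR10 content.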
Deleted my advancedsettings.xml, rebooted and all seems well - no buffering (so far; I’ve only played 5-10 minutes of a few files).
However, I cleared the rc.local file and now my AV receiver (Marantz SR7011) no longer detects 10-bit. Before this change it reported 10-bit for everything, including 8-bit files.
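If the receiver reporting 10-bit for everything is the behaviour you want back, the line that was presumably in rc.local is the same attr setting shown earlier, run at boot. A sketch of what that file might look like (the sysfs path is again an assumption), with the command placed before the final exit 0:

```bash
#!/bin/sh -e
# /etc/rc.local - executed at the end of boot.
# Force HDMI output to 4:4:4 10-bit on every boot (sysfs path assumed).
echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr
exit 0
```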