I’m enjoying my new Vero 4K+, and although it works like a dream compared to my old Windows HTPC, I’m experiencing some slight color banding issues. Specifically, I’m running some test patterns (I’m using these from AVSForum: https://www.avsforum.com/forum/139-display-calibration/2943380-hdr10-test-patterns-set.html) where banding is very noticeable in the greyscale ramps. Running the exact same files directly off the TV with the same settings profile gives me (almost) perfectly smooth gradients, so I’m pretty sure it’s the Vero that causes this (or a combination with my TV, an LG OLED C8).
In real-life movie watching, though, it’s not very noticeable, but I have seen some bad gradients, for example in the intro scene of The Revenant.
I’m really not sure what could be the cause of this, so I’ve uploaded the logs according to the Wiki; hopefully someone who knows this better than me can tell me something useful from them? (https://paste.osmc.tv/ukayohoger). The log contains only a reboot and starting the file in question. Also, when debugging is enabled, the video is not displayed correctly at all; everything looks like a sort of “crashed computer” screen, I’ll add the screenshot. With logging turned off it’s OK, but with serious banding.
First screenshot is the “broken” playback when debugging is enabled:
I’ve seen one “Force 4:2:0 Chroma… something” option in the menu, and since it mentioned LG TVs I enabled that one as a test too, but with the same results. I’ve uploaded logs from that too: https://paste.osmc.tv/ukayohoger
I’m hoping someone can tell me what’s wrong from these debug files; I have run out of ideas.
There was, but I connected the Vero directly for this test, as I thought that might be the problem. I have not tried a second HDMI cable though (I’m using the one I got with the Vero); I might try another cable.
I’ve also tried some other predefined settings on the TV in case something was wrong with the one I’m using, but that did nothing either.
My Vero is connected to the HDMI 2 port, which has settings for “Full UHD Color” and “Full Chroma 4:4:4”; both are off by default. After turning both on, there was a visible difference in picture quality.
There is no setting for “HDMI ULTRA HD Deep Color”
I think that is the setting on your TV that enables UHD input, which is what you are looking for. Your TV also appears to have a simulated UHD mode that it applies to regular content by default, and this may be what you’re looking at. If you lose the signal when you enable that, it is likely because you’re going through an AVR that does not support it, or your HDMI cables are not compatible with the higher bandwidth necessary.
Well, I’m using the supplied HDMI cable shipped with the Vero. I would surely hope they supply a good enough cable? I’ve also tried a second cable, but I don’t know whether my second one is good enough as it’s an older cable.
It’s also connected directly to the TV, so no AVR in between.
I am pretty sure it’s proper UHD though. The UHD HDR movies I’ve watched so far have been excellent quality, way too good to be upscaled in any way. And the HDR is processed by the TV too, so everything in that regard seems to work.
Following up on my previous post: I’m not saying there is nothing wrong with my setup; you might be right. I tried what’s suggested there, enabling “HDMI Ultra HD Deep Color” and setting the input port to “PC” to enable full 4:4:4 (apparently).
Now something even stranger happened. The Vero plays the video pattern seemingly better and more smoothly; however, it has cut the screen into four and only displays the video in the upper quarter:
It looks like you might be right about that, the banding got a lot worse with these options turned on, so I’m reverting back to my starting point again.
Having said that, those options are the only ones I can’t set when playing from the internal player in the TV (since those settings are linked to the HDMI input ports), and from the internal player the playback works just fine, with OK gradients. The TV even streams the video file through the Vero’s DLNA-enabled media server, so it is the exact same file.
I may not have been clear. You definitely don’t want to tag the input as “PC” but you definitely must turn on “HDMI Ultra HD deep colour” for the input in question.
That is not a safe assumption. Make sure you’re using a cable which is explicitly certified as “Premium High-Speed” and supplied with the appropriate bar-code/hologram label. (They’re not very expensive).
A poorer-quality cable can’t be causing banding, but it certainly can cause signal drop-outs and a black screen. Most likely the banding is being caused by the “deep colour” setting being turned off; but in order to turn the setting on and get a viable signal you may need a better cable.
Yes, I have already used those settings from rtings, and adjusted the contrast and brightness specifically after this. I saved the picture profile and used the exact same profile when playing the video off the TV, so I think that should be in order.
Thanks for the tip. I have tested both, of that I am sure; however, I may have made a mistake when I uploaded the logs, by the looks of it. I will double-check when I get back home from work today.
I think you might be onto something; it makes sense at least, as only the “Ultra HD Deep Color” setting makes me lose the signal (it is also a setting I have no control over when using the built-in player in the TV). I will go ahead and order a new premium HDMI cable to see if this solves the issue. If it does, this will surely be embarrassing for me.
Well, we’ll see, but it would account for the symptoms.
“Deep colour” means a 10- or 12-bit signal. If that setting is turned off, the television’s EDID (information that it sends to the source to tell it what type of signal it is compatible with) will say “I understand 8-bit signals, but not 10 or 12”. So the Vero would then output an 8-bit signal, and the image would show banding.
With the setting turned on, the TV advertises 10-bit compatibility, and the Vero then tries to correctly output 10-bit colour, but it is unable to establish a stable HDMI connection with the TV for the higher bit-rate signal.
(That’s my theory, anyway).
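To put some numbers on that theory, here’s a small illustrative sketch (my own, not anything from the Vero’s actual pipeline): quantising a smooth greyscale ramp to 8 bits leaves only 256 distinct levels, while 10 bits gives 1024, so the 8-bit version has steps four times coarser, which is exactly what shows up as banding in a ramp pattern.

```python
# Illustrative only: count the distinct grey levels a smooth ramp keeps
# after being quantised to a given bit depth.
def quantise(ramp, bits):
    """Quantise values in [0, 1] to integer codes of the given bit depth."""
    levels = (1 << bits) - 1
    return [round(v * levels) for v in ramp]

# A fine-grained greyscale ramp, like the AVSForum test patterns.
ramp = [i / 4095 for i in range(4096)]

steps_8 = len(set(quantise(ramp, 8)))    # 256 distinct levels -> visible bands
steps_10 = len(set(quantise(ramp, 10)))  # 1024 distinct levels -> much smoother
print(steps_8, steps_10)  # prints: 256 1024
```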
Hopefully a new cable will sort out the latter problem; I’m quite certain that the TV will not display HDR correctly if the setting isn’t turned on. (I own an LG G6, which has a similar setting).
The piece of advice about turning on the “force 422 colour sampling” is also a good one, from a signal-stability perspective. In theory any TV should be able to handle either 444/10-bit/24Hz or 422/12-bit/24Hz equally well, but in practice some are a little more comfortable with the latter.
(If the television is particularly twitchy, you may need to turn that “force 422” setting off when playing 50 or 60Hz HDR video, but hopefully not).
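Some back-of-the-envelope arithmetic (my own illustration, ignoring HDMI blanking and encoding overhead) shows why 4:2:2/12-bit can actually be easier on a marginal link than 4:4:4/10-bit, despite the higher bit depth: halving the chroma resolution drops the per-pixel payload from 30 to 24 bits.

```python
# Rough, illustrative raw pixel-data rates for 3840x2160 @ 24 Hz HDR video.
# Real HDMI link rates are higher because of blanking and encoding overhead.
def data_rate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

# 4:4:4 10-bit: three full-resolution components at 10 bits each.
bpp_444_10 = 3 * 10    # 30 bits per pixel
# 4:2:2 12-bit: full-resolution luma plus half-rate chroma, 12 bits each.
bpp_422_12 = 12 + 12   # 24 bits per pixel

print(data_rate_gbps(3840, 2160, 24, bpp_444_10))  # ~5.97 Gbit/s
print(data_rate_gbps(3840, 2160, 24, bpp_422_12))  # ~4.78 Gbit/s
```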