I’m having an issue where my TV doesn’t recognize the incoming signal and shows a green/pink screen (“video format not supported”). After several attempts (stopping the movie and starting it again), the signal gets through and works fine. This happens with almost all movies.

My setup is: VERO4K+ connected to a soundbar, with passthrough to the TV.

I’m pretty sure the culprit is the soundbar, because if I connect the VERO4K+ directly to the TV I don’t have the issue; everything works fine. However, I have to stay with the current setup: the TV is not eARC capable, so to get HD sound the soundbar must sit between the VERO4K+ and the TV.

I remember this occurred around the same time the VERO4K+ was upgraded to version 19 (and the TV also did an update). On the previous version of the VERO4K+ I didn’t have this issue.

The TV is a Philips PUS7304 (4K, about three years old); the soundbar is a Samsung.
Thanks for the tip. I enabled this option and rebooted, but unfortunately it didn’t help; it was still the same.

Then I tried enabling the option “Force 422 colour subsampling”, and that solved the problem! No more problems passing any content through to the TV. However, does this option cause any picture quality degradation? What does this subsampling actually do to the output video?
The video in the original file is virtually always in 4:2:0 format, meaning that only one pixel in every four actually has information about both its brightness (luminance) and its colour; the other three only have brightness information, and not colour. To display the pixels on the screen, at some point you have to generate colour information for the other pixels, which you do by intelligently guessing, i.e. by interpolating between the pixels that you actually do have colour data for.
If you send 4:2:0 to the TV, then it’s the TV that generates (guesses) all the missing pixels’ colour information. If you send 4:4:4 then it’s the source that generates (guesses) the missing colour data. If you send 4:2:2 then the source is generating colour data for one of the missing pixels while the TV generates it for the other two.
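If it helps to see the arithmetic, here’s a toy sketch (my own illustration, not code from any real decoder) of how many chroma (colour) sample pairs each format actually carries for a given image size:

```python
# Toy illustration: how many (Cb, Cr) colour sample pairs are stored
# for a width x height image under each subsampling scheme.

def chroma_samples(width, height, fmt):
    if fmt == "4:4:4":   # every pixel has its own colour sample
        return width * height
    if fmt == "4:2:2":   # colour halved horizontally only
        return (width // 2) * height
    if fmt == "4:2:0":   # colour halved horizontally and vertically
        return (width // 2) * (height // 2)
    raise ValueError("unknown format: " + fmt)

# For a 4K frame:
for fmt in ("4:2:0", "4:2:2", "4:4:4"):
    print(fmt, chroma_samples(3840, 2160, fmt))
```

For any 2×2 block of pixels that works out to 1, 2 and 4 colour samples respectively, which is where the “one missing pixel vs. the other two” split comes from.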
So, assuming that the source and the TV both use the same algorithm to generate/guess the colour data, there won’t be any visible difference at all between the different signal formats. If source and TV use different approaches, then there may be extremely subtle differences in the final picture depending on which device does the work; but which will actually look better to you is likely to be subjective (and you quite likely won’t notice anyway).
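To make the “different algorithms” point concrete, here’s a hypothetical sketch of two simple ways a device might fill in the missing chroma along one row of samples (the function names are mine, and real devices use more sophisticated filters, but the principle is the same):

```python
# Two different ways to regenerate missing chroma samples from a row
# stored at half resolution (as in 4:2:2). Different devices may pick
# different methods, which is why results can differ very slightly.

def upsample_nearest(row):
    """Duplicate each stored sample into the gap -- the crudest guess."""
    return [c for c in row for _ in (0, 1)]

def upsample_linear(row):
    """Keep each stored sample, then average neighbours to fill the gaps."""
    out = []
    for i, c in enumerate(row):
        out.append(c)
        nxt = row[i + 1] if i + 1 < len(row) else c  # repeat at the edge
        out.append((c + nxt) / 2)
    return out

row = [100, 140, 120]
print(upsample_nearest(row))  # [100, 100, 140, 140, 120, 120]
print(upsample_linear(row))   # [100, 120.0, 140, 130.0, 120, 120.0]
```

Both outputs are plausible reconstructions of the same source data; the differences between them are small, which is why the choice of which device does the guessing rarely matters visually.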
Well, all I really said there was that, if everything is working correctly, selecting it or not selecting it probably won’t make any visible difference!
The reason why it’s there is that some devices (e.g. some displays or TVs, some AVRs) have problems correctly interpreting some kinds of signal. They shouldn’t - 444, 422 and (for 4K 50/60Hz signals) 420 are all perfectly standard signal formats that should be supported by everything. But nonetheless some devices prefer one or the other; so, if the signal is consistently not making it through to the TV properly - for example, the screen keeps going black in the middle of the video, you consistently get false colours, or you can’t see a picture at all - then switching to 422 will sometimes help.
So I think the standard advice would be something like “if the picture looks basically okay then you don’t need to turn this setting on; but if something very weird is going on with the picture, sometimes it might help”.