I’m running a multi-client environment with a mix of HDR and SDR displays, and I’ve been looking again at the Vero’s HDR-to-SDR conversion on the latest build. In spite of improvements compared to the past, the Vero still undershoots the contrast on quite a lot of titles. Adjusting contrast via the GUI during playback fixes it, but I wondered whether there are any ongoing efforts in this area? I had a play with the new display settings and that did not fix it. I also have a Shield, and it produces a much more dynamic SDR conversion, generally more watchable, though it can get caught out the other way, e.g. it can be too “hot”. So I’m just interested to know if this is still an active area for the Vero.
So are you after something more than that?
That suggests there’s no one size fits all setting.
Well, I just wondered if this is still being worked on, in case a global preference could be set.
The only global preference atm is the display maximum luminance. I was hoping to get some more feedback from testers so that if there was a consensus (say, everyone finds setting contrast to 55% gives the best results) we could bake that in.
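For anyone unsure what the contrast slider actually does to the picture, here is a minimal sketch in Python. It assumes the usual Kodi slider convention (50% is neutral, 0–100 range) and models contrast as a simple linear gain around mid-grey; the Vero’s real conversion pipeline is more involved than this, so treat it purely as an illustration of why a few points above 50 lifts the punchiness of the image.

```python
def apply_contrast(y, contrast_pct):
    """Scale a normalised luma value y (0.0-1.0) around mid-grey.

    Assumes the GUI slider convention: 50 is neutral, values above 50
    steepen the curve. Output is clipped to the displayable range.
    """
    gain = contrast_pct / 50.0
    out = (y - 0.5) * gain + 0.5
    return min(1.0, max(0.0, out))

# At the neutral setting nothing changes; at 55% a bright tone
# is pushed slightly brighter, and extremes clip cleanly:
apply_contrast(0.8, 50)   # unchanged
apply_contrast(0.8, 55)   # ~0.83, a touch brighter
apply_contrast(1.0, 80)   # clipped to 1.0
```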
Then, longer-term, I may find time to add support for user-defined custom LUTs.
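For context on what a custom LUT buys you over a single contrast number: instead of one global gain, a LUT lets the user define the whole tone curve point by point. A minimal 1D sketch in Python (the names and the 5-point table here are invented for illustration; a real implementation would likely use a 3D LUT in a standard format such as .cube):

```python
def apply_lut_1d(y, lut):
    """Look up a normalised value y (0.0-1.0) in a 1D tone-curve LUT,
    linearly interpolating between the nearest table entries."""
    pos = y * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A 5-point curve that lifts mid-tones while pinning black and white:
lut = [0.0, 0.3, 0.55, 0.8, 1.0]
apply_lut_1d(0.5, lut)   # 0.55 - mid-grey lifted above neutral
apply_lut_1d(0.0, lut)   # 0.0  - black point untouched
```

The appeal is that each user can shape shadows, mid-tones and highlights independently, which a single contrast preference cannot do.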
I have quite a collection now of titles with both HDR and SDR MKVs, and having gone back and forth I think a contrast setting in the region of 55% would work as a good default (I would go for 56), but others will have their own view. At the moment I only have one Vero on my network, so I’m assuming that title-specific contrast settings applied via the Vero will not affect other devices through the shared database. For example, Kodi on the Shield does not have an in-play contrast adjustment that I can find, but I have not yet ruled out that this sort of knock-on effect could happen, since I want the adjustment to be entirely local.
Thanks for that. The brightness and contrast settings are kept in the database, so if that’s shared it will apply to all devices. If the Shield respects that setting it will implement it differently from the way we do on Vero.
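To make the sharing concrete: per-video tweaks end up as rows in the shared video database, so any client pointed at that database can read them back. A minimal sketch using an in-memory SQLite stand-in (the real MyVideos `settings` table has many more columns; the `idFile`, `Brightness` and `Contrast` column names are assumed from Kodi’s schema and should be checked against your actual database):

```python
import sqlite3

# Throwaway stand-in for the shared video database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE settings ("
    "idFile INTEGER PRIMARY KEY, Brightness REAL, Contrast REAL)"
)
# A title whose contrast was nudged up during playback:
conn.execute("INSERT INTO settings VALUES (1, 50.0, 56.0)")

def saved_contrast(conn, id_file):
    """Return the stored per-video contrast, or None if no override exists."""
    row = conn.execute(
        "SELECT Contrast FROM settings WHERE idFile = ?", (id_file,)
    ).fetchone()
    return row[0] if row else None

print(saved_contrast(conn, 1))  # 56.0
print(saved_contrast(conn, 2))  # None - no per-video override saved
```

This is why the same stored value can look different on two clients: each player decides for itself how to interpret the number it reads back.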
Based on limited testing (e.g. setting contrast to 0% on the Vero) it does not look like my Shield is affected by database contrast values, so it may be safe for me to adjust on the Vero. While the Shield is sometimes excellent with its SDR conversion, it can err on the bright side and sometimes even burns out the highlights quite badly, whereas the Vero is more conservative but at least has tuning options.
I have two Veros. One is on a TV with HDR, and one is on an older TV without it. They would need to handle the video differently. I imagine that any time someone has more than one TV, there’s a high chance that each TV would require different tuning to look right. I don’t know whether those differences could be handled just by a global (local) setting, with a video-specific setting acting as a small offset on top of it.
I’m not sure we can design a perfect system with a matrix of settings for each display and video, and it might confuse people.
I propose we just disable those video-specific settings when playing HDR to an HDR display. That would fit with our policy of passing HDR straight through and leaving the interpretation to the display.
I like the idea of full bypass for HDR to HDR. Would work well in a mixed environment of display types when the database is shared.