I know that the Vero devices don’t support DV playback and those files get played in HDR10 instead, at least my TV switches to HDR when playing DV files on the Vero. I am just wondering if that ends up with worse image quality vs a “native” HDR10 file? For example, if I’m looking at two movies like this:
“Dolby Vision is built on the same core as HDR10, which makes it relatively straightforward for content producers to create HDR10 and Dolby Vision masters together. This means that a Dolby Vision-enabled Ultra HD Blu-ray can also play back in HDR10 on TVs that only support that format.”
If I understand this correctly, every DV disc also comes with an HDR10 version for the sake of backwards compatibility. So if that’s the case it doesn’t really matter. But if I could pick a version I’d grab the HDR10 one, because if nothing else it’s a smaller file than the DV version.
@Snorefingers Copy the file onto a thumb drive / USB stick, plug that into your TV, and play the file with the TV’s built-in player. If you can’t tell the difference between DV and HDR10(+), just use non-DV files.
Interestingly enough, playing movies with my TV’s (LG CX) native Plex app gets me Dolby Vision for some movies (WEB-DLs), but others just play in regular HDR (remuxes). Perhaps a bitrate restriction, since that’s way lower on the WEB-DLs? I can’t really make heads or tails of it!
Using Plex is a clever workaround for this. Do you have transcoding in Plex enabled or disabled?
You can rule out bitrate as the source of the problem by testing with a remux that comes with DV. I’ll PM you a pic of the eight examples I found.
First off, “WEB-DL” is a term you should be careful about using, because it is linked to illegal downloading, which is not something we want to promote around here.
Second, the difference is unlikely to have anything to do with bitrate; it will be the difference between single-layer and dual-layer DV. The former is what’s used in streamed videos on (e.g.) Netflix or Disney+; the latter is what’s used on UHD Blu-ray discs. Many players can handle the single-layer variety, but far fewer can handle dual-layer.
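If you want to check which flavour a particular file is, recent ffprobe builds expose the Dolby Vision configuration record, and a quick script can read it out. This is just my own rough sketch (it assumes ffprobe is installed and on your PATH, and the exact fields reported can vary between builds and between BL/EL tracks): profiles 5 and 8 are the single-layer variants used for streaming, profile 7 is the dual-layer variant found on UHD Blu-rays.

```python
# Rough sketch, not authoritative: asks ffprobe for JSON stream info and looks
# for the "DOVI configuration record" that recent builds attach to the video
# stream's side_data_list. Assumes ffprobe is on PATH.
import json
import subprocess
import sys

def dovi_info(path):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    for stream in json.loads(out).get("streams", []):
        if stream.get("codec_type") != "video":
            continue
        for sd in stream.get("side_data_list", []):
            if sd.get("side_data_type") == "DOVI configuration record":
                return sd  # contains dv_profile, el_present_flag, rpu_present_flag, ...
    return None

if __name__ == "__main__":
    info = dovi_info(sys.argv[1])
    if info is None:
        print("No Dolby Vision metadata found (plain HDR10/SDR?)")
    else:
        # Profiles 5/8 = single-layer (streaming style), profile 7 = dual-layer (UHD BD style)
        print(f"DV profile {info.get('dv_profile')}, "
              f"enhancement layer present: {info.get('el_present_flag')}")
```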
Thank you, it was indeed related to the layers you mentioned and not to bitrate. It appears dual-layer MKV support is still in its infancy in the Plex/Kodi space in general, so for now the discussion can be considered closed.
There might be a different container format that would allow your TV to play the video - perhaps using the default video player rather than the Plex client. I seem to remember that most LG TVs from 2017 onwards can play dual-layer DV if it’s correctly remuxed to MP4 (although last time I checked that meant losing HD audio). My 2016 LG TV can’t do that, but my Oppo Blu-ray player can handle dual-layer DV in TS format, even though it can’t handle it as MKV.
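In case it helps, the remux itself is just a stream copy into a different container. Below is a rough sketch of how I’d script it with ffmpeg from Python - my own assumption, not something tested on every build: whether the DV metadata actually survives depends on your ffmpeg version, and HD audio tracks may be rejected by the MP4/TS muxer, which is where the “losing HD audio” part comes in.

```python
# Rough sketch: stream-copy remux of an MKV into another container (.mp4, .ts).
# Assumes ffmpeg is installed; DV metadata preservation is not guaranteed and
# depends on the ffmpeg build in use.
import subprocess
import sys

def remux(src, dst):
    cmd = [
        "ffmpeg",
        "-i", src,
        "-map", "0",              # keep all streams
        "-c", "copy",             # no re-encoding
        "-strict", "unofficial",  # some builds want this for DV video in MP4
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    remux(sys.argv[1], sys.argv[2])  # e.g. remux("movie.mkv", "movie.mp4")
```

If the muxer complains about a TrueHD or DTS-HD track, dropping or re-encoding that track is usually the workaround, at the cost of the HD audio mentioned above.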
I have found out about Realtek RTD1619DR-based devices that support Dolby Vision playback. Is there any chance we’ll see a new Vero model in the future using that or a similar chipset?
As @sam_nazarko has written before, it is not about having a chip that supports it; it is about how far OSMC would need to be locked down to support DV, and about the legal agreement with Dolby.
If I understand his responses correctly, it’s theoretically possible and a question of whether he wants/plans to do it, which is something I’m hoping to get a statement on. Also, the part about DV not being supported in MKV is outdated as far as I know, as there are dual-layer Dolby Vision .mkvs now.
I understand that there are reasons why it might be unlikely to happen, but, again, I was really hoping to read a response from Sam himself on where things currently stand and what the plans are going forward.
I was curious if that was still the case, as devices like the Zidoo Z9X offer DV support and do so, to my knowledge, without the expensive hardware or hacked license of the Oppo clone device. I gather from your response, though, that the above-mentioned talks with Dolby have ceased and that there are currently no plans for DV support via a new hardware device?