Doing fades that way is a very clever way of avoiding banding etc. - though I wonder how real-world editing workflows actually generate the video and RPUs.
**EDIT - aah - reading up on Dolby Vision mastering in Resolve Studio - the DV analysis detects fades/dissolves, by the look of it**
I don’t understand that. Are you saying that DV Profile 5 encodes get a re-grade - i.e. that by mastering at 4000 nits you mean the grading decisions are taken while monitoring in the 4000-nit domain (what the person making the creative decisions is watching and basing those decisions on)?
Or are you saying they are exported and tone mapped to 4000 nits from whatever the original mastering grade was monitored at?
This is all I know:
- for some reason, in Profile 5 an encoded value of 1023 corresponds to 4000 nits, not 10000.
- Level 6 in the dynamic metadata contains the max/min mastering display luminances, and all the clips I have say 4000 there.
Quite how these relate to each other, and how the metadata is generated if the colourist has a 1000-nit display, I’ve no idea.
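For what it’s worth, the 4000-vs-10000 question can be sanity-checked against the standard PQ (SMPTE ST 2084) curve, where code 1023 is defined as 10000 nits. A minimal sketch - the constants come from ST 2084, and the 4000-nit lookup is just to see where that luminance would land on a plain PQ scale (Profile 5’s reshaped IPT curve will differ):

```python
# SMPTE ST 2084 (PQ) constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Decode a full-range PQ code value to absolute luminance (nits)."""
    e = code / (2**bits - 1)            # normalised electrical signal, 0..1
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000.0 * y

def nits_to_pq(nits, bits=10):
    """Encode absolute luminance (nits) to the nearest PQ code value."""
    y = nits / 10000.0
    ym = y ** m1
    e = ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2
    return round(e * (2**bits - 1))

print(pq_to_nits(1023))   # 10000.0 - standard PQ peak
print(nits_to_pq(4000))   # 923 - where 4000 nits sits on a plain PQ scale
```

So if Profile 5 really does put 4000 nits at code 1023, its reshaped curve is not plain PQ - on a standard PQ scale 4000 nits only reaches code 923.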
Now I’ve started reading up on mastering DV content I’m sure I know less than I thought I did!
I can also see why broadcasters have standardised on HLG for live production in HDR…
https://professionalsupport.dolby.com/s/article/Quick-Start-Guide-Dolby-Vision-DaVinci-Resolve-Studio?language=en_US is interesting - it gives you some insight into the various levels of Dolby Vision metadata and how they relate to viewing etc.
As does this: https://professionalsupport.dolby.com/s/article/Dolby-Vision-Metadata-Levels?language=en_US - but I suspect you’ve read both already!
And now I get where the 11.5-bit vs 10-bit analogy comes from - it compares YCbCr with the DV ICtCp space, suggesting that YCbCr needs 11.5 bits to deliver the quality ICtCp delivers in 10: https://professional.dolby.com/siteassets/pdfs/ictcp_dolbywhitepaper_v071.pdf
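The gist of that white paper is that ICtCp decorrelates intensity from chroma better than YCbCr does, so the same code budget goes further. The conversion itself (BT.2020 RGB → LMS → PQ → ICtCp, per BT.2100) is just two matrices with a PQ non-linearity in between - a rough sketch using the integer coefficients from the spec:

```python
# BT.2100 RGB -> ICtCp (PQ variant). pq_oetf is the ST 2084 inverse EOTF.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_oetf(y):
    """Normalised linear light (0..1, where 1.0 = 10000 nits) -> PQ signal."""
    ym = y ** m1
    return ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2

def rgb_to_ictcp(r, g, b):
    """Linear BT.2020 RGB (0..1) -> ICtCp components."""
    l = (1688 * r + 2146 * g + 262 * b) / 4096
    m = (683 * r + 2951 * g + 462 * b) / 4096
    s = (99 * r + 309 * g + 3688 * b) / 4096
    lp, mp, sp = pq_oetf(l), pq_oetf(m), pq_oetf(s)
    i = 0.5 * lp + 0.5 * mp
    ct = (6610 * lp - 13613 * mp + 7003 * sp) / 4096
    cp = (17933 * lp - 17390 * mp - 543 * sp) / 4096
    return i, ct, cp

# A neutral grey lands entirely on the I (intensity) axis:
print(rgb_to_ictcp(0.5, 0.5, 0.5))  # Ct and Cp are 0 for greys
```

Note this is the generic BT.2100 ICtCp, not the Profile 5 IPTPQc2 variant with its private reshaping - Dolby doesn’t publish that.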
Welcome to Wonderland. Profile 5 sits behind a couple more veils. There is no Level 0 metadata; the mastering display luminances are carried in Level 6 instead. There’s also a Level 4 block printed by dovi_tool which doesn’t appear in that second doc you linked. I don’t think anyone in the FOSS community has figured out what it does, although quietvoid has some names for the parameters.
So far the DV5 content is working for me, although it seems to lack any pop. Mind you, it was Marvel, and their DV output always looks dull and drab to me. Still testing.
Would I see any benefit to upgrading my HDR10 media to DV?
Most likely not - if anything, the HDR10 will look better in this case.
Some Dolby content is over-popped. It’s a deliberate marketing ploy to make DV look much better than HDR10+.
With that said, this is only the start of our endeavours with DV support. We’ve a lot of capability with this device (HDCP2.2; Widevine L1) and we are just getting started.
Fun times ahead for our Vero V users
I have tested more clips, and the DV5 files are definitely darker than equivalent clips on Netflix (also played in DV) on my setup. Everything goes through an AVR, so the TV settings are the same for each source.
We seem to be getting more reports of ‘too dark’ than ‘too bright’ or ‘about right’. Hopefully the devs will be able to wind up the contrast without blowing the highlights. Thanks for the feedback.
Indeed, this is why things are still being tested
I concur, too dark for me, too.
Definitely darker than the same file being played on a Homatics DV-friendly CoreElec platform (and the Homatics looks pretty much the same as the content streamed from source)
It’s useful you have that at hand. I am sure we can make some changes to brightness and compare them.
I can capture the output of various boxes losslessly via HDMI at 2160p59.94 and below - though probably only in 4:2:2 YCbCr format (LLDV uses RGB 4:4:4 as a transport medium, I think). Not sure what metadata is captured, though - if any.
If I get bored I might have a look.
One other thought I had was whether mastering controlled DV content with known metadata would be useful - though it seems the DV mastering licence for Resolve Studio is US$1000.
Just a thought… There are likely to be all kinds of calibration differences between DV and HDR10 at the display end. Given that what we’re actually outputting here is HDR10, would it make more sense to do side-by-side comparisons between a DV stream tone-mapped to HDR10 and the equivalent HDR10 stream, as opposed to comparing tone-mapped DV with actual DV output? If the ultimate source is some kind of streaming service like Netflix, it will likely offer both HDR10 and DV versions of the same video.
You might be right, but I suspect LLDV is actually 4:2:2 and I read somewhere it includes no metadata. That makes sense as the player is supposed to be doing all the signal conditioning, based on what the display tells it in its EDID. The most a player would need to tell a display via metadata would be ‘here is some LLDV - don’t mess about trying to adjust it’. Otherwise, LLDV seems to be the same as HDR10. If it were not, the trick with forcing player-led processing with an HDFury device wouldn’t work with non-DV displays.
It’s ‘standard’ DV used in display-led processing that looks to the outside world like 8-bit RGB. The HDMI signal wraps a 12-bit (or 14-bit?) signal and includes metadata that tells the display how to process it.
I do have a (cheap) HDMI capture device and have measured the difference between the output from this testing kernel and the ‘same’ stream encoded as HDR10. It’s 2 stops darker, so at least we know what the issue is.
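For reference, ‘stops’ here is just log2 of the luminance ratio, so 2 stops darker means roughly a quarter of the light. A tiny sketch - the patch measurements below are made up purely for illustration:

```python
from math import log2

def stops_darker(reference_nits, measured_nits):
    """How many stops darker the measured output is versus the reference."""
    return log2(reference_nits / measured_nits)

# Hypothetical patch readings: HDR10 reference vs tone-mapped DV5 output
print(stops_darker(400, 100))  # 2.0 stops - a 4x luminance difference
```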
An HDFury device may be able to extract DV metadata. I don’t have any players that output DV via HDMI to test that. Do you have a Vertex, maybe?
Yes indeed, and it’s easier to flick between two different devices on the same display than it is to switch between an HDMI input and the internal player on a DV TV.
I have a Blackmagic 4K capture solution and an HD Fury Vertex (v1)
My Apple TV 4K (latest gen) outputs LLDV, I believe - and my Denon AVR reports it as RGB 8-bit output (but the AVR is DV-aware and flags it as Dolby Vision).
My HD Fury reports 4K50 RGB 8b DV as the output format from the Apple TV 4K in DV mode - and the HD Fury OSD is cyan rather than green, suggesting that the output may be ICtCp rather than YCbCr?