Doing fades that way is a clever means of avoiding banding etc. - though I wonder how real-world editing workflows actually generate the video and RPUs.
**EDIT - aah - reading up on Dolby Vision mastering in Resolve Studio - the DV analysis detects fades/dissolves by the look of it**
I don't understand that. Are you saying that DV Profile 5 encodes get a re-grade? That is, by "mastering at 4000 nits" do you mean the grading decisions are taken while monitoring in the 4000-nit domain (i.e. what the person making the creative decisions is watching, and basing those decisions on)?
Or are you saying that they are exported and tone-mapped to 4000 nits from whatever the original mastering grade was monitored at?
Welcome to Wonderland. Profile 5 sits behind a couple more veils. There is no Level 0 metadata, with the master display luminances being moved to Level 6. There's also a Level 4 printed by dovi_tool which doesn't appear in that second doc you linked. I don't think anyone in the FOSS community has figured out what that does, although quietvoid has some names for the parameters.
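For anyone who wants to poke at those metadata levels themselves, the RPU can be pulled out of a Profile 5 bitstream with quietvoid's dovi_tool and inspected per frame. A rough sketch, assuming a reasonably recent dovi_tool build; the filenames are placeholders:

```shell
# Extract the RPU metadata from a Profile 5 HEVC bitstream
dovi_tool extract-rpu -i video.hevc -o RPU.bin

# Print a summary of the extracted RPU (profile, frame count, etc.)
dovi_tool info -i RPU.bin --summary

# Dump the metadata blocks (including that mystery Level 4) for one frame
dovi_tool info -i RPU.bin --frame 0
```

The `--frame` dump is the easiest way to see exactly which levels are present and what values Level 4 carries from frame to frame.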
So far the DV5 content is working for me, although it seems to lack any pop. Mind you, it was Marvel, and their DV output always looks dull and drab to me. Still testing.
Some Dolby content is over-popped. It's a deliberate marketing ploy to make DV look much better than HDR10+.
With that said, this is only the start of our endeavours with DV support. We've a lot of capability with this device (HDCP 2.2; Widevine L1) and we are just getting started.
I have tested more clips and the DV5 files are definitely darker than the equivalent clips on Netflix (also played in DV) on my setup. Everything goes through an AVR, so the TV settings are the same for each source.
We seem to be getting more reports of "too dark" than "too bright" or "about right". Hopefully the devs will be able to wind up the contrast without blowing the highlights. Thanks for the feedback.
Definitely darker than the same file played on a Homatics DV-friendly CoreELEC platform (and the Homatics looks pretty much the same as the content streamed from the source).
I can capture the output of various boxes losslessly via HDMI at 2160p59.94 and below - though probably only in 4:2:2 YCbCr format (LLDV uses 4:4:4 RGB as a transport medium, I think). Not sure what metadata is captured, though - if any.
If I get bored I might have a look.
One other thought I had was whether mastering controlled DV content ourselves, so the metadata is known, would be useful - though it seems the DV mastering licence for Resolve Studio is US$1000.
Just a thought… There are likely to be all kinds of calibration differences between DV and HDR10 at the display end. Given that what we're actually outputting here is HDR10, would it make more sense to do side-by-side comparisons between a DV stream tone-mapped to HDR10 and the equivalent HDR10 stream, as opposed to comparing tone-mapped DV with actual DV output? If the ultimate source is some kind of streaming service like Netflix, it will likely offer both HDR10 and DV versions of the same video.
You might be right, but I suspect LLDV is actually 4:2:2 and I read somewhere it includes no metadata. That makes sense as the player is supposed to be doing all the signal conditioning, based on what the display tells it in its EDID. The most a player would need to tell a display via metadata would be "here is some LLDV - don't mess about trying to adjust it". Otherwise, LLDV seems to be the same as HDR10. If it were not, the trick with forcing player-led processing with an HDFury device wouldn't work with non-DV displays.
It's "standard" DV used in display-led processing that looks to the outside world like 8-bit RGB. The HDMI signal wraps a 12-bit (or 14-bit?) signal and includes metadata that tells the display how to process it.
I do have a (cheap) HDMI capture device and have measured the difference between the output from this testing kernel and the "same" stream encoded as HDR10. It's 2 stops darker, so at least we know what the issue is.
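For scale: "stops" are a log2 luminance ratio, so 2 stops darker means the output is passing roughly a quarter of the light. A quick sketch of the arithmetic (the nit values below are illustrative, not my actual measurements):

```python
import math

def stops_between(lum_a: float, lum_b: float) -> float:
    """Difference in photographic stops between two luminances (cd/m^2)."""
    return math.log2(lum_a / lum_b)

# Hypothetical example: if a patch in the HDR10 encode measures 100 nits
# and the same patch in the tone-mapped DV5 output measures 25 nits,
# that's a ratio of 4x, i.e. 2 stops.
print(stops_between(100.0, 25.0))  # → 2.0
```

Working backwards, fixing a 2-stop deficit would mean roughly quadrupling the output luminance, which is why it can't be corrected with a small gain tweak.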
An HDFury device may be able to extract DV metadata. I don't have any players that output DV via HDMI to test that. Do you have a Vertex, maybe?
Yes indeed, and it's easier to flick between two different devices on the same display than it is to switch between an HDMI input and the internal player on a DV TV.
I have a Blackmagic 4K capture solution and an HD Fury Vertex (v1)
My Apple TV 4K (latest gen) outputs LLDV, I believe - and my Denon AVR reports it as RGB 8-bit output (but it's DV-aware and flags it as Dolby Vision).
My HD Fury reports 4K50 RGB 8b DV as the output format from the Apple TV 4K in DV mode - and the HD Fury OSD is cyan rather than green - suggesting that the output may be ICtCp rather than YCbCr?