10 bit and banding fixed?

HDR or SDR is just an EOTF flag (and associated metadata if required) in the HDMI stream isn’t it? The actual video signals are the same either way aren’t they? (i.e. the only difference between a 2160p60 4:2:2 12-bit HDR and SDR video signal on an HDMI cable is the presence or absence of HDR EOTF metadata?)

I thought 10-bit or 12-bit meant HDR, i.e. you can’t have 12-bit without it being HDR (although it may also be HLG).

Right, I get the metadata part; it’s like having Atmos with no ceiling speakers. I guess the analogy is there’s not enough of something, in this case bandwidth, to display the true content?

I know you can do deep dives on video calibration, but I don’t think I have ever seen anything as garbled as HDR in all my time in AV, and I’m sure some of this has to do with handshaking, EDID and other HDCP nonsense.

No - you can have SDR 10-bit or 12-bit video. The HDMI standard allows for both. The only difference is the presence or absence of the EOTF flags for HDR formats (and metadata if required).

HDMI has supported 10-bit and 12-bit video since long before HDR was ‘a thing’, and 10-bit video has been used in production since the days of standard definition. (10-bit 4:2:2 was supported in HDMI 1.0, I believe, back in the early '00s - long before HDR was considered to be an issue.)

It was often only DVB/ATSC, DVD and Blu-ray in the final path to the consumer that was 8-bit limited.

The reality is that almost all consumer video that is 10-bit is HDR - as HDR arrived with Ultra HD Blu-ray and streaming services that use 10-bit HEVC - but I wouldn’t be surprised if Netflix’s UHD SDR content is still 10-bit. (I’ve not captured any and checked the LSBs.)

The only difference between an HDR video signal at baseband (i.e. HDMI level) and an SDR signal is the HDMI flags that say ‘This video is HDR10 EOTF with these max / average light level settings, in this gamut and with a white point here’. The actual YCbCr stream is sent using identical techniques whether the video is SDR or HDR - that bit of the HDMI transport doesn’t know or care what the display mapping (i.e. the EOTF) is.
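For illustration, here is a rough sketch (not a full HDMI implementation - the field offsets are simplified from the public CTA-861.3 description and the checksum is omitted) of the small ‘Dynamic Range and Mastering’ InfoFrame that carries that HDR signalling. It’s around 30 bytes of auxiliary data riding alongside an otherwise unchanged YCbCr stream:

```python
# Minimal sketch of the CTA-861.3 "Dynamic Range and Mastering" InfoFrame
# that signals HDR over HDMI. Offsets are simplified for illustration.

EOTF_SDR_GAMMA = 0      # traditional gamma, SDR
EOTF_HDR_GAMMA = 1      # traditional gamma, HDR
EOTF_SMPTE_ST2084 = 2   # PQ (HDR10)
EOTF_HLG = 3            # Hybrid Log-Gamma

def drm_infoframe(eotf, max_cll=0, max_fall=0):
    """Build the payload bytes of a static-metadata (Type 1) DRM InfoFrame.
    Primaries, white point and mastering luminance are zeroed for brevity."""
    payload = bytearray(26)
    payload[0] = eotf & 0x07                         # EOTF field
    payload[1] = 0                                   # Static_Metadata_Descriptor_ID = Type 1
    # bytes 2..21: display primaries, white point, min/max mastering luminance
    payload[22:24] = max_cll.to_bytes(2, "little")   # MaxCLL
    payload[24:26] = max_fall.to_bytes(2, "little")  # MaxFALL
    header = bytes([0x87, 0x01, len(payload)])       # type, version, length (checksum omitted)
    return header + bytes(payload)

# ~30 bytes of auxiliary data - the YCbCr pixels on the cable are unchanged.
print(drm_infoframe(EOTF_SMPTE_ST2084, max_cll=1000, max_fall=400).hex())
```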

This is why, when a device plays an HDR HEVC 10-bit file correctly but DOESN’T flag it as HDR (yet doesn’t do any conversion to SDR), you can inject the metadata (or force your display) to display it in HDR and get an identical result to a player that correctly flags HDR.

This is also why HLG is such a winner. If you receive a 10-bit HEVC HLG signal and output it flagged as SDR it looks pretty good; if you flag the same identical video as HLG (or force your display to HLG) then it looks good in HDR.
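As a rough illustration (a sketch, not a calibrated comparison - a proper treatment would also account for the HLG OOTF/system gamma), the lower part of the HLG curve tracks a conventional gamma encode fairly closely, which is why an SDR display that just applies its usual gamma still gives a watchable picture:

```python
# Compare the HLG OETF (ARIB STD-B67 / BT.2100) with a plain power-law
# gamma encode at a few scene-linear levels. Constants are from BT.2100.
import math

A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(e):
    """Scene linear light (0..1) -> HLG signal value (0..1)."""
    return math.sqrt(3 * e) if e <= 1 / 12 else A * math.log(12 * e - B) + C

def sdr_oetf(e, gamma=2.4):
    """Simple power-law encode, roughly what an SDR display assumes."""
    return e ** (1 / gamma)

for e in (0.01, 0.05, 0.1, 0.18, 0.5, 1.0):
    print(f"linear {e:>5}:  HLG {hlg_oetf(e):.3f}   gamma 2.4 {sdr_oetf(e):.3f}")
```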

We’re really only looking at getting support for 4K HDR 4:2:2 in under the wire for 10.2 Gbps chipsets.

None of this stuff should be an issue with an 18 Gbps chipset because, as you rightly point out, all the stuff we watch is 4K HDR 10-bit 4:2:0.

It’s all the forcing and conversions that seem to be the cause of the never-ending dumpster fire. I remember the bit depth of computer monitors way back, but the crux of the problem is replicating the actions of a Blu-ray player for HEVC .mkv files.

Thanks for your usual in-depth clarification. Now I understand your response to @Rock_Danger which was in fact what I thought - calling it HDR doesn’t change the bandwidth required.

Exactly!

Yes - HDR and SDR video themselves are identical as digital signals, in ones-and-zeros terms, for a given bit depth and resolution/frame rate. The only thing that the HDR/SDR flag does is tell a display how to map the video to output pixel intensity/brightness/light levels. This is the EOTF (Electro Optical Transfer Function).

In EOTF terms:
SDR video uses something like BT.1886
HDR PQ (Perceptual Quantizer) standards (HDR10 etc.) use ST.2084
HDR HLG (Hybrid Log Gamma) uses ARIB STD-B67 (which, like PQ ST.2084, is part of BT.2100)
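For a concrete feel for what the PQ flag changes, here is a small sketch of the ST.2084 EOTF from BT.2100, mapping a normalised signal value to absolute luminance - roughly speaking, code words that an SDR gamma would treat as a relative 0–100% scale reach up to 10,000 nits under PQ:

```python
# ST.2084 (PQ) EOTF from BT.2100: normalised signal (0..1) -> luminance in
# cd/m^2 (nits). Constants are the standard's exact rational values.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """PQ code value (normalised 0..1) -> displayed luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# The same code words mean very different light levels under PQ than under
# an SDR gamma - that mapping is all the EOTF flag changes.
for code in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {code:.2f} -> {pq_eotf(code):8.2f} nits")
```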

That’s not exactly right - HDR, DV etc. require more bandwidth, otherwise the Gbps would be irrelevant. The tagging and the message translated down the wire tell the device what it is and what to do with it. But like DTS vs DTS-HD MA or DTS:X, it has a higher bit rate and needs something powerful enough to process and deliver it.

Feel like we’ve jumped the shark here at cross purposes - I just want HDR to work man, y’know?

Don’t mind me - I’m just an amateur pedant. :wink:

I understand that we are going off-topic. HDMI data rate (bandwidth) is determined by the resolution, refresh rate, pixel encoding and color depth. HDR metadata and InfoFrames are auxiliary data. Audio and auxiliary data are transmitted during the data island period that occurs between the video blanking intervals. The presence of HDR metadata by itself doesn’t increase the bandwidth for a particular resolution/refresh rate/pixel encoding/color depth.
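To put numbers on that, here is a back-of-the-envelope calculation (approximate - it glosses over a few HDMI protocol details, such as 4:2:2 being carried in a 12-bit container) using the standard CTA-861 timing for 3840x2160@60, which has a total raster of 4400x2250:

```python
# Rough HDMI (TMDS) data-rate estimate: the rate is set entirely by timing,
# chroma format and bit depth - not by the presence of HDR metadata, which
# rides in the blanking/data-island periods.

def tmds_gbps(h_total, v_total, fps, bits_per_component, chroma):
    pixel_clock = h_total * v_total * fps                 # pixels per second
    # Effective bits per pixel after chroma subsampling (luma + chroma).
    bpp = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma] * bits_per_component
    payload = pixel_clock * bpp                            # video payload bits/s
    return payload * 10 / 8 / 1e9                          # 8b/10b TMDS coding overhead

print(f"2160p60 4:4:4  8-bit: {tmds_gbps(4400, 2250, 60,  8, '4:4:4'):.1f} Gbps")
print(f"2160p60 4:2:0 10-bit: {tmds_gbps(4400, 2250, 60, 10, '4:2:0'):.1f} Gbps")
print(f"2160p60 4:2:2 12-bit: {tmds_gbps(4400, 2250, 60, 12, '4:2:2'):.1f} Gbps")
```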

Yep, same as Atmos.

@sam_nazarko if you’re working on this and need a tester give me a shout.

cheers bigly.

Yes, I am working on fixes for 4:2:2.
I will indeed give you a ping and make testing available when I have made progress.

Sam

Except they don’t. HDR and SDR take identical bandwidth. The HDR10/DV metadata is carried in ‘empty’ space in the HDMI stream - its presence or absence doesn’t change the HDMI bitrate compared to an SDR stream with identical bit depth, frame rate, resolution and chroma subsampling.

Of course, the reality is there isn’t much 10-bit SDR knocking around in the consumer space - but that’s by the by :slight_smile:

Well, with the topic and all the other discussion beforehand, when I say HDR, DV etc., 10-bit or more (and higher bandwidth) is implied - sorry, didn’t realise this was on the test. :pensive:

I understand metadata fine, from many many many Atmos / DTS:X questions from people.

Yep - assumptions and implications can be dangerous. A lot of people think 10-bit support = HDR support for instance. Or 10-bit decode = 10-bit output = HDR…

Not sure how dangerous, I will call everyone’s parents to be on the safe side.

This thread is about 10-bit HDR 4K 4:2:2 HEVC .mkv support. I think it was about banding originally. I tend to abbreviate it as ‘4K HDR’ as the pedantic stuff isn’t important.

or dangerous.

Is HDR auto-detect still being worked on and ready to be wrapped up?

^^^ Works perfectly, for me!