10-bit and banding fixed?

For most displays: yes.

This has been covered in more detail in the numerous other threads about banding.


No worries. Tag me in a post when you’re back at it.

Thanks for helping

I tried the hotfix and I can’t see any difference.

According to the manual, my TV (the KS7000 series) supports 10-bit 4:2:0.

I’ve got the 55" KS7000 and the, ‘ebuhuyajej’ hotfix works great for me with a Marantz SR7011 AVR between the KS7000 and the Vero 4k.
Make sure ‘Enable HDR autoswitching’ is unchecked in system settings in Kodi and Check you’ve enabled ‘HDMI UHD Color’ for the correct input in Picture/Expert Settings on the Sammy.
Might be telling you stuff you already know but doesn’t hurt to check.

I forgot to disable the HDR auto-switching :slight_smile:

It seems to be working now :slight_smile: It certainly looks better, and there’s no banding.


Glad to hear this.

@sam_nazarko Hi, I know you’re on a break, but here are a few things I found on the Shield that do and don’t work, and how they loosely tie into the Vero and my processor.

I can set my Shield to 12-bit 4K HDR 24 Hz; this reads on my PJ as 4K HDR with BT.2020 YUV colour.

Since the board tops out at 10.2 Gbps it can’t really be 12-bit; it must be all of the above, except downgraded to 10-bit?

To get all this working you have to set the Shield’s HDMI settings to those parameters at the GUI level, then enable the auto refresh rate in Plex, which you should leave on all the time. That would be fine, except that if you go back to something in SDR, it will output 8-bit RGB regardless of how the GUI is set, which looks very, very colourful and contrasty. So to get YUV 8-bit you have to disable auto refresh in Plex.

Why Plex? It’s really good on the Shield and much snappier than Kodi, but I’m pretty sure it probably won’t be updated, or the API won’t allow it… I dunno, I’m not a programmer.

So if I could get 4K HDR YUV 10-bit 4:2:2 24 Hz (when detected), and then not run into the switch-back problem for SDR content (which I’m guessing is 50 or 60 Hz 8-bit YUV Rec.709), that would be great.

This is all probably well known, but I wanted to give you all the info at my end in case some of it is pertinent to making the Vero a “what you play is what you get” player, which doesn’t seem possible on the Shield without manual adjustment. I’m not even sure the ATV or the new Dune can do this.



Hi @Rock_Danger

I’m still on a break for a few days but will be back working on things again soon.

My line of thinking is indeed that for your specific setup, you’d want 10-bit 4:2:2 @ 24Hz when playing Rec 2020 content, and 8-bit @ 24Hz for everything else (i.e. Rec.709). So the plan is to work on resolving issues with the 4:2:2 output pipeline and then implement this auto-switching into Kodi based on a display’s reported capabilities. Your equipment does successfully report its limitations, so it’s easy to discern what we should be sending to it.
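The switching rule described above could be sketched roughly like this. This is a hypothetical illustration; the function and parameter names are made up and are not the actual Kodi/Vero API.

```python
# Hypothetical sketch of the auto-switch rule: pick the output format
# from the clip's colourimetry and the sink's reported (EDID) support.
# All names here are illustrative, not the real Kodi/Vero interfaces.
def choose_output(colorimetry, fps, sink_supports_10bit_422):
    if colorimetry == "bt2020" and sink_supports_10bit_422:
        # Rec 2020 (HDR) content: 10-bit 4:2:2 at the source frame rate
        return {"depth": 10, "chroma": "4:2:2", "hz": fps}
    # Everything else (e.g. Rec.709 SDR): plain 8-bit output
    return {"depth": 8, "hz": fps}

print(choose_output("bt2020", 23.976, True))
# → {'depth': 10, 'chroma': '4:2:2', 'hz': 23.976}
```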



Hey Sam,

Yep, I think the chroma subsampling is correct, and the 10.2 Gbps boards can’t handle anything above the aforementioned modes.

If you look at the biggest complaint about all these media boxes, it’s usually that they won’t auto-switch resolution and colour space correctly, yet a Blu-ray player will every time. I wonder what makes it so difficult at the developer level; UHD has been standardised for a good while now.

Okay: the reason we don’t switch to 4:2:2 yet is that there is a problem with it. Once that is resolved, we will add support for it.


Look forward to testing it. It’ll be nice to have a media player that just works.

I’ve seen several people who have tested the Shield’s 12-bit 4:2:2 output say that it’s actually 8-bit. That probably makes sense, as the banding looks the same as when it is set to 8-bit mode, and none of my equipment will report the bit depth when it’s in 12-bit mode.

My Shield TV outputs 2160p50 Rec 2020 4:2:2 12-bit according to my HDFury Vertex. I haven’t run test material through it yet to confirm whether this is just 8-bit video with 4 LSBs of padding, or 10-bit with 2 LSBs, etc., but the output format is definitely being reported as 2160p50 4:2:2 12-bit.
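The padding question above can be checked mechanically once you can capture sample values. This is a hypothetical sketch, not a real tool: it guesses the effective source depth by testing how many low bits of a 12-bit container are zero padding. Note that some devices replicate MSBs into the LSBs instead of zero-padding, which this simple check would miss.

```python
# Hypothetical check: given sample values captured from a 12-bit
# container, guess the true source bit depth from zero-padded LSBs.
def effective_bit_depth(samples):
    for pad in (4, 2):  # 8-bit + 4 LSBs of padding, 10-bit + 2 LSBs
        if all(s & ((1 << pad) - 1) == 0 for s in samples):
            return 12 - pad
    return 12  # low bits carry information: genuine 12-bit

# A 10-bit ramp shifted into a 12-bit container (2 LSBs of padding):
ten_bit_ramp = [n << 2 for n in range(1024)]
print(effective_bit_depth(ten_bit_ramp))  # → 10
```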

The same is true if I manually select 2160p23.976 Rec 2020 4:2:2 12-bit: that’s the format that is output.

I’m running through a Denon X2400H AVR into a Sony XF9005 TV.

Well I had a somewhat heated debate elsewhere about this.

4K 12-bit HDR BT.2020 cannot be supported at 10.2 Gbps; I believe it needs 13+ Gbps, indexed here from Murideo.

As @Steve_Neal rightly says, if you alter the GUI it’ll output that; however, I’m not convinced it’s above 8-bit, though it is doing something different than the 1080p version of the movie.

None of this matters with an 18 Gbps chipset, but for those of us (for now) with HDMI Lite, we need that UHD Blu-ray-player-style 4:2:2 10-bit output to squeeze it past the bandwidth limit.


Based on my extensive testing with reference-grade HDMI analyzers, waveform monitors and several other tests, I can say with absolute confidence that the output of the Shield in 12-bit 4:2:2 mode is true 10-bit from 10-bit source content.

The 8-bit color depth display issue with AVRs, projectors, and many HDMI analyzers is due to an entirely different reason. AVRs and other devices report the color depth from the CD bits set in the HDMI General Control Packet. By design, YCbCr 4:2:2 doesn’t use those bits, so the CD bits are left at the default value: 8-bit.
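For reference, the CD field values come from the HDMI 1.4 specification’s General Control Packet; a sketch of the decoding (the function name is illustrative):

```python
# CD (colour depth) field values from the HDMI 1.4 General Control
# Packet. In YCbCr 4:2:2 mode the source sends "not indicated" (0),
# which many sinks then display as 8-bit.
CD_FIELD = {
    0b0000: "Color depth not indicated (often shown as 8-bit)",
    0b0100: "24 bits/pixel (8-bit per component)",
    0b0101: "30 bits/pixel (10-bit per component)",
    0b0110: "36 bits/pixel (12-bit per component)",
    0b0111: "48 bits/pixel (16-bit per component)",
}

def describe_cd(cd_bits):
    # Illustrative helper: values 1-3 are reserved in the spec.
    return CD_FIELD.get(cd_bits, "reserved")

print(describe_cd(0b0000))
```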

You can easily verify whether your media player, display etc. has true 10-bit processing or not with the Quants2D test patterns (https://test.full.band/).


I think it reports up to 12-bit in the GUI program. The HDFury rep has explained the reason for this: it is beyond the scope of their FPGA to analyze the HDMI signal and determine the actual bit depth.

The bandwidth required for 12-bit 4:2:2 is the same as for 8-bit 4:4:4. In other words, the bandwidth for 4K 12-bit 4:2:2 is the same as for 4K 8-bit 4:4:4, and at 24 Hz it is only 8.9 Gbps.
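The arithmetic behind that 8.9 Gbps figure can be sketched as follows, assuming the standard CTA-861 timing for 3840×2160p24 (5500×2250 total pixels) and HDMI’s 10 bits of TMDS coding per 8 bits of data:

```python
# TMDS bandwidth arithmetic for HDMI 1.x (a sketch, not an HDMI tool).
# CTA-861 timing for 3840x2160p24: 5500 x 2250 total pixels per frame.
H_TOTAL, V_TOTAL, FPS = 5500, 2250, 24
pixel_clock_hz = H_TOTAL * V_TOTAL * FPS  # 297 MHz

def tmds_gbps(pixel_clock_hz, bits_per_component, chroma="4:4:4"):
    # HDMI carries 4:2:2 (whether 8/10/12-bit) in the same 24-bit-per-
    # pixel container as 8-bit 4:4:4, so its clock does not scale up.
    factor = 1.0 if chroma == "4:2:2" else bits_per_component / 8
    # 3 TMDS channels x 10 bits each per pixel clock (8b/10b coding)
    return pixel_clock_hz * factor * 30 / 1e9

print(tmds_gbps(pixel_clock_hz, 12, "4:2:2"))  # ~8.91  (fits 10.2 Gbps)
print(tmds_gbps(pixel_clock_hz, 12, "4:4:4"))  # ~13.37 (needs 18 Gbps)
```

This is why 12-bit 4:2:2 at 24 Hz fits comfortably inside a 10.2 Gbps link while 12-bit 4:4:4 does not.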

12-bit with HDR, though? I suspected it would be falling back to 10-bit. I’m not 100% convinced that what I’m looking at is HDR, even though the picture is a bit better.

HDR or SDR is just an EOTF flag (and associated metadata, if required) in the HDMI stream, isn’t it? The actual video signals are the same either way, aren’t they? (i.e. the only difference between a 2160p60 4:2:2 12-bit HDR and SDR video signal on an HDMI cable is the presence or absence of the HDR EOTF metadata?)