[TESTING] Vero 4K / 4K + video improvements

Hi

Yes… I whitelisted the modes and I keep the GUI at 1080p.

When I play 4K 60Hz movies with 4K 60Hz whitelisted, there is no video.
When I disable 4K 60Hz in the whitelist, there is video, but only at 1080p 60Hz.
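For what it's worth, the modes the box thinks the display accepts can be listed (assuming the Vero exposes the usual Amlogic sysfs node) with:

osmc@osmc:~$ cat /sys/class/amhdmitx/amhdmitx0/disp_cap

If 2160p60hz (or its 420 variant) is missing from that list, the whitelist would be requesting a mode the TV's EDID never offered.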

I am also using a PS4 Pro, where I can set 3840x2160p 60Hz YUV420 without any problems, but…

when I try to use 3840x2160p 60Hz RGB, there is no video on the PS4 Pro either.

Maybe this info could help?

Maybe the video features of your receiver will give people some clues:

Video Features
› Ultra HD Pass-through with HDCP 2.2 (4K/60p/4:4:4/24-bit, 4K/24p/4:4:4/36-bit, 4K/60p/4:2:0/36-bit)
› HDR10, HLG and BT.2020 Pass-through Support
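Those three combinations line up with HDMI 2.0's 600 MHz TMDS clock ceiling. A back-of-envelope check (my own arithmetic from the standard CTA-861 pixel clocks, 594 MHz for 4K60 and 297 MHz for 4K24, not anything on the Onkyo sheet):

awk 'BEGIN {
  # TMDS clock = pixel clock x (bit depth / 8) x (0.5 for 4:2:0, 1.0 otherwise)
  printf "4K60 4:4:4 24-bit: %.1f MHz\n", 594 * 8/8  * 1.0   # 594.0 - just fits
  printf "4K60 4:4:4 30-bit: %.1f MHz\n", 594 * 10/8 * 1.0   # 742.5 - over the limit
  printf "4K60 4:2:0 36-bit: %.1f MHz\n", 594 * 12/8 * 0.5   # 445.5 - fits
  printf "4K24 4:4:4 36-bit: %.1f MHz\n", 297 * 12/8 * 1.0   # 445.5 - fits
}'

That could also explain the PS4 Pro behaviour above: 4K60 YUV420 needs only ~297 MHz, so it survives a link (cable, or a TV/AVR input not switched to its enhanced mode) limited to the older 300 MHz class, while 4K60 RGB needs the full 594 MHz.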

The best way to give us the information we need to help you is with debug logs.

Please see How to submit a useful support request - General - OSMC
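If it's easier, the logs can be uploaded straight from the command line with the bundled helper (check grab-logs --help for the exact flags on your build; -A should grab everything and return a URL to share):

osmc@osmc:~$ grab-logs -A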

This issue isn’t fixed for me. I tried every combination of HDR Auto switching on/off and Adjust display refresh rate on/off; I tried these settings with the Sept 12 kernel, then I bypassed my receiver and went directly to my LG C7. I even did a fresh install before all of this. I still had to use the rc.local trick so that Blade Runner 2049 didn’t have really bad banding right at the beginning, in the smoke/clouds. I’ll try to get screenshots and logs, but I just wrestled with this for a couple of hours and need a break right now.
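(For anyone else reading: the rc.local trick I mean is forcing the output attributes at boot, roughly this in /etc/rc.local before the final exit 0, with whatever attr value your display accepts:)

echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr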

Please try the latest kernel from staging (stretch devel).


Hi, thanks for this. Unfortunately I’m still having issues with banding using this and/or the latest devel kernel.

osmc@osmc:~$ uname -a
Linux osmc 3.14.29-122-osmc #1 SMP Wed Oct 24 17:05:04 UTC 2018 aarch64 GNU/Linux

Vero 4k+ is connected to an LG OLED B8 with deep color enabled on the HDMI input.

HDR autoswitching is off and attr is unset:

osmc@osmc:~$ cat /sys/class/amhdmitx/amhdmitx0/attr

osmc@osmc:~$

Adjust display refresh rate is set to Always.

When playing back some files I can still see noticeable banding. Although it’s better than before, it’s still worse than playing the file natively on the TV, which shows almost no banding whatsoever.

An example is at around the 00:01:55 mark in The Revenant, where the fade out of the sky has very noticeable banding.

Here’s the mediainfo for the video from the file:

Video
ID                                       : 1
Format                                   : HEVC
Format/Info                              : High Efficiency Video Coding
Commercial name                          : HDR10
Format profile                           : Main 10@L5.1@High
Codec ID                                 : V_MPEGH/ISO/HEVC
Duration                                 : 2 h 36 min
Bit rate                                 : 36.9 Mb/s
Width                                    : 3 840 pixels
Height                                   : 2 160 pixels
Display aspect ratio                     : 16:9
Frame rate mode                          : Constant
Frame rate                               : 23.976 (24000/1001) FPS
Color space                              : YUV
Chroma subsampling                       : 4:2:0 (Type 2)
Bit depth                                : 10 bits
Bits/(Pixel*Frame)                       : 0.186
Stream size                              : 40.3 GiB (89%)
Default                                  : Yes
Forced                                   : No
Color range                              : Limited
Color primaries                          : BT.2020
Transfer characteristics                 : PQ
Matrix coefficients                      : BT.2020 non-constant
Mastering display color primaries        : Display P3
Mastering display luminance              : min: 0.0000 cd/m2, max: 1000 cd/m2

And here’s the config when it’s playing:

osmc@osmc:~$ cat /sys/class/amhdmitx/amhdmitx0/config
cur_VIC: 93
VIC: 93 3840x2160p24hz
Colour depth: 10-bit
Colourspace: YUV444
Colour range: limited
EOTF: HDR10
YCC colour range: limited
PLL clock: 0xc000029a, Vid clock div 0x000b0000
audio config: on
3D config: off

Let me know if I can provide anything else to assist in understanding the issue.

Thanks for the report. Just to confirm: when you say ‘still’ I assume you mean you were having similar problems with all previous kernels, ie 121 (with 4:2:2 CS), 119, and earlier kernels using attr to force bitdepth.

Can you check with echo 8bitnow | sudo tee /sys/class/amhdmitx/amhdmitx0/attr to see if that’s even worse? If so, we need to look elsewhere than bitdepth. Maybe back to those dither settings :roll_eyes:
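(If the *now suffix works the way it appears to here, applying to the current signal rather than waiting for a mode change, you can flip straight back afterwards for an A/B comparison:)

echo 10bitnow | sudo tee /sys/class/amhdmitx/amhdmitx0/attr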

I only received my 4K+ yesterday, so testing so far has been limited (and it should be noted the entire HDR/colour-space stuff is new to me). I’m happy to do some structured testing, as I’m keen to resolve the issue and at least get a picture equivalent to playing files natively on the TV.

With the stock kernel on the 2018.08-02 release I get more noticeable banding, presumably from 8-bit output, although the output in /sys/class/amhdmitx/amhdmitx0/config doesn’t contain as much detail as with the latest kernel. Trying to force 444,10bit via attr on the stock kernel results in a “no input” message on the TV and no picture, as does HDR autoswitching (with or without the HDMI input on the LG labelled as “PC”).

I’ll try forcing 8bit later tonight to see if that makes it worse. I’ll be able to do more testing at the weekend if there are specific kernel versions/settings to try which would provide more insight into what’s happening. I can also provide some photos of the display under various configurations if that’s helpful.

But 444,10-bit is working with kernel 122? Albeit with ‘banding’. And I’m going to assume by ‘PC’ you mean ‘PQ’ (ie HDR10).

Thanks for testing. It seems kernel 122 is no worse than previous kernels from what you say. And it should be better in many cases.

I think others had an issue with LG OLEDs but I can’t remember exactly what - have a look around the forum.

Yes, based on the config output when playing that file I believe 444,10bit is working on 122 (I get a picture on the TV, in any case :slight_smile: )

“PC” is apparently what you need to set the HDMI input to in order to enable “PC Mode” on the LG, which I’ve read elsewhere on the forum is required to support 444 10-bit output.

Due to the “no input” messages I’ve been getting, I’ve tried pretty much every combination I can think of, both on the TV and on OSMC, and the only one that works without either significant banding or no picture at all is kernel 122 plus PC Mode on the LG (which still shows some banding, but less severe).

Welcome to Wonderland. I found some discussion of LG’s PC mode here:
https://www.reddit.com/r/OLED/comments/8ohae4/is_pc_mode_on_lg_completely_fixed/

Some of what is said there about HDMI standards, bandwidths, etc. isn’t quite right, but it does illustrate what a confused space HDR is, even now.

With attr unset on 122: [screenshot]

With echo 10bitnow | sudo tee /sys/class/amhdmitx/amhdmitx0/attr on 122: [screenshot] [mod edit: should have said 8bitnow]

Is this Leonardo DiCaprio? :thinking:

Sorry for trolling :blush:

You sure that’s 10bitnow, not 8bitnow??

What does cat /sys/class/amhdmitx/amhdmitx0/config say for each of those?

Any chance of a short clip of that vid around that timestamp?

For the record, I can’t get any device I own to display that scene in The Revenant without banding on my 65" C7 OLED. I think it’s at least in some part an OLED problem, since I have multiple devices that can play the full UHD Blu-ray and they all display banding. I have a Vero 4K plugged into my JVC DLA-RS540 projector, and there’s no visible banding during that Revenant scene. It’s a marvel.

And I’m referring to the current August release, with the 10bit flag enabled:

echo '444,10bit' > /sys/class/amhdmitx/amhdmitx0/attr
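(Worth noting for anyone copying that line: the plain redirect only works from a root shell; from the osmc user, the sudo tee form used earlier in the thread does the same job:)

echo '444,10bit' | sudo tee /sys/class/amhdmitx/amhdmitx0/attr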

I tested with 122 as well. I use an Onkyo TX-NR646 AVR and an LG OLED C8.

It works with 444 10-bit HDR and I see no banding; banding only appears if I force 8-bit.
I have not set my LG to PC mode, but 444 10-bit still works. My AVR reports 444, 30-bit.

A 60 fps clip (The World in HDR 4K) works fine as well. My AVR reports that as 420, 30-bit.

Looking at /sys/class/amhdmitx/amhdmitx0/config in both cases, the values match what my AVR reports.
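(A quick way to pull out just the lines worth comparing against the AVR’s info screen, using the same config node shown earlier:)

grep -E 'VIC|Colour|EOTF' /sys/class/amhdmitx/amhdmitx0/config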

I also tried enabling PC mode to see if I could spot a difference. Strangely enough, in PC mode there is visible banding. It is less than when forced to 8-bit, however. It could be that processing on the TV fixes some banding, but I have disabled most processing on the TV, so that’s confusing. I have disabled PC mode on the TV for now, because that gives the best image quality.

For reference, I watched the same clip directly on the TV via the Plex app. That gives the best image quality, even better than the Vero with PC mode off. So it seems the Vero 4K+ introduces some banding.

Does this make sense to anyone?

Gah, no, that’s me trying to multitask unsuccessfully; I meant 8bitnow. Apologies for the confusion.

Have you tried watching it directly on the TV (e.g. via Plex or USB)?

Perfect sense, but the Vero can’t ‘introduce some banding’. It could be that the TV uses a different dithering algorithm on HDMI signals than it does when playing from USB or DLNA. As I mentioned above, we could fiddle some more with the dithering settings on the Vero, but IIRC that created a loooong thread in here last time it was discussed.