Vero 4K HDMI colour space output

Yep - though to be honest, unless you are at the very high-end, HDR10 and HLG are probably enough to cover every source you will find at a good quality.

If you are a high-end consumer - you probably don’t find the current situation that chaotic.

Yep - I think the issue is that you can’t just offer one HDR->SDR option though - that was the original triggering point for the discussion. Because HDR material is being graded so differently - the HDR->SDR conversion you will want to use is likely to vary title-by-title if you care about image quality.

If I’m honest I couldn’t find a single HDR->SDR setting that really satisfied me on my Sony UHD Blu-ray player.

Personally I wish we’d just gone HLG in the domestic sphere. PQ makes little sense in the varying light levels of modern TV viewing conditions (even less so on phones and tablets for goodness’ sake). Experienced colourists I’ve talked to say there’s very little difference between the same material displayed in HDR10 and HLG on properly lined-up grading monitors. Almost impossible to tell them apart. Except the HLG looks much better on an SDR display, and doesn’t need any conversion …

If I compare it to the SDR Blu-ray era it’s pretty chaotic, to be honest :joy: And yep, if possible, I’d like to have the best… But that’s what I meant: HDR10, HDR10+, DV and HLG (and isn’t there also Technicolor HDR?) support in one screen, plus the whole setup chain, doesn’t seem to be something the industry is aiming for. That’s pretty annoying and frustrating to me as a customer.

Yep, the best possible would be nice. If it’s possible without too much hassle. Which would mean for me: enough profiles to get decent results, and not so many that users are confused by the sheer amount. Don’t know if the results would be satisfying, but Sam said he’d at least like to aim for more pleasing results. :+1:t2:

But everything you’re telling us (as, I understand, a very experienced user/pro) makes it obvious how f****d-up this whole HDR mess is. Why have so many different formats that produce so many problems (conversion to SDR is just one of them, alongside all the firmware update issues and incompatibilities between devices)? And even when everything seems to work, there are so many reports out there of UHD HDR Blu-ray movies just not looking right on a dedicated UHD HDR panel. :joy: Especially when the standards allow for 4,000/10,000 nits brightness that no consumer panel can reach at the moment, so even those standardized HDR movies have to be adjusted to every screen on the fly. This all doesn’t sound like a straightforward approach to me.
The industry seems to have lost track of what it actually wants… That’s my impression (not having full insight into all the details).

Yep - HLG would have been pretty much good enough for most consumers.

PQ is not - IMO - a good approach for consumer viewing. Dolby, as ever, want to sell Dolby devices. At every point in the DV chain you need something with a Dolby logo on it… (It also makes sense for cinema grading. I don’t have a cinema…)

It’s not just HDR->SDR…it’s HDR->HDR that also doesn’t always seem to work right without tweaking either the source or the display.

This is because HDR is a hack (and worse, a hack with multiple standards). Hue, saturation, and lightness, or RGB, are all it takes to describe a color, but instead of picking a color space that includes all the colors that can be displayed on an HDR display, Rec.2020 was picked and then augmented with HDR information to alter the color being displayed.

If a mythical “full HDR” color space had been used instead, then just like converting between any two color spaces, simple transforms from “full HDR” to Rec.2020, Rec.709, etc, could have been defined that would result in accurate colors on any display from a 4K source.

One reason I’m vocal about this is that I spent a lot of money on a top-of-the-line 1080p display, and I don’t want to spend another $3-5K to replace it just 5 years later. Maybe it’s not such a big deal for people who think a $500 55" UHD LED has a great picture.

That I totally agree with!

I’m not sure what your point is - are you arguing that Rec 2020 chose the wrong primaries? The primaries in the Rec 2020 spec give a much wider gamut than those used for Rec 709.

Or are you arguing against the use of non-constant luminance?

What is the issue you have with the Rec 2020 primaries and the RGB->YCbCr matrix - which are the two main things defined in Rec 2020?

Eh? You’re conflating colour space with colour gamut and EOTF now, aren’t you? (I’m guilty of conflating colour space with colour gamut, I admit.)

Rec 709 didn’t really have a defined EOTF - though BT.1886 later defined one. (Until recently the EOTF of TV had been assumed to be roughly that of a CRT, so it wasn’t really that well defined.)

Rec. 2020 was originally an SDR colour gamut too, and had a similar EOTF (I think)

Conversion between the two was non-trivial as their RGB primaries are significantly different, and you have to decide how to convert Rec 2020 colours that are out of gamut in Rec 709 into Rec 709 values. You take a decision and implement it, though. This is the difference between wide-gamut colour and regular colour gamut, you could say - nothing to do with HDR or SDR. It’s a conversion between gamuts, not just spaces. (Colour space conversion is usually about coping with different YCbCr-to-RGB matrices - like Rec 601 vs Rec 709 - with similar primaries.) Rec 2020 and Rec 709 have both different RGB<->YCbCr matrices and different primaries, and thus very different gamuts.

Colour space conversion = easy. Colour gamut conversion = not so easy.
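A rough illustration of the difference (nothing Vero-specific): swapping YCbCr matrices is just fixed arithmetic, but mapping Rec 2020 primaries onto Rec 709 primaries produces out-of-gamut values that something has to decide how to handle. A minimal Python sketch, assuming linear-light RGB input and the approximate 3x3 matrix published in ITU-R BT.2087:

```python
# Sketch only: gamut conversion of linear-light RGB, Rec 2020 -> Rec 709.
# Coefficients are the approximate values from ITU-R BT.2087.
M_2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def rec2020_to_rec709(rgb):
    """Convert one linear-light Rec 2020 RGB triplet to Rec 709."""
    out = [sum(m * c for m, c in zip(row, rgb)) for row in M_2020_TO_709]
    # Colours outside the Rec 709 gamut come out negative or above 1.0.
    # Simply clipping them (as here) is crude; a real converter has to pick
    # a gamut-mapping strategy, which is the "not so easy" part.
    return [min(max(c, 0.0), 1.0) for c in out]

print(rec2020_to_rec709([0.0, 1.0, 0.0]))  # a saturated Rec 2020 green clips hard
```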

However once you start having to cope with EOTFs - which are the differentiator between HDR and SDR - you enter a whole new world of hurt.

There are different approaches to EOTFs - the mapping between video signal levels and emitted light from a screen. This isn’t to do with colour - it’s to do with light level generation, and thus the dynamic range of the light emitted from a screen (i.e. how bright the RGB sub-pixels go). Rec 2020 defines what the ideal colours of these sub-pixels are, the EOTF defines how bright they go for a given input signal.

Rec 2100 defines 2 main EOTFs - PQ (which is based on ST.2084) and HLG. They are two totally different approaches to handling a higher dynamic range signal.

HLG is effectively BT.1886 (i.e. the SDR EOTF, once we had one) for a large part, but then rolls off into a log curve for the highlights. On SDR displays the highlights simply roll off; on HDR displays the curve is unrolled and generates highlight content.
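For reference, this is the shape being described - the HLG OETF from BT.2100 is gamma-like at the bottom and logarithmic at the top. A small Python sketch, with E as normalised linear scene light:

```python
import math

# Sketch of the HLG OETF as specified in ITU-R BT.2100.
# E = normalised linear scene light (0..1), returns the non-linear signal (0..1).
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like region, close to SDR behaviour
    return A * math.log(12 * e - B) + C  # logarithmic roll-off for highlights

# The lower part of the curve is why HLG still looks reasonable on an SDR display.
print(hlg_oetf(0.05), hlg_oetf(1.0))
```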

PQ is ST.2084 and dictates a specific nits light-level output for a specific input value. This value is decided by the colourist when they grade the source material, and is an artistic/creative decision. ST.2084 accurately preserves this artistic decision - but we don’t watch in colourist environments (which is why I feel ST.2084 PQ is a bit flawed for domestic use, though it makes absolute sense for a cinema).
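To make the “absolute” nature concrete, here’s a minimal sketch of the PQ EOTF from ST 2084 / BT.2100: a given signal value always maps to the same number of nits, regardless of the display or the room you’re watching in.

```python
# Sketch of the PQ (ST 2084) EOTF: non-linear signal value -> absolute nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e_prime):
    """Map a normalised PQ signal (0..1) to display luminance in cd/m2 (nits)."""
    p = e_prime ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined up to 10,000 nits

# Same code value -> same nits everywhere, whatever the display can actually reach.
print(round(pq_eotf(0.5), 1))   # ~92 nits
print(round(pq_eotf(1.0), 1))   # 10000 nits
```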

What mythical ‘Full HDR’ colour space do you think we are missing? Do you think we should have gone for a linear-light representation? Avoided a log-based EOTF?

Whatever HDR space you use you have to define a way of downconverting to SDR - and that heavily depends on your EOTF approach doesn’t it?

HLG is relative, PQ is absolute. Relative is a lot easier to cope with… PQ downconversion to SDR needs to know what the colourist was thinking when they graded the pictures…

Well, anyways… :stuck_out_tongue_winking_eye::rofl:

What does @sam_nazarko say about all this? What changes might be possible and more importantly: doable?

And yet, we watched color pictures for 60 years with no need to mess with the “light level” any more than what was encoded into the signal.

Once digital came along, content was coded as YCbCr and that included all you need to know about the “light level” of each pixel. So, what is it about the encoding of pixels in 4K video that makes it impossible to include the actual light level for each pixel? The answer is…nothing, in theory anyway. But, something must be “missing” if out-of-band HDR information is required to make the picture look “right”.

And, this is what’s broken. There should be no “grading” required. The color space/color gamut/brightness/etc. for 4K should have been defined in a way that there are fixed minimums and maximums, and then every pixel would be coded in such a way that says “I’m at 48% red, 32% green, 12% blue, and 29% brightness” (if you can’t see this is a very generalized example, then just don’t bother to respond). Bingo, no need for any out-of-band HDR, no need for PQ, etc. The display would then display the pixel as requested.

This is pretty much exactly how non-HDR, non Rec.2020 digital video works now. Sure, there’s conversion required because the video is encoded via YCbCr instead of RGB, but how to do that is well defined, and can be well-defined for any color space/gamut.
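As an illustration of how well defined that step is for today’s SDR video (nothing HDR-specific here), the BT.709 YCbCr-to-RGB conversion is just a fixed matrix derived from the spec’s luma coefficients. A minimal sketch assuming full-range, normalised values:

```python
# Sketch: BT.709 YCbCr -> R'G'B' for full-range, normalised values
# (Y' in 0..1, Cb/Cr in -0.5..0.5). Studio-range 8/10-bit video adds an
# offset/scale step, but the matrix itself is fixed by the Rec 709 spec.
KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

print(ycbcr_to_rgb(0.5, 0.0, 0.0))  # neutral grey: R = G = B = 0.5
```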

Last, HDR is a gimmick because the source almost never has that much dynamic range…it’s just being added after the fact, and it’s not accurate to the source. It’s no different from such debacles as recoloring The French Connection.

For years, audio and video enthusiasts have tried to have a reproduction that is as close as possible to the original film source, and now those same people are embracing artificial extra brightness. For years, it’s been known that a brighter picture and louder sound are perceived as “better” by people who aren’t interested in accurate reproduction. And, so, now we are all going to suffer with inaccurate pictures because 4K TVs weren’t selling fast enough, and movie studios knew that the triple or quadruple dip wouldn’t fly unless they had something that grabbed people even more than UHD.

Can we focus on the topic at hand again… And that’s not the pros and cons of HDR in general, but color space output from the Vero 4k. :wink:

Let’s wait for Sam to give a comment on some of the suggestions/requests made here since he last replied.

I read, but don’t comment until there are things to test.

Sam

Ok… Waiting patiently then until that happens :slightly_smiling_face::+1:t2:

I hope I haven’t given the wrong impression. We can add some presets but from my perspective there will be limited support of HDR on SDR displays. There’s enough to be getting on with regarding HDR alone.

Maybe we can put this to the community, give them a few command-line options and brief documentation, and gather a general consensus on optimal defaults.

Sam

We didn’t ‘need to mess’, but what sort of picture quality were we getting 60 years ago? Oh, but we did have brightness and contrast knobs, IIRC.

If you want to see a wide colour gamut, wide dynamic range, perfect colour grading and no compression artifacts just look out of the window. Then tell us how to squeeze all that information down a wire and on to a variety of display devices so it looks realistic.

That’s what I also understood from your comments before… Not to expect magic to happen, but that you offered to look into ways to improve conversion, if possible.

Certainly sounds like a very good idea. If people work with this opportunity constructively, it might give a variety of profiles that could cover various screens, for example, and could therefore satisfy the needs of many. And it might save you time, as the testing can be done by others on a greater variety of equipment than you guys probably have at your disposal. :+1:t2:

It’s pretty easy to squeeze it down onto a wire, given enough bandwidth. After that, it’s the job of the display to reproduce it correctly. The video encoding shouldn’t care about anything but a mythical perfect display. As for looking realistic, I’ve got that in my Panasonic plasma.

But, you also have to take into account that the final output just needs to represent the original source, not reproduce it perfectly. There is no need for 10,000 nits from a home display, because you don’t have to have it visible from 3 miles away, and it doesn’t have to give you a suntan when showing a picture of the sun…the sun just has to look like the brightest object on the display.

Yes - and no. To be fair, we had hugely compromised picture quality. I’ve spent 25 years working in TV production, and have spent my life battling blown-out highlights (or odd-looking knees) and lost low-lights, and the results have always been heavy compromises. We’ve increased the resolution of our pictures, and hugely increased the quality of our cameras, but our handling of radiometric resolution and dynamic range has been an issue.

Not really - it was the luminance value of the pixel. Without an EOTF (or an assumed one) it didn’t tell you the light level - you inferred it based on an assumed EOTF, originally based on CRT gamma.
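To illustrate the point: the same code value only becomes a light level once you assume an EOTF and a display peak. A tiny sketch using the conventional CRT-like assumption (BT.1886 uses gamma 2.4); the 100-nit reference peak is an assumption here, not part of the Rec 709 signal:

```python
# Sketch: an SDR code value has no light level of its own; you only get nits
# once you assume an EOTF and a display peak. Gamma 2.4 (as per BT.1886) and
# a 100-nit reference white are the conventional assumptions.
def assumed_display_nits(v, peak_nits=100.0, gamma=2.4):
    return peak_nits * (v ** gamma)

print(assumed_display_nits(0.5))                  # ~19 nits on a 100-nit display
print(assumed_display_nits(0.5, peak_nits=400))   # same code value, ~76 nits
```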

Nothing - apart from bandwidth and the question of whether that is the right thing to do. We certainly aren’t in the situation where we want to sample the original light level and repeat it - that would be pointless.

It’s probably only needed to make HDR pictures look right in SDR, just as a skilled vision engineer (for live) or colourist (for pre-recorded) handles the way current SDR cameras generate pictures. For HDR on HDR displays it just gives the display a bit of guidance, which can help with things like local dimming etc.

HDR means fewer compromises (you don’t have to ‘rack’ cameras as hard between shadow and sunshine on a sports pitch for instance, nor play with knees to keep faces and the sky looking right without one being too dark or the other burned out)

HDR metadata is there to help HDR displays deliver a better end result by telling them what to expect within a given programme (HDR10) or scene/shot (DV + HDR10+). This lets them drive their displays more effectively, knowing where the dynamic range in the scene is. It may also help HDR->SDR conversion.

I think we may be using the word ‘grading’ to mean different things. You can’t not grade pictures from modern cameras like Alexa, Amira etc. - particularly if you are shooting Log - they look horrific. (If you shoot Rec 709 you are throwing away huge amounts of what you use those cameras for in the first place). We’ve graded pictures since colour television began - whether in the telecine suite or in the live control room.

Sure - live video cameras can shoot live, and they don’t go through a grade, but their racks operators (vision operators in the US) are doing a lot of camera control - often shot-by-shot, in SDR to make the pictures look right. If you leave everything on auto - they look awful.

Just go from outside on a cold wintery day to inside in a tungsten lit room with no grading or live vision operation to see why you need grading…

There is always going to need to be a grade between the footage from a movie/drama camera and the finished result. That’s what a colourist does. They take what looks like a flat image - often containing more dynamic range than can be displayed - and map it into the SDR or HDR domain in a pleasing way. They decide on the colour balance, the contrast, the ‘pop’ of a shot - in association with the director and DoP. You can grade in lots of different ways, but you can’t not grade stuff.

Hmm - why are you separating brightness from RGB? Doesn’t that make an impossible system?

What does 24% red, 16% green and 6% blue at 29% brightness look like in comparison to your example?

Is that different to 48% red, 32% green and 12% blue at 14.5% brightness?

I’m trying to understand an RGB-intensity-based system where the brightness is independent of the RGB space. Is the brightness somehow altering saturation? What purpose does it serve that the RGB numbers don’t?

PQ is purely a choice that has been taken - largely driven (I think) by the movie industry - to allow them to guarantee consistency in movie theatres, and this was picked up on as a way of guaranteeing the home cinema experience.

Personally I don’t think PQ is that good a fit for home viewing.

HLG makes a lot more sense - and the EOTF used is a good compromise. Whatever system you use has an EOTF - you just choose one that makes sense.

HLG is scene-referred - just like SDR SD and HD TV - and for me makes a lot more sense for home viewing where your ambient light levels vary wildly.

In some cases I totally agree - and for drama I think other than a few speculars looking bright and shiny the case hasn’t been made. However I’ve worked in TV long enough to know that natural history, and event stuff, particularly concerts, really do benefit from it.


Any update on color space switching for 10bit files? Was hoping it would make it into the February update but alas.

Not yet. It’s still a way off.