Well, depends on the display.
The static metadata of HDR (MaxCLL/MaxFALL are just parts of it) MAY be used to control color volume transformation to get closer to the intent of the mastering display.
There is also ST 2086 in the metadata, which defines the three primaries, white point and min/max luminance of the mastering display.
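For reference, a rough sketch of what that static metadata carries, using the raw integer units from the HEVC SEI messages (primaries and white point in 0.00002 steps, mastering luminance in 0.0001 cd/m² steps, MaxCLL/MaxFALL in whole cd/m²). The class and field names are my own, not from any spec:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # ST 2086 mastering display colour volume
    primaries: tuple      # ((Rx, Ry), (Gx, Gy), (Bx, By)), 0.00002 increments
    white_point: tuple    # (Wx, Wy), 0.00002 increments
    max_luminance: int    # 0.0001 cd/m² units (10000000 = 1000 nits)
    min_luminance: int    # 0.0001 cd/m² units (50 = 0.005 nits)
    # content light level info (a separate SEI message)
    max_cll: int          # brightest pixel anywhere in the stream, cd/m²
    max_fall: int         # highest frame-average light level, cd/m²

# Example: a typical P3-primaries, 1000-nit master
meta = HDR10StaticMetadata(
    primaries=((34000, 16000), (13250, 34500), (7500, 3000)),
    white_point=(15635, 16450),   # D65
    max_luminance=10000000,
    min_luminance=50,
    max_cll=1000,
    max_fall=400,
)
print(meta.max_luminance * 0.0001)  # mastering peak in nits -> 1000.0
```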
As displays differ so much in what they can actually display, a lot of “black magic” (read: 3D LUTs) happens on the player and/or display side. Or none on the lower end…
In any case, if your display’s capabilities are below a stream’s capabilities, some color transformation needs to take place. If a display exceeds them, it just doesn’t care. Most manufacturers do their own stream processing based on their own analysis of the picture, as the metadata is too generic anyway. It’s more of a “hey, use preset X” kind of thing.
Consumer HDR displays have varying peak brightness, black levels and color gamut. So it is up to the display or player to map the color volume of the stream to the display’s as well as possible.
If this is not happening, the stream or parts of a frame may be too dark or too bright compared to the mastering display. But usually you would barely notice it unless someone entered some extreme values for a stream.
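As a toy illustration of that mapping (not any manufacturer’s actual curve, those are proprietary): pass luminance through unchanged up to a knee point, then roll off the highlights so the stream’s peak lands on the display’s peak instead of clipping:

```python
def tone_map(nits, stream_peak, display_peak, knee=0.75):
    """Toy luminance tone map: identity below a knee, then a soft
    roll-off that compresses [knee, stream_peak] into [knee, display_peak]."""
    if display_peak >= stream_peak:
        return nits  # display covers the stream: nothing to do
    k = knee * display_peak
    if nits <= k:
        return nits  # shadows and midtones pass through unchanged
    # normalize position above the knee, then apply a simple saturating curve
    t = (nits - k) / (stream_peak - k)
    return k + (display_peak - k) * (t / (1.0 + t) * 2.0)

print(tone_map(4000, 4000, 600))  # stream peak maps onto display peak: 600.0
print(tone_map(300, 4000, 600))   # below the knee: unchanged, 300
```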
Your display will display the colors wrong in any case compared to the mastering display. But that is expected; we’re talking consumer devices here. Same as in the SDR world: some are good, some are worse, but no consumer device was ever capable of displaying a frame like a professional display.
Note that decoding this metadata is optional per the specs for Phase A devices. Similar for rendering. Phase A devices need to be able to process HDR10, either with or without metadata.
As said, most displays don’t care, and manufacturers have their own stuff implemented across their lines depending on the panels they used.
Just don’t get too hung up on it. The metadata only matters if you have a top-shelf display that you calibrated and want to get as close to the mastering display as possible. It’s just icing on the cake.
In reality, all displays do a ton of post-processing, as they are not even close to the HDR10 specs - there is no standard here besides accepting the signal.
Projectors might be more sensitive to this information, as they require adjustments in the peaks due to the nature of projection, and they use that metadata for their own color transformation.
But sure, passing the data through in case a display uses it can never hurt - the specs don’t even require it, and having it is no magic pill. Your display has the same limitations as before. It might kick in some other processing, though, to cover up the weaknesses.