(Paging @grahamh for this one, as he’s been involved in related stuff…).
This is going to be a difficult one to explain!
There was a lot of work done on VC-1 playback on the Vero 4K in the new release. As people will no doubt recall, there is an issue still outstanding for 1080i/50 VC-1 videos where, if the output display mode is 1080p/50Hz and either Deinterlacing is set to Off, or it is set to Auto-Select and the video is frame-interlaced, you sometimes get stutter on the output.
We’ve established this is not going to be fixed any time soon; fair enough! But there is a work-around which is sometimes useful: set Deinterlacing to Off, and set the output to 1080p/25Hz, and (as far as I know) there’s never any stutter.
Now, if we go back in time a year or so, setting Deinterlace to Off on a 1080i/50 VC-1 video would automatically cause the output to switch to 1080p/25Hz after a few seconds; and if you stopped and restarted, the video would then start playing at 1080p/25Hz each time. This behaviour was altered because of some issues with 1080i/50 h.264 videos: if one of those starts playing at 25Hz you get all kinds of problems. To avoid that, there was a change made so that 1080p/25Hz is not considered to be a valid default output mode for 1080i/50, even if it’s whitelisted.
So, for a 1080i/50 VC-1 video with Deinterlacing set to Off, you have to select 1080p/25Hz manually, and there are several pain points:

- if you stop and restart, you have to re-select 1080p/25Hz each time, because the output mode isn’t stored for the video;
- the mode switch takes a long time (the video freezes for ten seconds or more afterwards);
- and I’ve even seen it spontaneously switch back to 1080p/50Hz occasionally.
What I’m wondering is, could this logic be made codec-sensitive, so that 1080i/50 h.264 (and MPEG2) videos continue to ignore 1080p/25Hz as an option, but for 1080i/50 VC-1 (with Deinterlacing set to Off) it becomes viable again?
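Just to illustrate what I mean, here’s a rough sketch of the kind of check I have in mind. To be clear, this is purely hypothetical: the names (`VideoInfo`, `Allow25HzFor1080i50`, the codec strings) are mine, not anything in the actual Kodi/OSMC mode-selection code, which I haven’t looked at.

```cpp
#include <cassert>
#include <string>

// Hypothetical deinterlace setting, mirroring the GUI options.
enum class Deinterlace { Off, AutoSelect, On };

// Hypothetical description of the playing video.
struct VideoInfo {
  std::string codec;   // e.g. "vc1", "h264", "mpeg2"
  bool interlaced1080i50;  // source is 1080i/50
  Deinterlace deint;
};

// Today (as I understand it), 1080p/25Hz is never treated as a valid
// default output mode for 1080i/50, even when whitelisted. The idea
// here: keep excluding it for the codecs that misbehave at 25Hz
// (h.264, MPEG2), but allow it again for VC-1 with Deinterlacing Off,
// since that combination is the stutter work-around case.
bool Allow25HzFor1080i50(const VideoInfo& v)
{
  if (!v.interlaced1080i50)
    return true;  // progressive sources are unaffected by this rule
  if (v.codec == "h264" || v.codec == "mpeg2")
    return false;  // still not a valid default for these
  return v.codec == "vc1" && v.deint == Deinterlace::Off;
}
```

So h.264 and MPEG2 behaviour would be unchanged, and only the one VC-1 case would regain 1080p/25Hz as an automatic option.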