Yes, Tom Cruise is right: motion interpolation is evil. But a 25fps video played at 50Hz isn't using motion interpolation; it's simply showing each frame twice.
Given that the great majority of TVs and projectors nowadays use a sample-and-hold approach - each picture is displayed steadily for the duration of the frame, and then the panel switches almost instantaneously to the next picture - there can't be any visible difference between outputting 25fps content at 25Hz and outputting it at 50Hz.
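To make that concrete, here's a toy Python sketch of my own (assuming an idealised sample-and-hold panel with instantaneous transitions; the function name is mine, not any real API). It builds the millisecond-by-millisecond picture on screen for both scan-out schedules and shows they're identical:

```python
def per_ms(frames, hold_ms, repeats):
    """Frame visible during each millisecond of an idealised sample-and-hold scan-out."""
    out = []
    for frame in frames:
        # Each frame is held steadily for hold_ms, repeated `repeats` times back to back.
        out.extend([frame] * hold_ms * repeats)
    return out

at_25hz = per_ms(["A", "B", "C"], 40, 1)  # 25Hz: each frame held once, 40ms
at_50hz = per_ms(["A", "B", "C"], 20, 2)  # 50Hz: each frame held twice, 20ms each
print(at_25hz == at_50hz)  # True: the same picture is on screen at every instant
```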
So, unless the 50Hz signal is one that the TV’s HDMI inputs can’t handle, I don’t see why playing 25fps at 50Hz would be a problem. Can someone enlighten me?
The Hobbit films were shot at 48fps, and TV shows are sometimes shot at 50 or 60 frames per second (or, more precisely, 50 or 60 interlaced fields per second, which are deinterlaced to 50 or 60 frames at playback).
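For the interlaced case, here's a rough sketch of the simplest "bob" deinterlacer (one output frame per input field; the names are my illustration, not any particular player's algorithm), showing why 50i still gives you 50 distinct temporal samples per second:

```python
def bob_deinterlace(fields):
    """One full frame per field, preserving the temporal rate of the capture."""
    return [f"upscale({field})" for field in fields]

fields_per_second = [f"field_{i}" for i in range(50)]  # one second of 50i video
frames = bob_deinterlace(fields_per_second)
print(len(frames))  # 50 - every frame samples a different moment in time
```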
There is a world of difference between something that consists of 48, 50 or 60 unique frames per second and something that consists of 24 or 25 frames per second with each frame shown twice. The latter is what this thread was originally about (before entirely off-topic discussions about motion interpolation and high-frame-rate sources crept in).
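Put differently, here's a hedged sketch (a hypothetical helper of my own naming) of what goes over the cable in each case: both schedules deliver 50 pictures per second, but only the true 50fps source contains 50 unique ones.

```python
def pictures_per_second(source_fps, output_hz):
    """Pictures scanned out in one second, repeating each frame as needed."""
    repeat = output_hz // source_fps  # 2 for 25fps output at 50Hz
    return [f"frame_{i}" for i in range(source_fps) for _ in range(repeat)]

true_50fps    = pictures_per_second(50, 50)  # 50 frames, all unique
doubled_25fps = pictures_per_second(25, 50)  # 25 frames, each sent twice
print(len(true_50fps), len(set(true_50fps)))        # 50 50
print(len(doubled_25fps), len(set(doubled_25fps)))  # 50 25
```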