Any idea why they convert 24p film to the 23.976 framerate?
I imagine that allows studios to scan their movies once in HD, then downconvert them to DVD, instead of having to scan them twice, once at 24p, once at 23.xxxp.
DVDs, apparently, are going to be around as long as BDs cost more. The average person not involved in any kind of visual arts (painting, graphics, photography, and of course film and video) has not trained his vision to discern all the subtleties that are obvious to us. As a result, many people claim, and are quite serious about it, that BD does not look any better than DVD.
Add to that the fact that many consumer TV sets claiming to be HD do not support the full 1080i/p resolution. They can receive the HD signal but downconvert it to a lower resolution before displaying it. And that is another reason the average consumer cannot tell the difference between DVD and BD.
At any rate, as long as DVDs are still popular, the studios probably just save money by scanning to 23.xxx and using the result for both BD at 23.xxx and DVD at 59.xxxi or whatever the DVD standard is.
Though I really do not understand how you can physically scan a 24p film at that 23.xxx rate. I have always assumed they actually scanned it at 24p, marked the result 23.xxx, and resampled the sound accordingly (which would be much easier than resampling the video).
But apparently, they still want to just deal with one frame rate for both (even if they have to apply the pull-down or pull-up or whatever to convert 23.9xx to 59.9xx, but that, too, is easy).
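The speculation above is easy to check with arithmetic. A minimal sketch (the runtime and sample-rate figures are illustrative assumptions, not from the thread): going from true 24.000 fps to 23.976 fps is a 0.1% slowdown by the factor 1000/1001, so the frames are untouched and only the audio needs resampling.

```python
# Sketch: converting true 24.000 fps film to NTSC-friendly 23.976 fps
# is a 0.1% slowdown (factor 1000/1001). The frames themselves are
# reused as-is; only the audio has to be resampled to stay in sync.

FACTOR = 1000 / 1001          # the NTSC slowdown ratio

film_fps = 24.0
video_fps = film_fps * FACTOR
print(round(video_fps, 3))    # 23.976

# An assumed 2-hour feature runs this much longer after the slowdown:
runtime_s = 2 * 60 * 60
extra_s = runtime_s / FACTOR - runtime_s
print(round(extra_s, 1))      # 7.2 (seconds)

# Audio mastered at an assumed 48 kHz must be resampled by the same
# factor so it covers the slightly longer duration:
audio_rate = 48000
resampled_rate = audio_rate * FACTOR
print(round(resampled_rate, 1))   # 47952.0
```

This is why resampling the sound is "much easier" than resampling the video: it is a single uniform rate change, whereas changing the video rate would require synthesizing new frames.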
Many 24p productions, especially those that are made only for TV and video distribution, actually have a frame rate of 24 × 29.97/30 frame/s, or 23.976 frame/s (24/1.001 to be exact). Many use the term "24p" as a shorthand for this frame rate, since "23.976" does not roll off the tongue as easily. This is because the "30 frame/s" frame rate of NTSC is actually 30/1.001 frame/s, also referred to as 29.97 frame/s; this frame rate is matched when video at 23.976 frame/s has a 3:2 pulldown applied. Similarly, 60i is shorthand for 60/1.001, or 59.94, fields per second.
Film productions may be shot at exactly 24.000 frame/s. This can be a source of confusion and technical difficulties if material is treated as normal video, since the slightly differing framerates can be problematic for video and audio sync. However, this is not a problem if the video material is merely treated as a carrier for material which is known by the editing system to be "true" 24frame/s, and audio is recorded separately from moving images, as is normal film practice.
"Any idea why they convert 24p film to the 23.976 framerate? I wouldn't have expected that..."
Film is generally scanned as discrete frames to produce an image sequence, so in one sense the scan itself doesn't have a frame rate. By convention it is played back at 24.000 fps in cinemas, either as a print or digitally.
When broadcast in Region60 the frame rate changes to 23.976 for reasons explained above. When broadcast in Region50 it is played out at 25.000 fps.
If the movie is being shot digitally, it's not unheard of to shoot at 25 fps to avoid flicker problems in Region50 countries from iron-ballasted light sources. Once post has produced a 25 fps master, it's very simple these days to produce a 24.000 fps DCI-compliant master for projection or a print. From the same digital master a 23.976 fps master can also be produced. The only place where things can come unstuck is if someone in post "assumes" the frame rate. I have heard of one local production that started shooting 25p and halfway through decided to switch to 24p, which in reality was 23.976p. Visually it makes no difference, but the potential for major dramas with audio sync is considerable.
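To put a number on those audio-sync dramas, here is a rough sketch (the one-hour runtime is an illustrative assumption): if footage actually shot at 23.976 fps is treated in post as true 24.000 fps, the picture drifts against separately recorded audio by about 0.1%, which adds up quickly.

```python
# Sketch of the sync drift when 23.976 fps material is "assumed"
# to be true 24.000 fps in post.

true_fps = 24000 / 1001       # 23.976... fps as actually shot
assumed_fps = 24.0            # what the editing system believes

runtime_s = 60 * 60           # one hour of material (assumed)
frames = runtime_s * true_fps

# Playing those frames back at the assumed rate shortens the picture,
# while the separately recorded audio keeps its original length:
played_s = frames / assumed_fps
drift_s = runtime_s - played_s
print(round(drift_s, 2))      # 3.6 seconds out of sync per hour
```

A 3.6-second slip per hour is far past lip-sync tolerance, which is why assuming the wrong rate mid-production is such a problem.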