For many years, I've felt sorry for those regions that have to deal with what are, to me, inexplicably different frame rates like 29.97, 23.976, 59.94, 30 and 24 fps. How can a frame rate miss by .03, .024 or .06? Going back to analogue TV, I took it that NTSC (especially in the USA) was 30 fps because it related to the power supply being 60 Hz (cps), whereas PAL (in the UK, AU and NZ for example, and in Germany where PAL was developed) was 25 fps because the power supply was 50 Hz (cps).
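(As an aside on the arithmetic, as I understand it the "misses" aren't arbitrary amounts: each odd rate is the round-number rate multiplied by exactly 1000/1001. A quick Python sketch, purely my own illustration, shows the exact fractions:)

from fractions import Fraction

# Each NTSC-family rate is the nominal rate scaled by 1000/1001.
for nominal in (24, 30, 60):
    actual = Fraction(nominal * 1000, 1001)
    print(f"{nominal} fps -> {float(actual):.5f} fps ({actual})")

# Prints:
# 24 fps -> 23.97602 fps (24000/1001)
# 30 fps -> 29.97003 fps (30000/1001)
# 60 fps -> 59.94006 fps (60000/1001)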
Though I haven't investigated this too much, I'd love to know why there are all these different frame rates in what I suppose to be NTSC regions. If anybody has a link to an overview of how this came about, I'd be most appreciative.
In PAL land, it's so much less complicated, with the choice being between 25 and 50 fps.
________________________________________
I posted this as a comment to:
... but I now realise that it probably hijacked the original post, so I've reposted it as a new topic.
So far, the following comments have been posted:
lenard-p wrote on 5/10/2020, 10:08 PM
From memory, PAL has higher video bandwidth than NTSC. When color was introduced, NTSC with its lower bandwidth had to compromise and needed to reduce the fps to add the color info, but PAL did not have to compromise so remained at 25 fps.
and
Dexcon wrote on 5/10/2020, 10:24 PM
Thanks @lenard-p … you've jogged my memory a bit. I vaguely recall from the 70s that there was also a technical difference - vertical sync? or something like that - that made PAL more reliable color-wise than NTSC. Although NTSC is the initialism for the National Television System Committee, it was sometimes ironically referred to in the 70s as "Never Twice the Same Color", as I recall.