Why so many different frame rates with NTSC

Dexcon wrote on 5/10/2020, 7:36 AM

For many years, I've felt sorry for those regions that have to deal with, what are to me, inexplicably different frame rates like 29.97, 23.976, 59.94, 30 and 24 fps. How can a frame rate miss by .03, .06 or .024? Going back to analogue TV, I took it that NTSC (especially in the USA) was 30 fps because it related to the power supply being 60 Hz, whereas PAL (in the UK, AU and NZ, for example - and Germany, where PAL was developed) was 25 fps because the power supply was 50 Hz.

Though I haven't investigated this too much, I'd love to know why there are all these different frame rates in what I suppose to be NTSC regions. If anybody has a link to an overview of how this came about, I'd be most appreciative.

In PAL land, it's so much less complicated, with the choice being between 25 and 50 fps.

________________________________________

I posted this as a comment to:

https://www.vegascreativesoftware.info/us/forum/vp17-can-you-trust-the-match-media-video-settings--120614/#ca751688

... but I now realise that it probably hijacked the original post, so I've now posted it as a new topic.

So far, the following comments have been posted:

lenard-p wrote on 5/10/2020, 10:08 PM

From memory, PAL has higher video bandwidth than NTSC. When color was introduced, NTSC with its lower bandwidth had to compromise and nudge its frame rate to fit in the color info, but PAL did not have to compromise, so it remained at 25 fps.

and

Dexcon wrote on 5/10/2020, 10:24 PM

Thanks @lenard-p … you've jogged my memory a bit. I vaguely recall from the 70s that there was also a technical difference - vertical sync? or something like that - that made PAL more reliable color-wise than NTSC. Although NTSC is the initialism for the National Television System Committee, it was sometimes ironically referred to in the 70s as "Never Twice the Same Color", as I recall.

 

Cameras: Sony FDR-AX100E; GoPro Hero 11 Black Creator Edition

Installed: Vegas Pro 15, 16, 17, 18, 19, 20, 21 & 22, HitFilm Pro 2021.3, DaVinci Resolve Studio 19.0.3, BCC 2025, Mocha Pro 2025.0, NBFX TotalFX 7, Neat NR, DVD Architect 6.0, MAGIX Travel Maps, Sound Forge Pro 16, SpectraLayers Pro 11, iZotope RX11 Advanced and many other iZ plugins, Vegasaur 4.0

Windows 11

Dell Alienware Aurora 11:

10th Gen Intel i9 10900KF - 10 cores (20 threads) - 3.7 to 5.3 GHz

NVIDIA GeForce RTX 2080 SUPER 8GB GDDR6 - liquid cooled

64GB RAM - Dual Channel HyperX FURY DDR4 XMP at 3200MHz

C drive: 2TB Samsung 990 PCIe 4.0 NVMe M.2 PCIe SSD

D: drive: 4TB Samsung 870 SATA SSD (used for media for editing current projects)

E: drive: 2TB Samsung 870 SATA SSD

F: drive: 6TB WD 7200 rpm Black HDD 3.5"

Dell Ultrasharp 32" 4K Color Calibrated Monitor

 

LAPTOP:

Dell Inspiron 5310 EVO 13.3"

i5-11320H CPU

C Drive: 1TB Corsair Gen4 NVMe M.2 2230 SSD (upgraded from the original 500 GB SSD)

Monitor is 2560 x 1600 @ 60 Hz

Comments

Former user wrote on 5/10/2020, 7:45 AM

When TV was B&W, NTSC was 30 fps. When color was introduced, it had to be slowed down very slightly to 29.97 fps to allow for the additional color information. 23.976 fps and 24 fps are attempts to replicate the look of film. Feature films were actually transferred to TV at 29.97 fps, which is where the term PULLDOWN originated: frames had to be duplicated to get from 24 fps to 29.97 fps. Feature films were normally shot at 24 fps in order to use the least amount of film while still being watchable (minimum flicker).

When High Definition was introduced, in order to differentiate it from SD, they called it 59.94, referring to fields per second instead of frames (up to that point each frame consisted of two interlaced fields) - but it was still 29.97 frames per second. Now, with progressive scanning, 59.94 can also refer to frames per second where each frame is whole, not two fields. So you have 59.94i (29.97 fps) and 59.94p. Clear as mud, right?

With digital TVs able to run at pretty much any scan rate, frame rate has become less critical. Originally, TVs could only run at the specific frame rate for their region, so everything had to be converted to 29.97 fps for NTSC sets.
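A quick way to see where those odd numbers come from: every "broken" NTSC rate is just the corresponding whole rate multiplied by exactly 1000/1001. A minimal Python sketch (the constant and function names are mine, not anything standardised):

```python
from fractions import Fraction

# Every fractional NTSC rate is an exact nominal rate scaled by 1000/1001.
NTSC_FACTOR = Fraction(1000, 1001)

def ntsc_rate(nominal):
    """Exact NTSC rate for a nominal frame rate (e.g. 30 -> 30000/1001)."""
    return nominal * NTSC_FACTOR

for nominal in (24, 30, 60):
    exact = ntsc_rate(nominal)
    print(f"{nominal} fps nominal -> {exact} = {float(exact):.6f} fps")
```

This is why 23.976, 29.97 and 59.94 all "miss" by the same proportion: the ratio is identical, only the base rate differs.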

The joke was always that NTSC was an acronym for "Never The Same Color". I guess it is also never the same frame rate.

 

j-v wrote on 5/10/2020, 7:54 AM

My follow-up question: why does an international firm like MAGIX still start all Vegas default templates with those silly NTSC frame rates, and even make the latest VMS 17 sample project with unusual dimensions and 30 fps?

Kind regards,
Marten

Camera : Pan X900, GoPro Hero7 Hero Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop :MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7Ghz,16 DDR4 GB RAM, Gef. GTX 1660 Ti with driver
566.14 Studiodriver and Intel HD graphics 630 with driver 31.0.101.2130
Laptop  :Asus ROG Str G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studiodriver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10,12 to 17.
TV      :LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

 

Dexcon wrote on 5/10/2020, 8:00 AM

Thank you @Former user … before posting, I tried some calculations and could not get to 29.97 from any comparison of 24 fps (film) with NTSC's 30 fps. I understand interlaced and progressive, but I just cannot get my head around a missing fraction of a frame like .03. You got it right with:

Clear as mud, right


Former user wrote on 5/10/2020, 8:06 AM

29.97 is the rate that resulted from squeezing the color information into the existing black-and-white signal. TV was originally drawn as scan lines, not pixels: a field was drawn from top to bottom, the beam switched off during the vertical blanking interval (the gap between the last line of one field and the first line of the next), and then the next field was drawn. Two fields created a frame - the first field drew the even-numbered lines, the second the odd-numbered lines. The color information itself rode on a subcarrier within each line, with a reference "color burst" tucked into the blanking portion of the line. There was in fact specific math behind the change: the rate was nudged from 30 down to 30 × 1000/1001 ≈ 29.97 so the new color subcarrier wouldn't visibly beat against the existing sound carrier. The persistence of the phosphor (what I think of as persistence of vision) allowed the image to be retained on the screen until the next field was drawn, so flickering was minimal. I have a hard time watching PAL TV because I can see the flicker.
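For anyone who wants the arithmetic behind the 29.97 figure, it falls out of two constants that were already fixed in the black-and-white standard: the 4.5 MHz sound-carrier offset and the 525-line raster. A sketch of the derivation (the 286 divisor and the 455/2 subcarrier ratio are the historical choices made for color NTSC; variable names are mine):

```python
from fractions import Fraction

# Fixed by the existing black-and-white standard:
SOUND_CARRIER_HZ = 4_500_000   # audio carrier offset from the video carrier
LINES_PER_FRAME = 525

# For color, the line rate was redefined as the 286th subharmonic of the
# sound carrier, so the chroma subcarrier (455/2 times the line rate) and
# the audio carrier would not beat against each other visibly.
line_rate = Fraction(SOUND_CARRIER_HZ, 286)       # ~15,734.27 lines/s
frame_rate = line_rate / LINES_PER_FRAME          # ~29.97 fps
chroma_subcarrier = line_rate * Fraction(455, 2)  # ~3.579545 MHz

print(float(frame_rate))                          # 29.97002997...
print(frame_rate == 30 * Fraction(1000, 1001))    # True: the famous ratio
```

So the 1000/1001 factor wasn't arbitrary speed-tweaking; it's the exact consequence of keeping the sound carrier at an integer multiple of the new line rate.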

Dexcon wrote on 5/10/2020, 8:11 AM

@j-v ... +1. I am in PAL land, and if I start a new 4K PAL project in Vegas Pro, I have to create a custom setting because Vegas Pro does not treat PAL 4K at 25 fps as a common setting - it assumes that 4K is 29.97 fps. Totally weird - and it's done this from VP14 through VP17.


SWS wrote on 5/10/2020, 8:14 AM

Dot's got it, and here's a graphic example of the wacky world of 3:2 pulldown...

https://www.extron.com/article/32pulldown
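The cadence that the linked article illustrates can be sketched in a few lines: every 4 film frames are spread across 10 interlaced fields (5 video frames), which is how 24 fps film fills a nominal 30 fps broadcast. A toy Python version (function name is mine):

```python
# 3:2 pulldown sketch: film frames A B C D become 10 fields, repeating.
def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 2 if i % 2 == 0 else 3   # the repeating 2,3,2,3 cadence
        fields.extend([frame] * copies)
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] - 10 fields from 4 frames
```

The 4:5 frame ratio is exactly 24:30; the remaining 30 → 29.97 slowdown is applied on top by running everything 0.1% slow.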

BOXX/APEXX S4
Motherboard: ASRock TAICHI
Intel Z690 Chipset Cores:16
CPU: Intel Core i9 12900KS Enhanced Performance Processor
GPU: NVIDIA GeForce RTX 3090
RAM: 64GB DDR5-4800 MHz (2 x 32GB DIMMs)
Disks: 2.0TB SSD NVMe/PCIe 3.0/4.0 M.2 Drive
SSD: (4) 4TB
O/S: Microsoft Windows 10 Professional 64-bit SP1

Former user wrote on 5/10/2020, 8:24 AM

23.976 fps and 24 fps in TV are aberrations, created to appease the people who thought film had to be shot at 24 fps. I am not sure why it had to be slowed to 23.976, but I am sure it is related to the same reason we have 29.97 instead of 30. Now let's get into Drop Frame vs. Non-Drop timecode! :)
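Since drop-frame timecode came up: the idea is that 29.97 fps material counted at a nominal 30 fps drifts about 3.6 seconds per hour from wall-clock time, so SMPTE drop-frame skips the frame *labels* ;00 and ;01 at the start of every minute except minutes divisible by 10 (no actual frames are dropped). A sketch under those standard rules (function name is mine):

```python
# SMPTE drop-frame timecode sketch for 29.97 fps material.
def frames_to_dropframe(frame_number):
    frames_per_10min = 17_982      # 10 minutes of 29.97 fps video
    frames_per_min = 1_798         # a "short" minute: 30*60 - 2
    tens, rem = divmod(frame_number, frames_per_10min)
    dropped = 18 * tens            # 2 labels dropped in 9 of every 10 minutes
    if rem > 1:
        dropped += 2 * ((rem - 2) // frames_per_min)
    frame_number += dropped        # shift the count past the skipped labels
    ff = frame_number % 30
    ss = (frame_number // 30) % 60
    mm = (frame_number // 1_800) % 60
    hh = frame_number // 108_000
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"  # ';' marks drop-frame

print(frames_to_dropframe(0))        # 00:00:00;00
print(frames_to_dropframe(1800))     # 00:01:00;02  (;00 and ;01 skipped)
print(frames_to_dropframe(17_982))   # 00:10:00;00  (minute 10: no drop)
```

Non-drop timecode simply counts 0..29 forever, which is why it drifts from the clock on 29.97 fps material.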

Former user wrote on 5/10/2020, 8:25 AM

and remember your home movies, shot on film (8mm or super 8mm) were between 12fps and 18fps. This really screws up transferring to video.

Former user wrote on 5/10/2020, 8:27 AM

In the modern digital age, there really isn't a PAL vs. NTSC. It is basically just different frame rates.

Dexcon wrote on 5/10/2020, 9:15 AM

@Former user ... lots of great info you've given. Thanks.

Re 8mm/Super 8mm, I never had those when I was young (though 16mm was the dream). Looking at docos on the History Channel or the like, the tech that can make 8mm or hand-cranked roughly-16-fps footage from the very late 19th or early 20th centuries look fluid is truly remarkable. I guess they don't use Vegas Pro, Resolve, Premiere etc. to do that.

Thanks for clarifying that it was 'vertical blanking' - my memory was close with 'vertical sync' but not close enough. And pointing out the difference between rasters (analogue) and pixels (digital) was a good reminder.

When digital TV first came in some 2 decades ago, I erroneously thought that the NTSC/PAL differences would be a thing of the past (as some TV tech-heads said at the time). But sadly it seems to have got more complicated in the NTSC world.

Re 3:2 pull-down - I'll probably never get my head to work this out mainly because I don't have to deal with it with PAL.

@SWS .. I'll have a closer look at the link tomorrow - many thanks for the link.

Last changed by Dexcon on 5/10/2020, 9:24 AM, changed a total of 1 times.


adis-a3097 wrote on 5/10/2020, 9:52 AM

Framerates:

A playlist on analog TV:

https://www.youtube.com/playlist?list=PLv0jwu7G_DFUGEfwEl0uWduXGcRbT7Ran

Cheers! :)

rraud wrote on 5/10/2020, 9:56 AM

A lot of the frame rates originally had to do with the AC power frequency (lighting and CRT monitors), i.e., NTSC 60 Hz, PAL 50 Hz. That said, video is not my primary area of expertise.

Howard-Vigorita wrote on 5/10/2020, 2:49 PM

As mentioned above, the look and feel of different frame rates plays a big part in public perception. The slower 24/23.976 fps rate is strongly associated with big wide-screen cinema since it was used there for so long. The 30/29.97 NTSC rate used by the largest early TV market got its distinctively different look and feel imprinted on the public consciousness too. The first major departure in the US was the soaps, which started shooting at double the usual frame rates and rapidly developed their own look. I've had some of my nicest, sharpest footage rejected because it "looks too much like a soap." Reality TV with its jerky, high-contrast look has developed a similar association. As has what I call the "reality ghost show" look, which is probably the worst of all worlds.

I've always thought of PAL as mister inbetween... ha, ha, feel a sermon coming on... and the topic will be...