25p on a 60 Hz monitor

Mindmatter wrote on 5/22/2015, 2:42 PM
Hi all,

This may seem trivial, but ever since I saw the question in some other forum a while back, I haven't been able to get my head around it, and to be honest, I had never given it a thought...
Why are computer monitors running at 60 Hz, even in 50 Hz grid PAL countries? Shouldn't that mess with our 25p footage in some way, or does it, and I/we have just gotten used to it?
Is there a recalculation of some sort somewhere down the line?
Sorry, never was any good at math...

probably an easy one for the technically inclined people around here...
thanks for helping me clear that one up!

AMD Ryzen 9 5900X, 12x 3.7 GHz
32 GB DDR4-3200 MHz (2x16GB), Dual-Channel
NVIDIA GeForce RTX 3070, 8GB GDDR6, HDMI, DP, studio drivers
ASUS PRIME B550M-K, AMD B550, AM4, mATX
7.1 (8-channel) Surround-Sound, Digital Audio, onboard
Samsung 970 EVO Plus 250GB, NVMe M.2 PCIe x4 SSD
be quiet! System Power 9 700W CM, 80+ Bronze, modular
2x WD red 6TB
2x Samsung 2TB SSD

Comments

balazer wrote on 5/22/2015, 3:11 PM
'Cause, computers? I imagine that video frame rates were never a big enough issue for computer manufacturers to be concerned about them. For a long time, computers didn't have any video playback capability. They just picked one refresh rate and went with it, higher being better. Of course computer and monitor manufacturers were predominantly American and Japanese (both 60 Hz countries, at least in parts), which may have been a factor.

At playback time frames will be repeated as necessary to get from 25 fps to 60 fps. For each second of video, 15 frames are played twice and 10 frames are played 3 times. It amounts to a 2:3:2:3:2 pattern of repetition, though it's actually dynamic and determined automatically by the video subsystem, not the playback software.

For 24 fps content played at 60 fps, it's similar: 12 frames in each second are played 3 times, and 12 are played two times, in a 3:2 repeating pattern. It's called 3:2 pull-down.

When the playback frame rate is much higher than the content frame rate (by a factor of roughly 2.5 or more), our eyes aren't too sensitive to the repetition or its pattern. At smaller ratios, irregularities in the repetition pattern can become noticeable, so the repetition should be kept regular or avoided altogether by blending or interpolating frames. Converting from 50 fps to 60 fps, for example, doesn't look good when it's done by repeating frames.
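
To make those patterns concrete, here's a minimal Python sketch of the idea (just the arithmetic, not what any particular player or GPU driver actually does): at each display refresh it picks the newest available source frame, then tallies how many refreshes each frame stays on screen.

from collections import Counter

def repeat_counts(source_fps, refresh_hz=60, seconds=1):
    # For each refresh, note which source frame is the newest one available.
    counts = Counter()
    for r in range(int(refresh_hz * seconds)):
        t = r / refresh_hz                 # time of this refresh
        counts[int(t * source_fps)] += 1   # latest frame at that instant
    return counts

for fps in (24, 25, 50):
    dist = Counter(repeat_counts(fps).values())   # e.g. {2: 15, 3: 10}
    print(f"{fps} fps on 60 Hz -> refreshes per frame: {dict(dist)}")

For 25 fps it reports 15 frames held for two refreshes and 10 held for three; for 24 fps, 12 and 12 (the 3:2 pull-down); and for 50 fps it shows the lopsided mix of single and double frames that makes straight repetition look poor.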
Mindmatter wrote on 5/22/2015, 3:30 PM
Thanks Balazer!

GeeBax wrote on 5/22/2015, 5:22 PM
'Why are computer monitors running at 60 Hz, even in 50 Hz grid PAL countries?'

The truth is that they are not; computer monitors these days can run at whatever frame rate is sent to them by the video system in the computer.
PeterDuke wrote on 5/22/2015, 7:14 PM
What happens if a web page, say, contains a video? Is the web page displayed at the video frame rate? What would happen if the page had two videos with different frame rates?
PeterDuke wrote on 5/22/2015, 7:18 PM
If I open Display Properties>Settings>Advanced>Monitor, the only screen refresh rate I have is 60 Hz. (Win XP system). I have 50 Hz power.

When I had a CRT monitor, I had a choice of several refresh rates.
balazer wrote on 5/22/2015, 8:45 PM
'The truth is that they are not; computer monitors these days can run at whatever frame rate is sent to them by the video system in the computer.'

That was true in CRT days, but many LCD computer monitors are limited to 60 Hz.
Chienworks wrote on 5/22/2015, 9:09 PM
The monitor refresh rate and the video frame rate are completely independent of each other and ne'er the twain shall meet! What happens is that the computer updates the memory buffer holding the video display at the video's frame rate, and then the monitor shows whatever that buffer contains when it refreshes the screen. This is true no matter what the video's frame rate is or the monitor's refresh rate may be. Neither item cares about or even knows about the other's rate. In fact, two or more different videos can be playing at different frame rates on the same screen, while the monitor refresh rate might not match any of them.

Can this cause some issues? It might. In the case of showing 25p on a 60 Hz monitor, we'll see one frame for two 1/60ths of a second, then another for three, then two, and so on: over each second, 15 frames are held for two refreshes and 10 for three. This means that we see some frames longer than others, and this pattern will be different for each video frame rate and monitor refresh rate combination.
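
A tiny illustration of that model (hypothetical numbers, assuming the simplest "newest frame wins" behaviour), with a 25p and a 30p video playing at the same time on a 60 Hz monitor:

REFRESH_HZ = 60

def visible_frame(video_fps, refresh_index):
    # Which frame of this video is sitting in its buffer at this refresh?
    t = refresh_index / REFRESH_HZ
    return int(t * video_fps)

for r in range(12):   # the first 12 refreshes, i.e. 0.2 seconds
    print(f"refresh {r:2d}: 25p shows frame {visible_frame(25, r)}, "
          f"30p shows frame {visible_frame(30, r)}")

Each video advances at its own rate; the monitor just samples whatever is in the buffers 60 times a second.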

Does it matter? Answer this: were you even aware of this phenomenon before reading this forum thread? Can you see it happening now that you know it is? No? Then it doesn't really matter much at all, does it?
PeterDuke wrote on 5/22/2015, 10:39 PM
If that is the case then perhaps we don't actually see distinct frames or fields, but whatever happens to be in the buffer at the time. For example, with interlaced 59.94 we could see part of field 1, all of field 2, and part of field 3 within one 1/60 second refresh. Since the field rate is slightly different from the refresh rate, the place where field 1 becomes field 3 (and eventually where field 2 becomes field 4 and so on) will roll with time.
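
Putting a number on that roll (a quick sketch, assuming the monitor really runs at exactly 60.000 Hz):

field_hz = 60000 / 1001            # ~59.94 fields per second
refresh_hz = 60.0
beat_hz = refresh_hz - field_hz    # how often the join point laps the screen
print(f"beat frequency: {beat_hz:.4f} Hz")
print(f"the join point drifts through a full screen roughly every {1/beat_hz:.1f} s")

That works out to about 0.06 Hz, so the seam would take roughly 17 seconds to crawl through the picture.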
relaxvideo wrote on 5/23/2015, 1:37 AM
Many LCD monitors with HDMI input are capable of 50 Hz. I've been using that for years. Via DVI, only 60 Hz is possible.
In the old CRT days 50 Hz was very flickery; 60 Hz was the first usable refresh rate. I liked 85 and 100 Hz in those days.

#1 Ryzen 5-1600, 16GB DDR4, Nvidia 1660 Super, M2-SSD, Acer freesync monitor

#2 i7-2600, 32GB, Nvidia 1660Ti, SSD for system, M2-SSD for work, 2x4TB hdd, LG 3D monitor +3DTV +3D projectors

Win10 x64, Vegas22 latest

farss wrote on 5/23/2015, 6:29 AM
Anyone who's really concerned about it would use a real video monitor connected via a video interface, e.g. SDI. That's an expensive road to go down, though.

Bob.
Mindmatter wrote on 5/23/2015, 11:54 AM
Well, as I said, I've never given it any thought at all or noticed anything problematic... the question just started to intrigue me.

wwjd wrote on 5/23/2015, 12:08 PM
thread broke my brain
farss wrote on 5/23/2015, 3:55 PM
In the latest versions of Vegas there's a new setting in the Secondary Preview Monitor preferences: "Wait For Vertical Sync". I suspect that's there to prevent issues that could arise from asynchronous read/write to video RAM, e.g. torn frames. From memory, several people have complained about this over the years.
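
A toy sketch of the tearing idea (purely illustrative, and it says nothing about how Vegas itself is implemented): if the buffer is overwritten while the "monitor" is partway through reading it, the displayed frame ends up half old and half new, which is exactly what waiting for vertical sync avoids.

ROWS = 8

def scanout(buffer, update_row=None, new_frame=None):
    # Read the buffer row by row; optionally overwrite it mid-scan.
    shown = []
    for row in range(ROWS):
        if update_row is not None and row == update_row:
            buffer[:] = new_frame        # asynchronous write during scanout
        shown.append(buffer[row])
    return shown

frame_a = ["A"] * ROWS
frame_b = ["B"] * ROWS

# No vertical sync: the new frame lands mid-scan -> a torn image (AAAAABBB)
print(scanout(frame_a.copy(), update_row=5, new_frame=frame_b))
# Waiting for vertical sync: updates only between scanouts -> clean frames
print(scanout(frame_a.copy()))
print(scanout(frame_b.copy()))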

Bob.

John McCully wrote on 5/25/2015, 12:36 AM
FYI

http://www.prad.de/en/monitore/review/2010/review-dell-u2410-part17.html#Video

I now live in a PAL country and import NTSC cams from B&H as I find the 50p motion artifacts annoying. I don't shoot at 24 or 25p.

Cheers....
farss wrote on 5/25/2015, 7:02 AM
There's a good reason why 25/50/100 fps is used in PAL countries.
I've tried the reverse, shooting 50i in a 60 Hz country, and most of the time it's OK, but then you find a place lit with discharge lamps and it gets ugly.

Bob.
John McCully wrote on 5/27/2015, 4:12 AM
The other side of the coin! It all depends. I shoot outdoors, almost invariably, and my footage (amateur, hobbyish: scenery, wildlife, sailboats and so on) is viewed on friends' and family's computers such as the Dell mentioned in the article, hence my 60p preference. Shooting 50p outdoors results in the herky-jerkies that look like 24p but a hundred times worse.

But yes, indoors with lights is a whole 'nother thing. Choose your weapon accordingly...
balazer wrote on 5/27/2015, 6:10 PM
You can also shoot 30 fps with a 1/50 s shutter to avoid flicker in Europe.
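
The reasoning, as a quick sketch (assuming 50 Hz mains, so lamps pulse at 100 Hz): any shutter time that spans a whole number of flicker cycles integrates the same amount of light in every frame, no matter when the exposure starts.

FLICKER_HZ = 100.0   # light output pulses at twice the 50 Hz mains frequency

for shutter in (1/50, 1/100, 1/60, 1/120):
    cycles = shutter * FLICKER_HZ
    steady = abs(cycles - round(cycles)) < 1e-9
    print(f"1/{round(1/shutter)} s shutter = {cycles:g} flicker cycles -> "
          f"{'constant exposure' if steady else 'brightness varies frame to frame'}")

1/50 and 1/100 land on whole cycles and stay steady; 1/60 or 1/120 under 50 Hz lighting is what produces the pulsing.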
Mindmatter wrote on 5/28/2015, 5:13 AM
Would 30p with a 1/50th shutter then result in a smoother picture, albeit only subliminally smoother than 25p?

balazer wrote on 5/28/2015, 1:15 PM
Yes, 30 fps is smoother than 25 fps. But the difference is not huge. Few people would notice.

Personally, I never shoot 24 fps, always preferring 30 fps or 60 fps.