30p or 60i?

Comments

amendegw wrote on 1/24/2010, 11:01 AM
Quoting myself: "I'm confused. The OP wants to render mp4 to Vimeo (30p, I assume). Is he better off shooting in Canon's faux 30p or 60i? And what would one expect the difference to be in the rendered 30p mp4?"

Quoting Andy L: "The difference is big. If you shoot 60i, you have to convert that into a 30p mp4 for YouTube/Vimeo..."

So, I decided to do a test. I don't think I can see a nickel's worth of difference in quality between a 30p clip and a 60i clip. I would have rather used Vimeo than YouTube, but my free account limits the number of HD clips I can upload.

Using my Canon Vixia HG21, I shot a few seconds in 30p mode (FXP, 17 Mbps), used the clip to set the Project Settings, changed the interlacing to "Progressive", and then rendered the result using the built-in Sony AVC/Internet 16:9 HD template. Here's the result:



I did the same test with the camera set to 60i (this time I kept the project settings as "Interlaced") and rendered with the same Sony AVC template. Here's the 60i result:



Can anyone see any differences in quality between these two video clips?

...Jerry

PS: Both videos are rather washed out as it was a pretty gray day.
PPS: Obviously, you'll want to view these guys at fullscreen/720p.
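
(For anyone who wants to try the same 60i-to-30p conversion outside Vegas, here's a minimal command-line sketch. ffmpeg, its yadif deinterlacer, and the 8 Mbps bitrate are assumptions for illustration only; they are not what was used for the test above.)

```python
import subprocess

# Hypothetical equivalent of the Vegas workflow described above:
# deinterlace a 60i AVCHD clip and encode a ~30p MP4 for web upload.
# Requires ffmpeg on the PATH; file names and bitrate are placeholders.
subprocess.run([
    "ffmpeg",
    "-i", "clip_60i.mts",       # 60i source clip from the camera
    "-vf", "yadif=mode=0",      # one deinterlaced frame per input frame (~29.97p)
    "-c:v", "libx264",          # H.264, roughly comparable to the Sony AVC template
    "-b:v", "8M",               # bitrate is a guess, adjust to taste
    "-c:a", "aac",
    "clip_30p.mp4",
], check=True)
```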

System Model:     Alienware M18 R1
System:           Windows 11 Pro
Processor:        13th Gen Intel(R) Core(TM) i9-13980HX, 2200 Mhz, 24 Core(s), 32 Logical Processor(s)

Installed Memory: 64.0 GB
Display Adapter:  NVIDIA GeForce RTX 4090 Laptop GPU (16GB), Nvidia Studio Driver 566.14 Nov 2024
Overclock Off

Display:          1920x1200 240 hertz
Storage (8TB Total):
    OS Drive:       NVMe KIOXIA 4096GB
    Data Drive:     NVMe Samsung SSD 990 PRO 4TB
    Data Drive:     Glyph Blackbox Pro 14TB

Vegas Pro 22 Build 239

Cameras:
Canon R5 Mark II
Canon R3
Sony A9

Andy_L wrote on 1/24/2010, 4:52 PM
Okay, if you're shooting 1080i and converting to 720p (which is what I do for YouTube), Vegas's "Interpolate" deinterlace option isn't really costing you much, if any, vertical resolution, so that ought to look the same as 1080i shown on a computer monitor, which is forced to be deinterlaced anyhow. If I understand what you're doing correctly, you are basically comparing two identical images.
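
(To make the "Interpolate" idea concrete, here is a minimal sketch of interpolate-style deinterlacing, assuming a simple average of the lines above and below each missing line. Vegas's actual algorithm isn't documented in this thread, so treat the details as illustrative.)

```python
import numpy as np

def deinterlace_interpolate(frame, keep_field="top"):
    """Sketch of interpolate-style deinterlacing: keep one field and
    rebuild the discarded lines by averaging their neighbours."""
    out = frame.astype(np.float32).copy()
    start = 1 if keep_field == "top" else 0   # rows to reconstruct
    h = frame.shape[0]
    for y in range(start, h, 2):
        above = out[y - 1] if y > 0 else out[y + 1]
        below = out[y + 1] if y < h - 1 else out[y - 1]
        out[y] = (above + below) / 2.0        # simple linear interpolation
    return out.astype(frame.dtype)

# A 1080i frame carries 540 real lines per field; after interpolating back
# to 1080 lines and downscaling to 720, the loss versus true 1080p is small.
fake_1080i_frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
progressive = deinterlace_interpolate(fake_1080i_frame)
print(progressive.shape)  # (1080, 1920)
```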

On a somewhat related note, many people argue that 720p and 1080i look pretty much identical anyhow.
John_Cline wrote on 1/24/2010, 8:00 PM
"many people argue that 720p and 1080i look pretty much identical anyhow"

And they need to have their eyes examined by a professional.
Rory Cooper wrote on 1/25/2010, 5:40 AM
Well, amendegw asked a good question: can anyone see a difference between the two clips? Other than the 30p looking richer in color... nyet, comrade, not a thing... nothing.
Thanks for the effort as well

You go into a store and buy a loaf of bread cut into 30 slices, which you eat.
The next guy buys the same loaf but cuts each slice in half; he got the same amount of bread, just in 60 slices. It looks like more, but it isn't.

Who got the better deal? The guy who didn't waste his time cutting the slices in half to make it go further.

Gotta go now ..having my eyes tested

John_Cline wrote on 1/25/2010, 12:53 PM
If you're talking about real 720-60p (not 720-30p) vs 1080-60i, they both have a temporal resolution of 60 images per second. I can see the judder in a video with a temporal resolution of 30p; I can't with 60p (or 60i, for that matter).

Of course, if you're making a 30 fps Internet video, it doesn't matter whether the original footage was 30 or 60.
farss wrote on 1/25/2010, 1:21 PM
I have no difficulty telling the difference between 25p and 50i source material on my SD CRT TV and monitors.

If you're talking about HD, then I wonder how many displays show 60i correctly, or 30p for that matter. I suspect this contributes largely to the "I can't see the difference" conclusion. With the trend toward HDTVs that do frame interpolation to give smooth motion, the difference could be impossible to detect.

Bob.
BudWzr wrote on 1/25/2010, 1:24 PM
It's like a "Did a bear crap in the woods" dilemma.

Even if 1080 is slightly better than 720, it's not worth all the extra effort if most people can't tell.
John_Cline wrote on 1/25/2010, 1:45 PM
"Even if 1080 is slightly better than 720, it's not worth all the extra effort if most people can't tell."

Yeah well, some people can tell the difference. The phrase "good enough" is not in my vocabulary and it really tweaks me when I hear someone else say it. Where is the pride in your work? I sweat the details, that's one of the things that separates pros from amateurs. What if Jim Cameron had that attitude? Do you think that Avatar would be on its way to being the top grossing film of all time?

Besides, if it was shot 1080, there is no "extra effort" required to do the project in 1080.
Rory Cooper wrote on 1/25/2010, 10:25 PM
John, that’s the real issue: the shortfall is in 25 frames. Trying to fix that with 50i isn’t a solution; we have to accept 25 frames (or 30) for what it is.

Hopefully camera manufacturers will in the future give us what we need across the board, Pro AND Consumer, instead of following tradition.

The other issue is quality: sure, you should always deliver your best and strive to improve your techniques and abilities. Now, in this regard:

Project size is irrelevant: I’ve got a big Picasso or I’ve got a small Picasso.
The project size will be determined by the limitations of your plasma, DVD, or the internet.

So with full HD footage I will lose some resolution in production anyway, so why not discard it at the start? Why hang on to it as if it’s somehow going to improve the quality of your project? IT’S NOT.
You are going to lose it in the final render... get over it... so say goodbye at the start; this will speed up your workflow.
Sure, keep the raw footage and give it to the client, and if he upgrades he can pay you pro rata to re-render the job.

Most people on the forum say work in full HD... what for? So I can render to PAL? That’s crazy.