Instead of 30p, try this.

VidMus wrote on 1/21/2015, 9:49 PM
Shoot 60p

Load the video into a project at the default video settings.

Change the project settings from 60p to 60i by changing the field order to Upper field first.

The video will now show as 60i.

Use properties to switch that pesky resample off. I tried it without turning it off and had a lot of black frames in the render. I have GPU=on, so that might be why.

Render using MainConcept with the DVDA widescreen template, with "stretch video to fill output frame size" switched on. Be sure the template's field order is set to upper field first.

Render.
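
For anyone who wants to try the same idea outside of Vegas, here is a rough command-line analogue of the recipe above. A sketch only: it assumes ffmpeg (with its tinterlace filter) instead of the Vegas/MainConcept render, and the filenames are made up.

```python
import subprocess

# Sketch: weave 59.94p frames into 29.97fps upper-field-first (TFF)
# interlaced MPEG-2 for DVD authoring. interleave_top takes the upper
# field from the first frame of each pair and the lower field from the
# next, halving the frame rate while keeping all 60 fields per second.
subprocess.run([
    "ffmpeg", "-y", "-i", "clip_60p.mp4",   # hypothetical input name
    "-vf", "scale=720:480,tinterlace=mode=interleave_top,setfield=tff",
    "-target", "ntsc-dvd", "-aspect", "16:9",
    "dvd_60i.mpg",
], check=True)
```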

I tried this and the panning test I made looks a lot like 60p.

It is sort of like having the benefits of both 60p and 60i.

There are no interlace problems because the original video is 60p.

I got this idea from Progressive segmented frame (PsF) video: using an interlaced stream to deliver progressive video. I figured there had to be some way for FOX to deliver the games in 60i and still have the benefits of 60p.

Might be worth a try...

Comments

john_dennis wrote on 1/21/2015, 10:30 PM
"[I]I figure there had to be some way for FOX to deliver the games in 60i and still have the benefits of 60p.[/I]"

My local Fox affiliate delivers the game at 1280x720-59.94p.
Laurence wrote on 1/21/2015, 11:07 PM
That recipe will look wonderful on a modern TV, but the DVD will suffer from twitter like crazy if you play it back on an old CRT TV. There is a reason that interlaced cameras average even and odd lines, and no, it's not just for the extra stop.
PeterDuke wrote on 1/21/2015, 11:39 PM
"I tried this and the panning test I made looks a lot like 60p."

And so it should. You have either 60 fields per second or 60 frames per second so panning jitter will be the same.

The difference is that 60i will have half the vertical resolution, particularly if you apply vertical blur (to reduce aliasing) before converting the 60p to 60i. If your content doesn't need the vertical blur then you will win, but that would normally be the exception.
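
To illustrate the mechanism, here is a minimal numpy sketch of the 60p-to-60i weave with an optional vertical low-pass before the fields are pulled out. This is my own illustration, not what Vegas or any encoder does internally.

```python
import numpy as np

def weave_60p_to_60i(frames, vertical_blur=True):
    """Weave pairs of progressive frames into interlaced frames.

    frames: array of shape (n, height, width), n even, grayscale.
    """
    kernel = np.array([0.25, 0.5, 0.25])  # gentle 3-tap vertical low-pass
    out = []
    for a, b in zip(frames[0::2], frames[1::2]):
        if vertical_blur:
            # Blurring vertically before discarding alternate lines
            # reduces aliasing (twitter) at the cost of vertical detail.
            a = np.apply_along_axis(np.convolve, 0, a, kernel, 'same')
            b = np.apply_along_axis(np.convolve, 0, b, kernel, 'same')
        woven = np.empty_like(a)
        woven[0::2] = a[0::2]  # upper field from the first frame
        woven[1::2] = b[1::2]  # lower field from the next frame
        out.append(woven)
    return np.stack(out)
```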
PeterDuke wrote on 1/22/2015, 12:01 AM
"There are no interlace problems because the original video is 60p."

If you rendered to 1080-60i then of course there will be an interlace issue.

What did you have your project frame rate set to? 60 or 30?

What did you have the deinterlace method set to? None, Blend or Interpolate?

What template did you render to? All you said was MainConcept, which only tells us who makes the encoder.
NormanPCN wrote on 1/22/2015, 12:40 AM
"My local Fox affiliate delivers the game at 1280x720-59.94p."

Same here. ABC/ESPN/Disney also.

http://en.wikipedia.org/wiki/High-definition_television_in_the_United_States
Laurence wrote on 1/22/2015, 12:54 AM
Rendering 60p as 60i is very definitely going to give you twitter on an old CRT TV. The question is, how many people still have old CRT TVs? On any modern TV it will look a bit sharper than what we're used to coming from interlaced.
Laurence wrote on 1/22/2015, 1:31 AM
You are doing 16:9 interlaced, right? Come to think of it, a DVD player would be dropping every fourth line and deinterlacing the video for 4:3 CRT playback. As I think about it more, the only time you would run into the twitter that conventional wisdom predicts would be if you had a 16:9 CRT or a 4:3 CRT with a properly set up "compress mode". These are rare indeed. I know because I used to have one. Through a 4:3 CRT, the line dropping and subsequent deinterlacing would totally obliterate any twitter. You would be fine except in the unlikely instance somebody had one of those rare 16:9 CRTs.

For fun (and maybe to confirm what I'm saying) try the following: set your DVD player for a widescreen SD TV. What that will give you is 16:9 vertically stretched to 4:3. In this mode there is no line dropping and subsequent deinterlace, so you should see the twitter I've been warning about. Frankly, nobody would ever watch this way in real life, so even though the recipe is theoretically wrong, it should look fine (or likely even better) in all real-world scenarios.

This brings back all sorts of memories. A few years back I paid an absolutely stupid amount of money for one of the first CRT HD TVs. No HDMI. Component only for HD. The silly thing weighed about 600 pounds! Anyway, it had a special compress mode that would do 720x480 widescreen 60i. You could shoot on something like a VX2000 with an anamorphic adapter and get something that was ever so slightly higher resolution than the norm. In real life it was a joke. The anamorphic adapter lens made autofocus impossible and manual focus extremely difficult. The 60i widescreen only looked good on my own TV. All other CRTs would throw away the fourth line and deinterlace, completely losing any advantage over a simple letterbox. It looked good on plasmas, but they were super expensive and very few people had them. The same was the case with the early HDV cameras that shot SD. You could shoot proper 16:9 60i on these cameras, but you would run into the same issues I just described. It was really an exercise in futility.

That whole experience is one of the main reasons I don't blindly trust the experts anymore. I did exactly what the current magazine articles of the time told me to do and it was a joke. Now I use conventional wisdom as a starting point, but always do my own tests before spending money, and if my tests tell me something is better than the conventional wisdom, that's what I do.

In your case, there should be twitter problems theoretically. In actual practice though, an old 4:3 CRT's combination of dropping every fourth line and deinterlacing would get rid of this problem, and modern LCD and LED TVs would convert back to progressive and lose the problem in the process as well. The only time you would have the theoretical twitter issue is on an old 16:9 CRT, and I seriously doubt you will ever run into that. Those are extremely rare.
OldSmoke wrote on 1/22/2015, 6:52 AM
@Vidmus
"Change the project settings from 60p to 60i by changing the field order to Upper field first."

In another thread here I was told that project settings have no influence on the render, with the exception of the de-interlace setting. What version of Vegas are you on? Anyhow, I have been saying for two years that shooting 60p is the way to go, provided your camera actually shoots full HD 60p and is designed to do so; see the "Why shoot 60p" thread.

But even then, if you can live with a lower-resolution HD like the 1280x720 60p that the HXR-NX5U shoots, it is by far better and easier to convert to SD for DVD. I also feel that Vegas does a better job downconverting from 1280x720p than from 1920x1080p, but I am no expert and can't back that up.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
RAM: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

OldSmoke wrote on 1/22/2015, 8:39 AM
@Vidmus

Did you read my post? I have been shooting 60p for two years, and I have been saying all along that you should shoot 60p because it does everything so much better. I started shooting 60p the day I got my HF-G30, which I sold last year to get the AX100.

Edit:
And by the way: two years ago, when I made the comment that 60p is the way to go, I was "shot down" by those who said 30p is superior because 30p at a given bitrate has better quality than 60p.

Chienworks wrote on 1/22/2015, 9:18 AM
"30p at a given bitrate has better quality over 60p"

Well, this is true. 60p has twice as much data to be encoded and compressed as 30p. However, I would assume you would use a correspondingly higher bitrate to make up for it.
OldSmoke wrote on 1/22/2015, 9:29 AM
@Chienworks
Well, the argument came from the fact that the HF-G30, which is essentially the same camera as the XA20 and XA25, shoots 60p and 30p at the same bitrate.
My argument was that one would not need twice the bitrate to get the same quality: the difference between two frames shot at 60p is smaller (less temporal spacing) than at 30p, so the encoder should be able to reuse more pixels from the previous frame. But that doesn't seem to hold up.

Chienworks wrote on 1/22/2015, 9:49 AM
Perhaps twice the bitrate isn't necessary, but it may be close to that. The point of it though is that using the *same* bitrate will definitely result in lower quality.
OldSmoke wrote on 1/22/2015, 9:55 AM
As always, there is a compromise to be made, quality versus smooth motion, in this case.

OldSmoke wrote on 1/22/2015, 2:50 PM
@Vidmus

I didn't say that you told me not to shoot in 60p; read my post again.

And again, I don't need to try it, I have been doing this for 2 years!

johnmeyer wrote on 1/22/2015, 4:18 PM
"My argument was that one would not need twice the bitrate to get the same quality: the difference between two frames shot at 60p is smaller (less temporal spacing) than at 30p."

I've studied this many times, but I've never found an authoritative source which provides anything approaching a formula. OldSmoke is definitely correct: you do NOT need to double the bitrate to achieve the same spatial quality with 60p that you get with the same-resolution 30p, for exactly the reasons he states: with 60p you are encoding events much closer together in time, and since less "information" has changed, fewer bits are required to reach the same level of spatial quality. However, you are encoding twice as many frames, so that does indeed increase the number of bits, just not by the same ratio.

I've tried Google, but it is a tough search question to formulate because "bit rate," "frame rate," and "spatial quality" are used in almost every paper about video encoding.

OldSmoke wrote on 1/22/2015, 4:40 PM
@John Meyer

Is there a way to test this by rendering out of Vegas? Let's say a 4K file to HD with a given bitrate at 30p and 60p? Would that prove anything, or is in-camera processing totally different from what an NLE does?

OSV wrote on 1/22/2015, 8:03 PM
"Oldsmoke is definitely correct: you do NOT need to double the bitrate to achieve the same spatial quality with 60p that you get with the same resolution 30p, for exactly the reasons he states: with 60p you are encoding events much closer together in time, and since less "information" has changed, fewer bits are required to get to the same level of spatial quality. "

Are you referring to shooting 1080p60 and delivering at 720p60? Because there aren't any delivery standards for 1080p60.

Most cameras shooting 1080 lines can't record the full 1080 lines of resolution, so maybe there isn't much resolution loss in that situation when downrezzing to 720 lines for delivery.
OldSmoke wrote on 1/22/2015, 8:25 PM
"Are you referring to shooting 1080p60 and delivering at 720p60?"

No. We are talking about shooting 60p to deliver to 60i or any other format; that is the beauty of 60p. You can deliver it as 30p, 60i or 60p (although 60p only at 720p), and you will have a hard time seeing the difference between 1080 60i and 720 60p.

The worst in my opinion is 30p, because aside from the Internet there isn't much you can do with it, and even the Internet is moving to 60p.

johnmeyer wrote on 1/23/2015, 12:46 AM
I was only referring to encoding 60p vs. encoding 30p, without regard to delivery format or vehicle.

While I don't know how to measure relative spatial quality, constructing the test is pretty simple. Start with a 60p source and then decimate it to 30p (easy to do in Vegas). This gives you identical media at two frame rates. You then encode each version at its native frame rate, using AVC or some other long-GOP codec. Use the same bitrate for both. Take the results of both renders, line them up on a new Vegas timeline, set the project properties to 60p, turn off resample for the 30p event, and then A/B the two. You should be able to see a difference.
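
A scripted stand-in for that test, assuming ffmpeg with libx264 is available (it replaces the Vegas decimate-and-render steps; filenames and bitrate are arbitrary):

```python
import subprocess

BITRATE = "8M"  # same bitrate for both renders, per the test design

def encode(src, dst, fps):
    # fps=30 decimates the 60p source by dropping every other frame,
    # mirroring the decimation step described above.
    subprocess.run([
        "ffmpeg", "-y", "-i", src,
        "-vf", f"fps={fps}",
        "-c:v", "libx264", "-b:v", BITRATE,
        dst,
    ], check=True)

encode("source_60p.mp4", "test_60p.mp4", 60)
encode("source_60p.mp4", "test_30p.mp4", 30)
```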

However, I don't know how to create a scientific way to measure the "goodness" of one encode vs. the other when doing this A/B comparison. I know that some people have used difference or difference squared, but I've never figured out how to use this to actually measure quality.
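
For what it's worth, the difference-squared idea reduces to mean squared error, usually reported as PSNR; here is a minimal numpy sketch (it still doesn't answer the perceptual question):

```python
import numpy as np

def psnr(ref, test):
    """PSNR in dB from the mean squared difference of two 8-bit frames."""
    diff = ref.astype(np.float64) - test.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```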
PeterDuke wrote on 1/23/2015, 1:54 AM
"However, I don't know how create a scientific way to measure the "goodness" of one encode vs. the other when doing this A/B comparison. I know that some people have used difference or difference squared, but I've never figured out how to use this to actually measure quality."

Quality is a subjective concept, so you measure it in a subjective, not objective test. This was my bread and butter job in a research laboratory for many years, only it was on audio, not video. (Since then, mobile phones with their atrocious quality have taken over the world, and now all my work has been for nothing!)

You could have a panel of observers give a direct rating of quality out of, say, 10, or you could use descriptors such as bad, poor, fair, good and excellent and assign the numbers 0-4 (British method) or 1-5 (American method) to arrive at a numerical score. (Things had to be done differently on each side of the Atlantic! PAL versus NTSC, 30-channel versus 24-channel PCM telephone systems, 50 Hz versus 60 Hz electricity, drive on the left or drive on the right, etc.)

Alternatively, you could do paired comparisons, and determine the percentage of observers who preferred one sample over the other. When the two were equally preferred, you could say that they had the same quality. You could do this for a range of coding parameters and video content.

There are other ways too, but you get the idea. The key is that it is the judgement of a panel, and not an individual.
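
To make the scoring concrete, here is a toy sketch with made-up panel numbers, purely for illustration:

```python
import numpy as np

# Hypothetical ratings from eight observers on the 1-5 ("American")
# scale; real tests need many more observers and a range of content.
ratings = np.array([4, 5, 3, 4, 4, 5, 3, 4])
mos = ratings.mean()  # mean opinion score

# Paired comparison: True where an observer preferred encode A over B.
prefers_a = np.array([True, True, False, True, True, False, True, True])
pct_a = 100 * prefers_a.mean()  # near 50% suggests equal quality

print(f"MOS = {mos:.2f}, {pct_a:.0f}% preferred A")
```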

Fellow engineers at work used to shudder at the thought of this sort of work. (Initially we had no experimental psychologists, but they came later and worked in other fields.) Sometimes I asked myself how people could meaningfully carry out such judgements, but when I plotted the score against some controlled variable, I was reassured to see that the score varied in a logical or consistent way.

Yes, subjective tests are labour intensive and time consuming, but if it is the only way...
johnmeyer wrote on 1/23/2015, 9:42 AM
Someone needs to insert this concept into that 4K thread. There are people there making all sorts of claims about how great 4K is, but according to most casual reports, no one can really tell the difference between 4K and good HD when viewed on a display under 55 inches.

It is a great acquisition format, of course; I'm just questioning its value for the average consumer.

By contrast, if you had people watch sports programming in 24p, 30p, 50i, 50p, 60i, and 60p, I think it would be pretty easy to see the differences, although it would be fascinating to see how much difference a panel would discern between the interlaced and progressive formats. Interlaced is a pretty solid "trick" for getting "almost" the full vertical resolution (it is not half, as was stated earlier), and "almost" the temporal effect (during pans, etc.) of full 50 or 60 fps progressive.
OldSmoke wrote on 1/23/2015, 9:52 AM
"There are people there making all sorts of claims about how great 4K is, but according to most casual reports, no one can really tell the difference between 4K and good HD when viewed on a display under 55 inches."

It all depends on the distance you sit from the screen, but I would agree that 4K should be viewed on screens larger than 55"; I personally would say around 70" is a good starting point.

Nevertheless, download this short sample and see for yourself how far you can crop in without losing detail.

wwjd wrote on 1/23/2015, 9:56 AM
Never understood why people don't like the "look" of 60p compared to the older days... 60p is clearer, "better", smoother, more real... I don't know why people wouldn't want MORE REAL in LIVE things they film. Don't moving pictures try to represent the REAL thing that was filmed?
It's just new and people don't like change.
It doesn't have to look like a soap opera, unless it was LIT like a soap opera.
johnmeyer wrote on 1/23/2015, 11:58 AM
"60p is clearer, "better", smoother, more real... I don't know why people wouldn't want MORE REAL in LIVE things they film. Don't moving pictures try to represent the REAL thing that was filmed?"

Most definitely: no. Real <> better.

Let's be clear. If we are talking about theatrical movies, they do NOT attempt to represent anything real. The whole concept is, deliberately, to produce a fantasy.

I'll admit that 24p is mostly the product of the limitations imposed by moving celluloid through a tortured mechanical process without breaking. It was not created in order to help produce the famous "suspension of disbelief." Nonetheless, it has the serendipitous effect of heightening the illusion that what we are watching is not entirely real.

Some have stated that this is simply a function of having become accustomed to this cadence and look, but I'm pretty certain it is much more a function of how human persistence of vision works, at a very primitive level.

I'll also be the first to agree that 60p virtually eliminates all sorts of nasty motion artifacts, and the progressive part of it sure makes certain editing tasks a lot easier.

However, if you watch "The Wizard of Oz" on a TV that interpolates to a high frame rate, it will almost completely lose its charm and mystery: the trip down the yellow brick road will look about the same as home video taken at Disneyland.