HDV to SD deinterlace: surprise!

Comments

johnmeyer wrote on 12/9/2008, 8:29 AM
I certainly hope that the scaling of interlaced material (with the same frame rate) is done so that the source fields are scaled separately, and then just used as the source for the final output (with correct field order), instead of FIRST deinterlacing the source material to one full-resolution frame that is then scaled and re-interlaced at the output format. Probably it is done according to the first method, otherwise you would lose some temporal information.

Perfectly said. But of course, because Sony has gone home and has chosen to never again engage with its customers, we'll never know what is really going on, or exactly what to do about it.
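
Neither post shows what field-separate scaling actually looks like in practice. As a point of reference only, here is a minimal AVISynth sketch of the approach being described; the file name, field order, and choice of resizer are assumptions, and nothing here is known about Vegas's internals:

    AVISource("hdv_clip.avi")    # hypothetical 1440x1080i source in an AVI wrapper
    AssumeTFF()                  # HDV is normally top-field-first
    SeparateFields()             # 540-line fields at twice the frame rate
    Spline36Resize(720, 240)     # each field is scaled entirely on its own
    Weave()                      # re-interleave into 480i frames, field order intact

Because every output field is derived from exactly one source field, the temporal information being asked about survives the resize.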
Coursedesign wrote on 12/9/2008, 8:58 AM
Whatever your focussing is (or optical limitation), you have 1080 vertical lines in the image

Strange, I have 1080 horizontal lines. :O)

If there is anyone left who hasn't realized that Vegas is an interlaced-focused NLE, they haven't read this thread.

Note also that the MPEG-2 codec (the only choice for SD DVDs) was designed only for progressive footage.

But MPEG-4 as used in AVC for Blu-Ray truly supports interlaced.

I guess this must mean that the future is interlaced? :O|

(Seriously, I don't think we'll get rid of the frickin' interlaced video until the last TV engineer has been put to his final rest. I would recommend supergluing his casket lid and putting 100+ bolts around the edge to prevent any last minute surprises.)

Christian de Godzinsky wrote on 12/9/2008, 12:45 PM
Couldn't agree more on the detail about the TV-engineers... Should we put the bolts around the coffin horizontally or vertically ??? ;) Thx for correcting my error :)

Interlacing was at the time the only (and clever) way to stuff video signals through tubes and radio transmissions with limited bandwidth - at reasonable frame rates. That's a very loooooong time ago, so it's almost ancient technology. Interlacing - RIP as soon as possible. You will not be missed.

Christian

PS: What do you mean by "...the MPEG-2 codec (the only choice for SD DVDs) was designed only for progressive footage..."?

johnmeyer wrote on 12/9/2008, 1:20 PM
Note also that the MPEG-2 codec (the only choice for SD DVDs) was designed only for progressive footage.

I don't understand that at all. Obviously MPEG-2 supports interlaced, and was designed with that in mind. If you doubt this, remember that MPEG-1 (think VCD) only supported progressive. Interlacing support was one of several main additions when going from MPEG-1 to MPEG-2.

Interlacing was at the time the only (and clever) way to stuff video signals through tubes and radio transmissions with limited bandwidth - at reasonable frame rates. That's a very loooooong time ago, so it's almost ancient technology. Interlacing - RIP as soon as possible. You will not be missed.

That is an accurate reading of the history of how modern TV was invented during the 1940s. However, it overlooks the extremely important fact that less than a decade ago, when the digital HD standards were developed and adopted, lots of engineers made the decision to provide 1080i, an interlaced standard, as one of the main HD delivery formats. Reason? Because despite what some people (incorrectly) believe, interlaced video looks damn good. I don't disagree for a moment with those who point out the problems some editing programs have with it. But, properly shot and edited, interlaced video looks absolutely great.

Now, if you can do 60 frames a second, progressive, at 1920x1080, would that look better? Yup. But that's another 2x bump up in data rate, storage, processor power, etc., and except for multiple cores and CPUs, we are nearing the end of processor performance improvements, so editing something that big is going to be a bitch for some time to come.
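
(The 2x figure checks out: 1080i at 29.97 frames - 59.94 fields - per second carries 1920 x 1080 x 29.97, roughly 62 million pixels per second, while 1080p at 59.94 full frames per second carries 1920 x 1080 x 59.94, roughly 124 million - exactly double, before any compression.)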

farss wrote on 12/9/2008, 1:29 PM
This is a very old topic. It first came up around V5. At least for me that's when I first got involved in it. It was found that during PAL to NTSC conversion better results were obtained by setting a de-interlace method. I don't recall for certain but I think a Sony person might have been involved.

Since then I've done some investigation of how broadcast equipment does this. All scalers and standards converters use the same technique: you have to involve both fields to derive each field. As I said previously, such equipment may use adaptive motion compensation to align the two fields. Temporal resolution is not lost. If you watch any SD broadcast that was shot in interlaced HD, it has been through much the same process.
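
No recipe is posted here, but the standard AVISynth idiom for resizing interlaced material gives the flavor of it. Note one substitution: in this sketch each full-height picture is built by interpolating a single field (Bob), whereas the broadcast gear described above would build it from both fields with motion compensation. File name and field order are assumptions:

    AVISource("hdv_clip.avi")    # hypothetical interlaced HD source
    AssumeTFF()
    Bob()                        # one full-height frame per field, at the field rate
    Spline36Resize(720, 480)     # resize the full-height frames
    AssumeTFF()
    SeparateFields()
    SelectEvery(4, 0, 3)         # keep the fields matching the original parity and timing
    Weave()                      # back to interlaced SD at the original frame rate

Each output field still maps to one moment in time, which is why temporal resolution is not lost.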

If you want to study what is going on, using footage from an HDV camera is confusing: the motion blur hides the process. I used Vegas-generated interlaced HD test footage. If I can find the time I'll do the same tests again and post some images. What you will see happen using Blend is that each field will have what looks like motion blur that was not in the original footage created with a 0 deg shutter angle. Temporal separation between the fields IS maintained.

Bob.

Christian de Godzinsky wrote on 12/9/2008, 2:36 PM
John, you are right on point here; interlace lives on - but only due to bandwidth limitations, or should I say bits-per-second limitations.

That is the ONLY reason we still have interlace. OK, going to 25p would solve the bandwidth problem, but you would then lose temporal information. However, some people who are in favor of 25p would not have any problems with this....

I also fully agree that 50/60 fields per second (interlaced) gives you a very realistic feeling and just looks stunning. Some people want the "film" look, some people want the "realistic" look... Go figure.

It seems that the appetite is always greater than the available food. We always need more bandwidth than is currently available. It is the delivery medium that still dictates the limits.

I have a TV in my living room that can show 1080p! Such cams exist. Just the delivery medium is missing...

Still, I think that interlacing is not a blessing, taking into consideration all the mess it causes in this digital era...

Christian

PS: SCS forum admin, do you monitor us at all? Would be nice to really hear your comments about the original subject....

johnmeyer wrote on 12/9/2008, 5:56 PM
If I can find the time I'll do the same tests again and post some images

Well, I created some footage, admittedly a pathological case. I used AVISynth to create 1440x1080 video with the odd scan lines red and the even scan lines blue. I then ran this through another script which separated the video into fields, which gives me twice the number of frames, each half the height of the original video. As expected, the video alternated between red and blue frames.
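
The scripts themselves aren't posted; a minimal AVISynth sketch along the lines described might look like this (clip length, frame rate, file names, and field parity are all assumptions):

    # pattern generator: weave solid red and solid blue half-height clips into
    # 1440x1080 frames whose alternating scan lines are red and blue
    red  = BlankClip(length=300, width=1440, height=540, fps=29.97, pixel_type="RGB32", color=$FF0000)
    blue = BlankClip(length=300, width=1440, height=540, fps=29.97, pixel_type="RGB32", color=$0000FF)
    Interleave(red, blue)
    AssumeFieldBased().AssumeTFF()
    Weave()

    # field separator, run on each rendered result: twice the frames, half the height
    # AVISource("rendered.avi").AssumeTFF().SeparateFields()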

I then rendered the original, non-separated video (which looks purple from a distance) using various templates, and then put the results back into the script which separates fields. The results were surprising.

The Cineform codec broke down completely and I ended up with green. I have no idea what is going on there. I haven't used this codec for a while, so maybe some update has screwed it up. I know that many users have posted problems with Cineform. Since that's not part of this exercise, I'll look into that later.

I rendered to m2t, and to my surprise, when I put that through the separator, I ended up with perfect red/blue separation.

I then got to the main event, which was to render to the NTSC Widescreen template. I first tried this with deinterlace set to "none." Sure enough, when I separated fields, I ended up, within each field, with a pattern of horizontal red and blue lines, as if the individual field were itself interlaced, except that the horizontal bands were more than one scan line thick. This is exactly what I saw in the stills I posted at the beginning of this thread. So, this does not work at all.
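
That banding appears whenever a vertical resampler treats the woven frame as if it were progressive. A quick AVISynth way to reproduce the symptom outside Vegas (the file name is assumed, and this illustrates the failure mode, not Vegas's actual internals):

    AVISource("redblue_1080i.avi")   # the hypothetical red/blue pattern clip from above
    Spline36Resize(720, 480)         # vertical resampling mixes red and blue source lines
    AssumeTFF()
    SeparateFields()                 # each 240-line field now shows thick red/blue bands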

I then changed the deinterlace to "blend" and rendered again to the NTSC Widescreen template. This time when I separated the fields, I got perfect red then blue then red, etc. fields, with no distortion or any artifacts. Thus, Vegas does a perfect job of spatially separating the fields.

What I did not do in this test was make any attempt to introduce motion. I guess what I should do is repeat the test except this time have a blue ball move diagonally across a red field, with the blue and red, as before, being on alternate scan lines.

If I have time ...
johnmeyer wrote on 12/9/2008, 6:11 PM
I created a test case with a blue ball, created with only even scan lines, moving over a background of red, which was only on the odd scan lines. When rendered to NTSC DV, with deinterlace set to blend, I didn't see any obvious problems.
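
Again the script isn't posted; here is a sketch of how such a pattern could be built in AVISynth, using a rectangular patch instead of a ball for simplicity (sizes, motion path, and field parity are assumptions):

    red   = BlankClip(length=60, width=1440, height=540, fps=29.97, pixel_type="RGB32", color=$FF0000)
    black = BlankClip(length=60, width=1440, height=540, fps=29.97, pixel_type="RGB32", color=$000000)
    patch = BlankClip(length=60, width=64, height=32, fps=29.97, pixel_type="RGB32", color=$0000FF)
    # drift the blue patch diagonally across the black half-height clip
    moving = Animate(0, 59, "Overlay", black, patch, 0, 0, black, patch, 1300, 480)
    Interleave(moving, red)          # blue patch on one field, solid red on the other
    AssumeFieldBased().AssumeTFF()
    Weave()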

Setting to interpolate, however, created an absolute mess, with things turning green and much more.

I don't have any more time to look into this. Since it is a pathological case, it may not reveal much, but certainly I will be using Blend and not Interpolate when going from HDV to SD.
Laurence wrote on 12/9/2008, 8:20 PM
Just to clarify, you are talking about going from 1080i to interlaced SD, right? If you go from interlaced HD to progressive SD, there is a difference between using the "blend fields" and "interpolate" options, but when I go from 1080i to interlaced SD, I can see no difference whatsoever. Both options look the same, but if I uncheck the "select deinterlace method" option I get all sorts of resized interlace artifacts.
Tim L wrote on 12/9/2008, 8:47 PM
I'm so far out of my league in this discussion that I hesitated to post, but maybe I'll learn something:

Is there any advantage to setting the project properties to "59.940 (Double NTSC)"? Does that make Vegas treat the individual fields like individual frames -- maybe avoiding mixing the two?

I'm just wondering if the "double ntsc" or "double pal" settings result in scaling that uses just one field. (Or, if I don't understand correctly, can anybody explain what the "double" frame rates are used for?)

Tim L
Laurence wrote on 12/9/2008, 9:03 PM
The doubled rates are used for high-frame-rate progressive video, which is where we are all headed eventually.
farss wrote on 12/10/2008, 5:05 AM
Finally found something about this process:

http://www.sensoray.com/support/video_scaling.htm

I haven't studied it in depth; however, it does sound very much like what we're seeing Vegas do. At least this article has pictures that help to explain the processing.

Bob.