General discussion - image sharpness/crispness

rdwhitehill wrote on 5/19/2004, 3:35 PM
Hello everyone,

I'd like to open a general discussion about the "sharpness" and "crispness" of video files viewed on computers (such as via MediaPlayer) versus the same files written out to DVD or DV tape and viewed on a TV/video monitor.

I'm working with standard 720x480 pixel video files as rendered out of SBMS v3.0 using the NTSC DV AVI template.

I do understand screen size/resolution issues, and the differences between computer monitor resolution and TV/video monitor resolution.

But I notice that printed-to-DVD or DV tape NTSC DV AVI files look much, much *sharper* and *crisper* than the very same AVIs viewed on a computer via MediaPlayer. Regardless of how small or large I size a video file with MediaPlayer, it has a smooth, slightly blurry appearance (on the computer).

I also notice jagged edges that horizontally trail along with briskly moving items in my scenes when viewed on a computer, but not on a TV after printing to DVD.
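(Editor's note, not from the original post: those trailing jagged edges are what's usually called "combing," and a toy calculation shows why they appear. The two fields of an interlaced frame are shot 1/60 s apart, so a moving object sits at a different horizontal position in each field; weave the fields together on a progressive computer display and alternate scanlines disagree. All names and numbers here are illustrative.)

```python
# Toy model of combing: a field is a list of the object's x-position on
# each of its scanlines. Field 0 is sampled at t = 0, field 1 at t = 1/60 s.

def field_positions(x_start, speed_px_per_s, lines, field):
    """Horizontal position of a moving edge on each scanline of one field."""
    t = field / 60.0  # field 1 is captured 1/60 s after field 0
    return [x_start + speed_px_per_s * t] * lines

def weave_positions(field0, field1):
    """Interleave the two fields' scanlines, as a progressive display does."""
    woven = []
    for a, b in zip(field0, field1):
        woven.extend([a, b])
    return woven

f0 = field_positions(0.0, 600.0, 3, field=0)  # object at x = 0 in field 0
f1 = field_positions(0.0, 600.0, 3, field=1)  # object at x = 10 in field 1
woven = weave_positions(f0, f1)
# Alternating scanlines are offset by 10 pixels: the "comb" on moving edges.
assert woven == [0.0, 10.0, 0.0, 10.0, 0.0, 10.0]
```

A TV never shows both fields at once, which is why the same footage looks clean after printing to DVD.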

My digital cameras don't use image stabilization, and they are set for their quickest "shutter" speeds to minimize such artifacts at the camera end of things.

Now, you're probably saying that it's my monitor, or my graphics card, or my resolution settings, or something like this. But this appears consistently to be the case on any computer I use whilst working with VideoFactory or MovieStudio. I work in media production at a public school, with lots of computers and lots of students and staff, and I see this on every computer we use, old or new, from one brand to the other.

Just for the record, the fastest machine I'm using is a brand-new Dell XPS machine with 2GB memory on board, XP Pro and a gorgeous 19" LCD monitor. A smokin' PC... so I doubt that it is a dogging old PC that's causing my issues...

I'm not asking for an "answer" here. Rather, I'd like to open up a discussion thread to hear from others about their experiences along these lines, both positive and negative.

I just want to learn from the responses of others.

So... throw any ideas, suggestions, "did you check..." items, whatever, out to me here. I'm just going to absorb all of it and learn from you all.

Please consider taking the time to share your thoughts or experiences about sharpness and crispness and jagged edges with us!

Thanks in advance!
Doug Whitehill

Comments

Steve Grisetti wrote on 5/20/2004, 12:45 PM
I'm no expert, Doug, but I think what your keen eye is seeing is related to the difference between the way your TV produces an image and the way your computer monitor does.

A TV, as I'm sure you know, writes every other line of pixels and then goes back and fills in the remaining lines, 60 times a second, so that 30 times every second you get a complete frame of video.

Your computer, on the other hand, writes however many lines of pixels in one swipe. (Help me, techies. There's a word for this.) So, when you watch DV or DVDs or whatever on your computer, the software has to compensate for this.

That's a lousy explanation, but I think that's probably the issue.

Steve Grisetti wrote on 5/20/2004, 12:49 PM
Aha! I found it! It's called interlacing. TVs do it. Computer monitors do not.
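(Editor's note, not part of the original reply: the mechanics Steve describes, two fields per frame, each holding every other scanline, can be sketched in a few lines. Function names here are my own, purely illustrative.)

```python
# A frame is modeled as a list of scanlines. Interlaced video splits it
# into a top field (even-numbered lines) and a bottom field (odd lines);
# a "weave" puts them back together into one progressive frame.

def split_fields(frame):
    """Split a frame into its two interlaced fields."""
    top = frame[0::2]     # lines 0, 2, 4, ...
    bottom = frame[1::2]  # lines 1, 3, 5, ...
    return top, bottom

def weave(top, bottom):
    """Recombine two fields into one full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = ["line0", "line1", "line2", "line3"]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame  # lossless round trip for a still image
```

The round trip is lossless for a still image; the trouble starts only when the two fields were captured at different moments, as on a camcorder.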

Here's a great discussion on how computers convert interlaced media, such as NTSC-DV:
http://www.hut.fi/Misc/Electronics/circuits/vga2tv/computer_tv.html
rdwhitehill wrote on 5/20/2004, 4:40 PM
Hi,
I'll check out the site you referenced (thanks)... but I wanted to get back to you first...
Interlacing: I'm right with you. Here's something to find out about...?
I wonder what the quality/compatibility difference is between rendering with "lower field first" vs. "upper field first" vs. "progressive (aka deinterlaced)" as far as the Field Order setting is concerned? Has anyone played with these three options in the custom video rendering settings? Does one seem to produce a sharper render than the others?
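(Editor's note, not from the thread: a small sketch of what the "Field Order" setting actually controls. It only decides which of a frame's two fields is displayed first in time; the function names are illustrative, not a Vegas/Movie Studio API. Playing back with the wrong order makes motion judder, since the fields alternate out of sequence.)

```python
# Each frame is a (top_field, bottom_field) pair. Field order determines
# the temporal sequence in which a TV displays the fields.

def play_order(frame_fields, lower_first=True):
    """Return the fields of a clip in display order.

    lower_first=True  -> bottom field of each frame shown first
                         (the DV/NTSC convention the thread settles on)
    lower_first=False -> top ("upper") field shown first
    """
    order = []
    for top, bottom in frame_fields:
        order.extend([bottom, top] if lower_first else [top, bottom])
    return order

frames = [("T0", "B0"), ("T1", "B1")]
assert play_order(frames, lower_first=True) == ["B0", "T0", "B1", "T1"]
assert play_order(frames, lower_first=False) == ["T0", "B0", "T1", "B1"]
```

"Progressive" rendering sidesteps the question entirely by emitting whole frames with no field order at all.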
Anyway, thanks for jumping in the pool with this subject. Anyone else care to swim? <smile>
Doug
rdwhitehill wrote on 7/5/2004, 5:59 PM
Lower field vs. upper field first...
Found an answer to my own question <gg>: Lower-field first is required to be compatible with the NTSC standard.
So if any video is going to be rendered out for use in a device using NTSC video standards (e.g., televisions), then "lower-field first" must be used.
Piece by piece, we all learn together!
Doug W.