720 X 540 --> 720 X 480?

studioman3000 wrote on 5/13/2003, 6:09 PM
Why is it I have to fly my 720 X 480 projects into a 720 X 540 platform only to re-render it into a 720 X 480 (stretch to fill output frame) so it doesn't stretch on DVD? I know I'm losing resolution and information. Is there a way around it? I'd like to keep my resolution at a specific area instead of squishing and stretching.

Of course, all of this is just due to lack of experience and knowledge. Bear with me, please...

Comments

Former user wrote on 5/13/2003, 9:56 PM
What do you mean by "fly my 720 x 480 projects into a 720 x540 platform"?

What is the origin of the project? What is the 720 x 540 platform?

Dave T2
studioman3000 wrote on 5/13/2003, 11:05 PM
I do (did) all my projects in a 720 X 480 format (NTSC DV). When I made the DVD, it stretched the output vertically. I had noticed this happening in full screen modes of some media player programs but thought nothing of it.

After searching the web endlessly, I finally ran across a page that suggested importing a 720 X 480 into a 720 X 540 (in other words, just change the project properties) and then re-rendering it into a 720 X 480 format and click the "stretch to fill output frame" which, in essence, squishes it into the 480 only to be stretched again back to a normal aspect ratio.

But there's gotta be mega-resolution loss happening there. At least, in theory. I'm still having trouble getting my DVDs to play properly so it all remains to be evaluated completely. I'm just trying to solve all of these issues. All related to a certain lack of experience, no doubt.
pb wrote on 5/14/2003, 1:35 AM
Are you trying to get the letterboxed widescreen effect? We make DVDs daily (boring corporate and industrial training and safety material), edit, then render as standard 720x480 (thus generating the letterbox) before routing the signal to the Canopus box for MPEG2 encoding. Why 16:9? My preference, and after over two decades of shooting I reckon I have the right to some artistic license. Looks nicer on the projectors as well.

Peter
Former user wrote on 5/14/2003, 8:11 AM
480 is normal for TV. It does not get stretched to fill the screen. If anything, the 720 gets squished, but not really, because it is related to square vs. non-square pixels and such.

The only time you should need to use 720 x 540 is when creating graphics in software such as Photoshop. This is a WYSIWYG kind of thing that allows you to create round circles and square squares in your final product. If you create your graphic as 720 x 480, when the 720 gets aspected, the circle is not round.

I don't make DVD's yet, but I think someone else might chime in with why you are getting stretched pictures. It might have to do with how your DVD player is setup, as opposed to the file properties.

Dave T2
mikkie wrote on 5/14/2003, 10:03 AM
FWIW, the 720 x 540 dimension, to my knowledge, is what the average, commercial, full screen (4:3) DVD video gets displayed at (or close anyway), although the original (non-anamorphic) size is still 720 x 480. Could it be your problem resides somewhere in that end of things, with whatever DVD authoring prog., templates, etc.? Need more info there so someone can provide more specific help.

Trivia: TV picture tubes used to vary widely, more than today even, and that originated the overscan/safe area stuff as far as I know & have read. The signal to a TV, rather than being a more perfect 4:3 640 x 480, is larger in both dimensions to allow for how much of the picture tube is actually unmasked by the TV cabinet. The width is more critical, where you'll more likely see problems, so there's less focus on the 480 dimension. Still, specs are specs, so with NTSC at least this stuff hangs around.

studioman3000 wrote on 5/14/2003, 2:27 PM
I wish it were just the player or the TV, but it proved constant from one unit to the next. It's especially noticeable in those DVD media player programs (WinDVD, etc.). I import the files into DVDA as 720 X 480, but it will inevitably get stretched (or squished, whichever you prefer). No one else has this problem? I mean, when you do it how I've been doing it to get it right, it's not immediately noticeable on a TV, but it also depends on the quality of the decoder to hide that loss. Either way, I'm more interested in creating something and losing as little information as possible. That's my goal. If I shoot at 720 X 480, I don't want to have to change the screen area at all. I want every pixel intact from beginning to end.

When I record audio, I don't record stuff at 48kHz, resample to 44.1kHz only to resample it back to 48k when I encode it to AC3. It just doesn't make sense.

And I happen to like the wide-screen look, too. Very sleek. But, I'll work with any of it.

Thanks, y'all, for being as responsive as you are. Most video forums suck.
studioman3000 wrote on 5/14/2003, 2:34 PM
Right. I need to take one of the ones I burned in that format and double check - I may have jumped the gun - but I don't think so. I'm relatively certain that I did that. No harm in double checking. I just wish my piece of junk over-priced 5-disc DVD player would read burned discs. What a crock. I think a new one will be arriving before too long, here...
seibu1 wrote on 5/14/2003, 7:02 PM
This is obviously some sort of square vs. rectangular pixel problem.

DV is 720x480, but D1 uncompressed video is 720x486. When you're using a program like Photoshop you create your comp at 720x540 (for DV) for WYSIWYG, but then resize it to 720x480 when you ship it off to your rectangular-pixel app. For straight video projects you should never have to change it, only when creating graphics.

It could be that WinDVD is displaying square pixels for some reason.

Seibu
studioman3000 wrote on 5/14/2003, 9:19 PM
Apparently I have much research to do. Yes, I will soon be searching the internet or Radio Shack ($59!) for a DVD-R compatible player.

...by the way, I plan on winning the contest. :p
rmack350 wrote on 5/15/2003, 12:24 AM
This really sounds like a pixel aspect ratio issue. You absolutely should not have to jump through these hoops. Either your player is doing something funny to the file or you are making a misstep somewhere along the way.

Your project settings should be ntsc DV, 0.9091 pixel aspect ratio. If you render in vegas you should set the mpeg2 template to NTSC DVD. Should work.

Your player may screw things up. The mpeg file won't have square pixels and your player might possibly display the file wrong. Anything could be going on there.

*********

Pixel aspect ratio is a confusing thing. So is the frame aspect ratio and I'll start there.

The first thing you should be aware of is that neither 720x540 nor 640x480 are proper representations of a DV video frame. Sorry.

You'll have to do some hunting around for the supporting math, but here's the deal. A video signal is an analog signal at varying voltages. The picture is made up of signal lines (rasters) which must diminish to 0 volts at the end of each line in order for the beam to be positioned at the beginning of the next line. There the voltage is brought back up to signal level. Analog equipment doesn't do this voltage change instantly, so there is inevitably an area at the left and right edges of frame where the raster quickly fades up or down. You can imagine that the first TVs were pretty slow at this and needed a lot of extra space on the raster to come up to voltage.

So, for any video signal there is an area at the center that is expected to show on a TV screen and there is additional space on either side that is not expected to be seen.

The area that is expected to be seen is the area that has the 4:3 aspect ratio we talk about.

(BTW, If you capture video from your VHS deck through your DV camera, deck, or converter, you will see the black at the right and left edges of the signal.)

When this analog signal is digitized it is done so at a rate that yields 720 samples across the video area. 704 of those make up the area that is expected to be seen in that 4:3 aspect ratio. The rest of the pixels are at either side of the image. Depending on the adjustment of the TV you might see some of these on one side or the other.

Now, as it turns out, this sampling rate shows as a wide-stretched image on a computer screen. To simulate an image with a correct aspect ratio you need to squeeze it to 654.5 px wide (you could also stretch the height to 528 px).

We say that a TV has non-square pixels and a computer screen has square pixels. This isn't really true, but it's a model that is easy to get your head around. A TV has a continuous analog signal. Think of it more like this: DV records a signal at a specific sample rate, a computer displays it at a slightly slower sample rate, and so the image appears wide.

When we render a video file we have to tag it with a pixel aspect ratio in order for the player to interpret it correctly. If we render a 720x480 movie to MPEG2 and say that it has square pixels when it shouldn't, it's likely that the player would stretch the image in one direction.

So why do some NLEs insist that stills be corrected from 720x480 to 720x540 (instead of 720x528)? Part of it, I think, is best practices. The other part is bad math.

The logic of starting with a 720x528 still and then squeezing it to 720x480 is that you are just throwing away some pixels. If you went from 654x480 to 720x480 you'd be creating pixels; essentially they'd be averages of other pixels. In principle, it's better to throw away pixels. In practice it probably doesn't matter much. Vegas opts to create pixels to bring stills up to size.

The reason most apps say that 720x540 is the right size is that, for some reason, they started by assuming that the whole DV frame should be a 4:3 image when corrected. They started with a result and then did the math to get the result they wanted. It's creation science in video. Each NLE vendor seems to have copied the mistakes of its neighbors.

In a way, it doesn't matter as long as you are consistent. If your NLE says to make the still at 720x540 then that's what you do and it'll work just fine. It WILL matter in Vegas because the SoFo programmers were a bit more rigorous in their math. A 720x540 still will not fill the frame in Vegas. A 720x528 still would work fine. Vegas prefers a default of 654x480, though.

BTW, if you scale down a 720x540 image to 704 px wide, the height is 528. Not a coincidence.
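Rob's arithmetic can be checked in a few lines. A minimal sketch, assuming the NTSC DV pixel aspect ratio is exactly 10/11 (~0.9091); the exact value is a point of debate in this very thread, so treat this as illustrative:

```python
# Sketch of the pixel-aspect arithmetic described above.
# Assumes the NTSC DV pixel aspect ratio is exactly 10/11 (~0.9091).
from fractions import Fraction

PAR = Fraction(10, 11)          # assumed NTSC DV pixel aspect ratio

full_w, full_h = 720, 480       # full DV frame
vis_w = 704                     # samples inside the 4:3 "visible" window

# Visible area in square pixels: 704 * 10/11 = 640, and 640x480 is 4:3
assert vis_w * PAR == 640
assert Fraction(640, full_h) == Fraction(4, 3)

# Full frame in square pixels: 720 * 10/11 = 654.54..., the 654.5 figure
print(float(full_w * PAR))

# Or keep 720 wide and stretch the height instead: 480 * 11/10 = 528
assert full_h / PAR == 528

# Scaling a 720x540 still down to 704 wide gives a height of 528,
# the "not a coincidence" noted above: 540 * 704/720 == 528
assert 540 * Fraction(vis_w, full_w) == 528
```

With a different assumed PAR (some sources use 0.9), the 654.5 and 528 figures shift slightly, which is part of why the NLEs disagree.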

Rob Mack
studioman3000 wrote on 5/15/2003, 2:38 AM
Very interesting. A slower sample rate appears wide? I feel like a freakin' newborn. Here I've been doing audio for what seems like eons (it's more like a decade), and I've got my brain wrapped around some groovy DSP, calculus, and Fourier transforms, and now I'm trying to solve the area of a rectangle.

I should get some sleep...
JackHughs wrote on 5/15/2003, 12:31 PM
Video aspect ratios are enough to drive anyone around the bend - particularly if you want to do the math with precision.

Here's the really simple version.

First, the term "aspect ratio" is properly used to describe either an image or a pixel. The term "aspect ratio" is not properly used to describe a pixel array.

Imagine a rectangle with a horizontal dimension of four feet and a vertical dimension of three feet. This represents an "image" with a 4/3 aspect ratio.

Now, take a group of one-foot by one-foot squares - like big floor tiles. These tiles represent "square" pixels (an aspect ratio of 1.0). To fill the rectangle, you will need three rows of four tiles. Using these square pixels, the aspect ratio of the image and the horizontal/vertical ratio of your "pixel array" will both be 4/3 (1.333). Note that I did not use the term "aspect ratio" to describe the pixel array.

Finally, take another group of floor tiles. This time the vertical dimension of each tile remains one foot but the horizontal dimension is reduced to six inches. These tiles represent "non-square" pixels with an aspect ratio of .5. To fill the rectangle with these tiles, you will need three rows of eight tiles. The aspect ratio of the image remains 4/3 (1.333) but the horizontal/vertical ratio of the pixel array is now 8/3 (2.666).

The horizontal/vertical ratio of a DV pixel array is 720/480 (1.5). However, the aspect ratio of a DV "image" remains 4/3. Like the smaller floor tiles, the horizontal dimension of a DV pixel is less than the vertical dimension. If the vertical dimension of a DV pixel is assumed to be 1, the horizontal dimension calculates out to approximately .909.

So, an array of 720 x 480 DV pixels will produce an image with a real aspect ratio of 1.333 - more or less. Like I said, this is the practical math, not the mind-numbing rigorous stuff.



JackHughs
riredale wrote on 5/15/2003, 1:09 PM
studioman: Interesting topic, yes?

I guess it just boils down to this: since you are building your DVD menus and such on a PC (which displays images in square pixels), make sure your source stuff has a 4:3 aspect ratio--whether that means it measures 640x480 or 720x540 doesn't matter that much. Vegas will take care of converting it into the DV format. Incidentally, the "DV" format is NOT 4:3. It's a bit wider than that--4.09:3, or 1.36. That's why you see the tiny black bars on the left and right sides when you import a 4:3 picture into Vegas.
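riredale's 4.09:3 figure falls out directly if you assume a 10/11 (~0.9091) pixel aspect ratio; a quick sketch under that assumption:

```python
# Checking the "DV is about 4.09:3, or 1.36" figure above, assuming
# the NTSC DV pixel aspect ratio is 10/11 (~0.9091).
from fractions import Fraction

par = Fraction(10, 11)
frame_ar = 720 * par / 480      # full-frame shape in square pixels

print(float(frame_ar))          # 1.3636..., a bit wider than 4:3 (1.3333)
print(float(frame_ar * 3))      # ~4.09, i.e. roughly 4.09:3
assert frame_ar == Fraction(15, 11)
```

That extra width beyond 4:3 is what shows up as the tiny black bars riredale mentions.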
mikkie wrote on 5/15/2003, 3:43 PM
Pixel aspect ratios are a tool of the Devil! [Just teasing!]

Might want to explore adamwilt.com & possibly search out DV.com for a feature some time back re: 601 video (it's about color conversion but explains some of the voltage stuff quite nicely).

Pixel aspect ratios actually exist, according to stuff I've read (I've no way of cutting a tube apart & verifying), due to the way that the phosphors are (or were) laid on the inside of the picture tube: 3 more-or-less dots laid out in a line being a rectangle. That might be more along the lines of how it originally developed; I think it depends on the shadow mask, & it does blend with aspect ratios as a topic. Microsoft included the ability to set aspect ratios in winmedia 9 and talks about it as a way to produce anamorphic video. As Rob notes, maybe pixel aspect ratio is now supposed to be an easier way of thinking about the total aspect ratios, or maybe they just kind of morphed together?

In essence, if you take your video and leave the aspect ratio alone, you're most likely gonna be cool. If you leave your DV at ~0.9, edit it, and print to tape or render to mpg2 for DVD at the same ~ 0.9 you've not changed anything, made it wider or narrower. If you change it in Vegas by altering the clip properties, your picture will change - if you change your project setting to square pixel, and uncheck maintain aspect ratio, things look exactly the same. Rendered with/at ~ .9 throughout, a square's still a square, & rendered set to square pixels (without the aspect ratio locked), a square's a square.

In many (most? - All?) cases the pixel aspect is just there to conform with other hardware and software that work with older stuff and specs that may seem archaic. It certainly doesn't offer anything to the person who tapes his kids and plays it back on the VCR! Doesn't mean that a picture on your PC will show up on your TV with aunt Marge looking fat and beating you over the head for it either.

Import a still into VV4c with the proj set to the DV template... Set the still to square pixels, & render to DV, and the result (with aspect locked) is a wider picture, as Vegas I guess thinks that the picture is natively wider than any DV video and should be displayed that way. Tell Vegas that the picture is already DV (ie: set clip prop. to DV aspect instead of square pixel), and render; the result appears the same as the original. That explains it as well as anything I could come up with: where possible, keep your settings the same, from clip to proj to render. And it's something anyone can try with a few moments of time on their PC and display as usual to their TV monitor.

Now the picture I just used originated on my PC, so I consider it correct, not my TV monitor. I didn't change the picture. I told Vegas that I wanted to encode the picture and that it was currently at the correct pixel aspect ratio. The software (in this case wmplayer) wouldn't need to know anything to display the picture properly if I didn't play with the aspect ratio, and many avi formats don't recognize/record this info - It only became necessary when I screwed things up artificially by adding in that .9 to the mix. This is something just as easy to test out with any still & VV4c.

This aspect is needed of course going back out to your DV camera or box etc., but my point is, if they didn't include that stupid spec, if they left it at 1:1, then your camera would deal with it just as nice as pie. But does the TV need it? Folks display &/or capture analog all day long with square pixels; ATI just came out with their new AIW for a few hundred dollars. Can't imagine this happening if the video was squished or distorted, can you?

Simply (FWIW & IMO and all that): if you have a device or software expecting a .9-something pixel aspect, or anything other than square, then you have to output to match or else that device will distort your video. If you never encounter such an animal, then you can happily remain ignorant of the whole thing, and just make sure to always set square. What about DVDs? To my knowledge, and you can try this as well, setting the mpg2 box to .9 means no resizing and less work if your source is .9 already, and the same for square pixel; however, some players may expect this to be non-square.

The various import size stuff for stills is somewhat confusing. Import a full-frame still (720 x 480), and, as above, set it to the DV pixel aspect. It fills the frame. Export it and the still will be 654 x 480, because Vegas resized the picture in the preview box; it will resize the picture when rendered to .9, and it will appear correct when/if viewed with software that understands the .9 flag, stretching things back out to normal.

Why so much worry?
studioman3000 wrote on 5/15/2003, 5:15 PM
Yeah. OK. So now I understand that computers have square pixels and (I looked) TVs have little RGB striped rectangles. HA! I'm on my way....

...to work. Back in the morning.
rmack350 wrote on 5/16/2003, 12:55 AM
Yeah, I'd better rephrase that. Was I tired? I complicated something that should be simpler.

It's not really true that a computer CRT displays at a slower sample rate. That's all hogwash to try to draw up a model to wrap around this. It's a bad model.

The thing is that a video signal isn't pixels. It's a series of rasters or lines. You could sample these lines at all sorts of rates and you could get a digital image that is 10x480 or 100x480 or a million by 480. Obviously at any of these sample rates you'd have to have some sort of software or hardware interpretation to view it on the screen. 654x480 would give you the best representation of the entire picture. So would any multiple of it so long as the proportions stay the same.

Now, as for a 720x480 non-square-pixel movie playing full screen in Windows Media Player... It shouldn't fill the screen. Remember that the first 4 common screen resolutions are 4:3 (1280x1024 isn't 4:3). And remember that a full DV frame from edge to edge is NOT 4:3. Never will be. The 4:3 area is 704x480. You could crop the file to 704x480 and render it as such. It should fill the screen.

The remaining 16 pixels are the rest of the image where an analog signal MIGHT be ramping down to 0 volts before moving to the next line (or ramping up after it gets to the next line). I say "might" because some analog equipment has to start earlier and finish later because it's slow. All analog equipment will ramp the signal down at the edges.

DV captures ALL of the image: the "4:3" area plus the rest of the image where this ramping is allowed to occur.

Last year it seemed that Windows Media player couldn't tell square pixels from non. Recent versions seem to do a great job of it. I don't see a problem.

Rob Mack


rmack350 wrote on 5/16/2003, 1:16 AM
Two points.

First, yes, the TV does look like it has little square pixels. So is this the phosphors on the screen or an aperture grille or what? I honestly don't know the answer, but I can tell you that the signal is a wave, not pixels. And the signal is what gets sampled, not the glass part on the front of the tube.

Second, the entire picture is not 4:3, only the window of the picture that you get to see. There's extra there on either side that is allowed to have picture. It's in those side areas, especially on 1950s TV sets, that the signal is allowed to fade to zero volts as fast as it can (never fast enough) before the beam tracks to the beginning of the next line and fades back up.

To get a 4:3 image from DV you have to crop it to 704x480. But why would you do that? You might do it if your end destination is a projector because projectors don't mask the edges. If the projector is fed from an analog device (say you printed your project onto VHS) then it would show crummy edges unless you had cropped it ahead of time.

Rob Mack
mikkie wrote on 5/16/2003, 9:08 AM
"First, yes, the TV does look like it has little square pixels. So is this the phosphors on the screen or an aperture grille or what? I honestly don't know the answer, but I can tell you that the signal is a wave, not pixels. And the signal is what gets sampled, not the glass part on the front of the tube."

FWIW: The phosphors are laid down, and various types/designs of masks are hung so that the stream of electrons hitting the phosphors (to make them glow) doesn't spread out as much, hitting the neighboring phosphor groups, the same way a PC CRT works. The Sony Trinitron tube won a lot of acclaim for its mask design. I've heard of some PC monitors where someone with the right eyesight can see the wires they use to suspend the masks, and for those that can see it, it drives them nuts. But, just trivia...

I have no idea exactly how much a so-called non-square pixel affects the display nowadays with modern TVs versus 10 to 20 years ago; I have read that it was much more an issue using the Amiga systems, & I do suspect that it was much more an issue in years (decades?) past trying to get a comparable display on a PC and on a TV. But that's just pure pixel aspect ratio, not the video picture as it's talked about (too often) today. If a picture on your PC was square, in theory lighting the same number of pixels on a TV screen or Amiga monitor would produce a rectangle rather than a square if the [groups of phosphors called] pixels were different shapes than on your PC monitor.

I think one reason it gets so confusing is that there really is no such thing as a pixel in the real world, something you can hold onto like a marble in the palm of your hand. Say you're looking at a picture on your PC's monitor, and you change your monitor's display to 640 x 480. If you zoom in on the picture (doesn't have to be much) you'll see these huge, hopefully square, blocks that *represent* the individual pixels making up that image. Change your monitor display to 1024 x 768, or 1280 or... You can still zoom to the individual pixel level, but the blocks are much, much smaller.

- A pixel is actually just a relative unit of measurement, in that way not unlike dB. -

Remember when you installed a new graphics card or drivers, or got a new monitor for your PC? Probably one of those times you had these big black bands at the edges of your display until you adjusted the monitor. Your graphics card was sending out the right signal, perhaps for a display size of 1024 x 768, but you adjust your monitor so that those streams of electrons hitting the back of the screen reach much closer to (or beyond) the visible edges of the picture tube or CRT.

Another example is dot pitch, referring to the spacing of the holes in the mask I talked about up above, the holes the streams of electrons pass through on their way to lighting your screen. You'll find this spec (sometimes with difficulty) for every monitor - not pixels - no mention of pixels. You can have a large monitor, but with a higher dot pitch the actual display is terrible. [trivia: the 1st personal computers did use your TV for display, but the dot pitch was so high they were almost unusable for text - if you're old enough, remember the old pong-type games?]

So a pixel is sort of a fiction, maybe better thought of as a block of clay rather than anything like a mm equaling a mm no matter which ruler you use. If you want, you sculpt the clay into a rectangle, or maybe a square. If you have video with an ~.9 pixel aspect, then you're taking all the blocks of colored clay that make up the picture and squishing them into taller rectangles rather than squares. The software still tracks them; in theory they're all still there, but your picture is not as wide. Play that video in wmplayer or whatever, & it reads from the file all sorts of info, including the part where each pixel/block of clay is squished, and, if it does its job properly, unsquishes them, stretching the picture back out to normal.

Computer code can only write files that are so small before tossing out data: if a picture contains 300 pixels, that file will be larger than if the picture only contained 100 pixels, the same way a text file increases in size for every letter typed. So, if you can store the data for a widescreen video frame in the same space as a standard, more 4:3 size, you'll save a ton of room. Easiest way to do that? Make each block of clay much wider. When you're after max compression, as with wmv or mpg2, this can help, so pixel aspect ratio suddenly becomes a shortcut for shrinking/expanding the video frame without adding or subtracting the number of pixels you have to track.
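The storage argument above can be made rough-and-ready concrete. A sketch, assuming a widescreen 16:9 picture stored anamorphically at 720x480 versus at a square-pixel 854x480 (854 is a commonly quoted square-pixel width for 16:9 at 480 tall, not a broadcast spec value), counting raw samples rather than compressed bytes:

```python
# Rough sketch of the storage argument above: a 16:9 picture stored
# anamorphically at 720x480 (wide "blocks of clay") vs. stored at a
# square-pixel 854x480. Counts are raw samples, not compressed bytes.

anamorphic = 720 * 480          # widescreen squeezed into the DV frame
square = 854 * 480              # same picture with square pixels

savings = 1 - anamorphic / square
print(f"{savings:.0%} fewer samples to encode")
```

About 16% fewer samples per frame, with the player's aspect flag doing the stretching back out on playback.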

-That's where it gets more confusing.-

If you can, when it's possible, you really don't want to add or subtract pixels on your own to conform to whatever pixel aspect standards, because that's actually less correct, not to mention more work, and we're all overworked.

If you import a 720 x 480 image into the Vegas timeline, and your proj is at the ~.9 pixel aspect, the picture doesn't fill the frame because all its pixels, the little blocks of clay, are still square. You can take some of them away if you want, and Vegas will fill the frame for you, and when it renders your file it will come up with (create) the greater number of squished pixels it needs to represent the picture properly in spec. Or you can tell Vegas that you want all these pixels included in the picture, by telling it to squish all the current blocks of clay so it doesn't have to make new ones. You can do this by setting the image's properties to the same ~.9 aspect. Less work and more accurate, as any software can only guess what color pixels to add.

Alas, the opposite is not always true, or may not appear so anyway. Export a picture from a .9-aspect video frame and you don't get a picture out with x number of squished pixels. Handling images, Windows doesn't know what squished pixels are, so you may (or may not) benefit by playing with things a bit before you export a still from Vegas. [This *might* have an impact on things if you're going to a series of stills for P/Shop etc., whether it's best to go with a scripted copy command or convert to mjpg avi & use another app, &/or if you need to change things a bit before using the script. Just thought of this, so I don't have any supporting data.]

If you take your DV clip and play it in wmplayer or similar, the picture should be displayed the same way it was created, ie: people no thinner nor fatter. A 704-wide frame should ideally take up 704 pixels, 640 = 640, and so on. Use whatever hardware to output your PC monitor display to a TV monitor, and what happens depends to a large degree on the hardware. Some, like my older ADS converter, will automatically upscale the picture while converting it to the 29.97i the TV likes. An ATI card might not, depending on which one. Going out firewire, it kind of doesn't matter, as the DV device wants a .9 picture and knows how to convert it to the proper signal for your TV; feed it a 1:1 picture and it'll likely treat it as .9, spreading things out a bit, as it only knows to unsquish each block of clay.

This may not be strictly accurate in TV broadcast terms, but then they do not natively incorporate PCs and their pixels and so on. In the pro world, whether video or audio, a lot of stuff has hung on that wasn't retained with newer fields or consumer/pro-sumer/digital equipment. With broadcast, some of this is because the specs are law, and those laws are still in effect (broadcast legal means just that). In other cases, well, how many decades ago was it that the US was going to switch to metric?

Hope this might help a wee bit...
mike
JackHughs wrote on 5/16/2003, 11:26 AM
Mikkie has it right. Pixels are not physical. Rather they are numerical representations (samples) of a physical image.

Pixels are not literally "square" or "non-square". These terms are a form of shorthand to describe sampling rates.

Take the example I presented earlier - a four-foot by three-foot "image" populated first by an array of 12 square tiles and next by an array of 24 non-square tiles.

In the first example, the "sample rate" on the horizontal axis was equal to the sample rate on the vertical axis. Because the ratio of the two sample rates is 1, we can say that each pixel is "square" (pixel aspect ratio of 1).

In the second example, the sample rate on the horizontal axis is twice that of the vertical axis. Because the ratio of the vertical sample rate to the horizontal sample rate is .5, each resultant pixel is mathematically "non square".

So, how do we calculate "pixel aspect ratio"?

For the sake of simplicity, assume that DV samples a 4 by 3 image 712 times horizontally and 486 times vertically. The aspect ratio of the image is fixed at 4/3 (1.333). The ratio of the horizontal sample rate to the vertical sample rate is 712/486 (1.465). The ratio of the image aspect ratio to the sample rate ratio is 1.333/1.465 (.909).
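JackHughs' calculation can be written out as a small function. A sketch only: the 712x486 sample counts are his stated simplifying assumption, not official spec values.

```python
# JackHughs' pixel-aspect-ratio formula, as a sketch. The 712x486
# sample counts are his simplifying assumption, not spec values.

def pixel_aspect_ratio(image_ar: float, samples_w: int, samples_h: int) -> float:
    # PAR = image aspect ratio divided by the sample-grid ratio
    return image_ar / (samples_w / samples_h)

# The floor-tile example above: a 4:3 image filled by 8 columns x 3 rows
assert abs(pixel_aspect_ratio(4/3, 8, 3) - 0.5) < 1e-12

# The DV-ish numbers: (4/3) / (712/486) comes out to about .91
par = pixel_aspect_ratio(4/3, 712, 486)
print(round(par, 3))
```

Plugging in other assumed sample counts (720x540, 704x480) shows why the various "correct" still sizes in this thread each imply a slightly different PAR.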

So, at the end of all this, all I'm trying to say is that your computer stores each image as an array of numbers (pixels). Pixel aspect ratio is only a convenient way of relating the aspect ratio of the image to the dimensions of the array.

"Sample rates" in analog TV are determined by the number of scan lines and the velocity of the electron beam as it sweeps the face of the CRT. While the size and spacing of the phosphor dots on the screen determine the ability of the CRT to resolve the information sent via the electron beam, these factors have nothing to do with the squareness of pixels.

JackHughs
studioman3000 wrote on 5/17/2003, 4:09 AM
Right. I never really thought of pixels as entities in the traditional sense. And I prefer to cram as many into one small space as is humanly possible. But isn't my camera (digital) recording (or storing) the info as square pixels? If that's the case, I'll work from there. Then it's just a matter of tracking and maintaining. I think...
mikkie wrote on 5/17/2003, 11:04 AM
"Then it's just a matter of tracking and maintaining. I think... "

Exactly.

Your DV camera records a bunch of images, data it gets & processes from the ccd(s). It records this on tape, to the DV spec, which as a sort of "by the way" includes a pixel aspect of ~.9 something. If you don't change anything, you're cool.

So you track it, monitor it through whatever editing process(s), and if/when anything changes, put it back or adapt accordingly.