I am just now getting into BluRay burning. I'm very familiar with all the HD resolutions, bit rates, etc. But I'd like some clarification on which ones to use.
I do all of my filming and editing in HDV 1440x1080 60i. I understand that 1440x1080 doesn't mathematically work out to 16:9, but it accomplishes 16:9 by using rectangular pixels. 1920x1080 is true square-pixel 16:9.
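For what it's worth, the math behind that works out cleanly. Here's a quick sketch (assuming HDV's anamorphic pixel aspect ratio of 4/3, which is what stretches 1440 stored pixels out to a 1920-wide display image):

```python
# Sanity check on the HDV aspect-ratio math.
# Assumed: HDV uses a 4/3 pixel aspect ratio (rectangular pixels).
frame_w, frame_h = 1440, 1080
par = 4 / 3                       # pixel aspect ratio (width / height of one pixel)

display_w = frame_w * par         # 1440 * 4/3 = 1920 displayed pixels wide
display_aspect = display_w / frame_h

print(display_w)                  # 1920.0
print(display_aspect)             # 1.777... i.e. 16:9
```

So 1440x1080 and 1920x1080 describe the same 16:9 picture; the difference is only how many horizontal samples are stored per line.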
The MainConcept BluRay renderer in Vegas offers both 1440x1080 and 1920x1080 rendering. And DVDA defaults to 1920x1080 when I select BluRay format for a disc project. My question is, if I'm working in Vegas in 1440, should I go ahead and render it to 1920 to match what DVDA (and BluRay?) apparently expects? Or should I change the setting in DVDA to 1440? (I'm trying to avoid DVDA going back in and re-rendering everything). I figure disc capacity is going to enter into the discussion somewhere here. But I'm not close to the BluRay max size. So that is not an issue at this time. I really just want to get the absolute best picture possible. But is it a waste of time to try to up-convert 1440 to 1920? Or worse, if I change DVDA to 1440, is that going to increase incompatibilities with BluRay players? It just seems curious that the default for HDV is 1440 in most cameras, but DVDA defaults to 1920.
Secondly... the MainConcept renderer in Vegas offers BluRay render templates at 8Mbps and 25Mbps. Yet DVDA defaults BluRay to 18Mbps. I'd like to get the better quality rendering in Vegas at 25. But not if DVDA is going to re-render and knock it back down to 18Mbps. I figure I can change DVDA to be 25. But at this point, I'm assuming there were reasons it was set to 18 default. And I don't want to start messing with things without having a bit more background.
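On the capacity side, a back-of-the-envelope estimate shows why bitrate is usually the deciding factor rather than resolution. This is just rough arithmetic (assuming a single-layer 25 GB disc and ignoring audio and muxing overhead):

```python
# Rough estimate: how many decimal GB one hour of video consumes
# at a given video bitrate. Assumes 25 GB single-layer BD capacity;
# audio and container overhead are ignored for simplicity.
def gb_per_hour(mbps):
    bits = mbps * 1_000_000 * 3600   # bits in one hour of video
    return bits / 8 / 1e9            # bytes -> decimal gigabytes

for rate in (8, 18, 25):
    print(f"{rate} Mbps -> {gb_per_hour(rate):.2f} GB per hour")
# 8 Mbps  ->  3.60 GB per hour
# 18 Mbps ->  8.10 GB per hour
# 25 Mbps -> 11.25 GB per hour
```

Even at 25 Mbps, a single-layer disc holds roughly two hours, which lines up with my situation of being nowhere near the max size.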
Summary... just curious why Vegas and DVDA seem to have different ideas on defaults. Recommendations on 1440 vs. 1920 and 25Mbps vs. 18Mbps for rendering in Vegas for BluRay?
Thx.
2G