Hi guys!
I really need your help with a very frustrating problem. I shoot with an XDCAM EX1R in 1080p/25. I edit in Vegas Pro, which renders two files: an .m2v video file and an .ac3 (5.1ch) audio file. I import them into DVD Architect, which says "no rerendering necessary".
I've taken a couple of photos, to illustrate my problem.
The first photo is of my movie shown on the computer screen (preview in Vegas):
https://docs.google.com/open?id=0B2LZ75_4uCUQLS1ZcVNBam9XTGM
There is no aliasing problem (the moire comes from the bad cell phone camera...).
The second photo is of the same snapshot from my movie, shown on a TV set (FullHD Sony TV):
https://docs.google.com/open?id=0B2LZ75_4uCUQZnFtNVZ4VXA2MFU
There are obvious aliasing problems, as I understand it. Around the wires you can see a major glare/halo effect, and on moving objects with lots of detail (or when panning high-detail scenes) the details "vibrate"/flicker a lot!
The movie was shot in 1080p/25. Vegas project settings:
https://docs.google.com/open?id=0B2LZ75_4uCUQRHo5eGlBYVF5aGM
Rendering settings:
https://docs.google.com/open?id=0B2LZ75_4uCUQdEN0OXJGWEx0aVk
I have tried rendering as progressive (which I usually do), but it makes no difference.
Since this is an MPEG-2 file from Vegas (MainConcept plug-in), it is not possible to include a multichannel (5.1) audio track, so the 5.1 AC-3 audio file must be rendered separately and the two files merged in DVD Architect, which (I assume?) always outputs interlaced video. Is this where the problem occurs, or what do you guys think?
Rendering an MP4 file in Vegas is also not an option, since a multichannel audio file cannot be included. Right?
If I play the movie on the computer (Windows Media Player) using two screens, 1) a PC monitor and 2) a TV (HDMI-connected), the PC monitor displays the movie without any artifacts at all, whereas the TV shows this "halo effect" no matter how I adjust the TV (a Full HD TV).
The only difference I can think of is that the PC monitor can display 1080p, whereas the TV can only display 1080i... Am I right? And where is the interlacing done: in the display card or in the TV?
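As a side note, one way to check what Vegas actually wrote into the file (rather than guessing from how it displays) might be to inspect the stream's field-order flag. This is just a sketch using ffprobe from FFmpeg; "movie.m2v" is a placeholder for the rendered file:

```shell
# Hedged example: inspect the field order of the rendered MPEG-2 stream.
# Requires ffprobe (part of FFmpeg); "movie.m2v" stands in for the real file.
ffprobe -v error -select_streams v:0 \
  -show_entries stream=field_order \
  -of default=noprint_wrappers=1 movie.m2v
# "field_order=progressive" would mean the file really is progressive;
# "tt" or "bb" would indicate interlaced (top- or bottom-field first).
```

If the file itself reports progressive, that would point the finger at the authoring step or the playback chain rather than the render.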
I thought that, for example, the BBC broadcasts 1080p via satellite, and there are no artifacts at all when watching those HD channels. Maybe the computer's display card (and media player) outputs only interlaced video to TV screens? But why would it? I'm really not sure it does...
Moreover, since DVD Architect packed this .m2v video file and .ac3 audio file (both from Vegas) into one ISO file (then unpacked to one .m2ts file) that plays back in Media Player on the computer, maybe DVD Architect outputs interlaced video, since the Blu-ray standard has no 1080p/25 option? If so, why does this interlaced output display correctly on a PC monitor, but with halo effects and "vibrating"/aliasing effects on the TV?
I am so satisfied with all the other parts of my workflow, from shooting to editing, but when the end product displayed on a TV suffers from quality issues, it's extremely annoying...
So, what should I do?
Any help is highly appreciated!! :)
BR Dag Halvorsen