Hello to everyone smarter than I -- this means you!
Longtime, mostly-casual/hobbyist Vegas user, recently commissioned for a project and finding himself older, but not wiser...
I'm using Vegas (recently upgraded from VP12 to Vegas 19) and DVD Architect 6 to create my 4th 'commercial' DVD. I've been watching a lot of YouTube videos to become acquainted with new features (and new problems!) in Vegas 19, and have run across some contradictory claims that leave me wondering if this newfangled internet thing is all it's cracked up to be... 😉
I've been concentrating on getting the best performance possible out of my aging system, in terms of timeline playback and render speeds. This, of course, gets into the switches and settings involved with GPU vs. CPU rendering, video RAM allocation, etc. While I'm not fully cognizant of the best practices in this area, I'm relieved that I can obtain reasonable performance with my relatively unsophisticated projects, and Vegas doesn't crash & burn too often.
I saw a seemingly knowledgeable user's advice on Preview Window settings and their supposed effect on files rendered from Vegas, and I didn't 'get' the connection. The implication was that the quality setting in the preview window directly affects the quality of the rendered video file. Worse, it led me to (almost) believe that I'd have to select one setting in the preview window -- say, Draft (Half) -- to get decent playback of the timeline, but then remember to switch to something more resource-intensive -- Best (Full), for example -- before rendering to a file, in order to achieve superior output quality. This doesn't make sense to me. Is there really such a relationship between the two?
I seem to remember something about this affecting the resolution of a snapshot frame saved to an image file in an older version of Vegas, but does it apply to video rendering as well?
Thanks in advance to anyone who can shed some authoritative light on this!