Blowing up Picture and Video

MadMaverick wrote on 12/1/2015, 12:19 AM
First of all, what are everyone's thoughts on using Sony Vegas as a photo editor? I've been kinda using it as my Adobe Photoshop substitute.

If I save a still from 720x480 footage (by clicking the little floppy disk icon above the display in Vegas that reads "Save Snapshot to File"), for some reason the still ends up with dimensions of 655x480.

When I save a still from 1440x1080 footage I get a still that is 1920x1080.

655x480 makes for a pretty small picture. If I were to blow it up to 1440x1080 (by changing the Project Video Properties), the picture is of course gonna be bigger, but would I be losing quality or distorting it by doing this?

This reminds me of how people will "upscale" 4:3 standard-def video by rendering it out at 1440x1080... are they losing quality by doing this? Is it even worth doing? On YouTube, for example, it might not look so bad when viewed at the full 1080 HD setting, but at 480p it's bound not to look so great.

I notice that Blu-rays and HD TV channels are like this as well with 4:3 content (such as old TV shows). The black bars are just part of the picture, and people have no choice but to watch it in the proper ratio... unlike with most DVDs or Standard TV channels where people have the option to watch it stretched out or with the edges cut off in order to fill their HD TV screen.

If one were to produce Blu-rays (or even DVDs), would it be wise to upscale them and format them with the black bars on the sides? The only thing is, on square TVs your content is gonna look boxed in. I guess that shouldn't be a concern since most people don't use square TVs anymore.
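For what it's worth, the pillarbox layout described above is just arithmetic: scale the 4:3 picture to the HD frame's height and split the leftover width into two black bars. A rough sketch (assuming square pixels in the HD frame; `pillarbox` is an illustrative helper, not a Vegas function):

```python
# Fit a 4:3 source into a 16:9 frame with black bars on the sides.
def pillarbox(src_w: int, src_h: int, frame_w: int, frame_h: int):
    """Scale the source to the frame height and center it horizontally;
    return the scaled width and the black-bar width on each side."""
    scaled_w = round(src_w * frame_h / src_h)
    bar = (frame_w - scaled_w) // 2
    return scaled_w, bar

# A 655x480 square-pixel still centered in a 1920x1080 frame:
print(pillarbox(655, 480, 1920, 1080))  # (1474, 223)
```

Note that this scaling only spreads the existing 655x480 pixels over a larger area; it can't add detail that wasn't in the original, which is why the upscaled version won't look any sharper at 480p.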


Johnkl wrote on 12/1/2015, 2:46 AM
To the best of my knowledge, the size of the picture you save with the little disk icon depends on the size/resolution of your preview window. Try selecting "Best (Full)" and see if the size of the pic changes.
TheHappyFriar wrote on 12/1/2015, 5:42 AM
Yup, using the little floppy disk saves based on the previous settings. Under the preview display, look for the "preview" resolution/settings.

The preview settings can never be higher than the project settings.

The reason you got an odd size for the photo (not 720x480) is that still images use a 1.0 PAR (square pixels) while video PARs vary. Vegas translates the DV image into a square-pixel still. That's also why your HDV footage saves at 1920x1080: to get a 1.0 PAR.
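The numbers in this thread line up exactly with that explanation. A minimal sketch of the correction, using the standard NTSC DV 4:3 PAR (0.9091) and HDV anamorphic PAR (1.3333); `square_pixel_width` is just an illustrative helper, not a Vegas API:

```python
# Convert a stored frame width to its square-pixel (PAR 1.0) display width,
# which is what "Save Snapshot to File" writes out.
def square_pixel_width(stored_width: int, par: float) -> int:
    """Multiply the stored width by the source pixel aspect ratio."""
    return round(stored_width * par)

print(square_pixel_width(720, 0.9091))   # NTSC DV 4:3 -> 655
print(square_pixel_width(1440, 1.3333))  # HDV 1080i   -> 1920
```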
rs170a wrote on 12/1/2015, 6:00 AM
From the Vegas Pro 9 manual:

* Copy Snapshot to Clipboard and Save Snapshot to File now deinterlace interlaced images and operate at Best (Full) resolution if the Video Preview is set to (Auto) size and Good quality if set to Draft or Preview.
* Copy Snapshot to Clipboard now performs pixel-aspect correction like the Save Snapshot to File command.

Arthur.S wrote on 12/1/2015, 9:25 AM
There are many scripts out there for 'photo snapshot' or whatever. I use Vegasaur for this now myself.
musicvid10 wrote on 12/1/2015, 4:33 PM
Uprezzing is always better done by a hardware player, not software resampling.
VMP wrote on 12/1/2015, 8:07 PM
Musicvid, why does a hardware player do better scaling than software?
Doesn't the hardware player run software too?

Chienworks wrote on 12/2/2015, 8:17 AM
Hardware players have dedicated circuitry designed to process video images and little else, so manufacturers can make cheap players that excel at this one task. Many of the calculations needed for good resizing are executed at the microcode level without software being run by a CPU. Microcode can execute many operations in each clock cycle since the instructions are already there in the silicon. This makes it extremely fast, so the algorithms can be very complex while still keeping up with the stream of data.

A general-purpose computer must be able to handle any task, so it has more generalized microcode that isn't tuned for any particular job. While it can be programmed to do anything, those instructions must be loaded into the CPU from memory and executed, which means many operations take a full clock cycle or even several cycles to complete. It's also limited to running only one instruction per core at a time. "Software" is much less efficient than microcode.

Now, this all assumes real-time processing. One could write software that uses resizing algorithms as complex as the microcode in hardware players, but then rendering time might balloon. You can already experience this to a small degree in Vegas, as most of the customizable rendering options have a "fast / medium / best" setting. This doesn't change the encoding used, but it does swap in more complex algorithms that spend more time optimizing the image for the best encoding possible. In most cases medium takes around twice as long as fast, and best around twice as long as medium. To match what hardware players do, I could imagine rendering taking 10 or 20 times as long as the best setting.

Of course, there are add-in hardware cards that can take over the encoding. The problem with these is that they are configured at design time for specific tasks and can't be updated later without replacing the chips on the card or buying a new card. At today's pace of codec and format development, such a card would be obsolete by the time you installed it. Fifteen years ago they were much more popular, and several editing systems relied on them because CPUs weren't fast enough to handle rendering in a reasonable amount of time. Those systems usually rendered to maybe 2 or 3 (hopefully standard) output formats with almost no customizable options, and you were stuck with that.
VMP wrote on 12/2/2015, 9:23 AM
Interesting, Chienworks!
I've always wondered why Avid, Pro Tools, etc. had those hardware cards for editing.
3D programs used these types of cards for rendering too.

Chienworks wrote on 12/2/2015, 11:30 AM
Most of the more modern ones are GPU cards running software that can be updated. Their strength lies in having 64, 128, 256, or even 512 cores, while the CPU on your motherboard probably only has 2 or 4.