OK, testing out some 4K footage from the new Blackmagic 4K camera...
Vegas defaults to 8-bit, and that's fine... my monitor is set to display 16-235 (generally speaking) for editing.
The 4K footage is shot "flat," as expected for color grading, and already seems to be limited to 16-235.
When I pop Vegas into 32-bit (video levels) mode, very little changes (as usual), but when I jump to 32-bit full-range mode, it becomes full contrast and colorful, almost graded already.
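If I understand it, that jump is just the standard studio-to-full levels expansion. Here's a rough sketch of the math in Python (generic video-levels arithmetic, not Vegas's actual internal pipeline):

```python
# Rough sketch of the studio-to-full levels expansion, normalized to 0.0-1.0.
# Generic video-levels math, NOT Vegas's exact internals.

def studio_to_full(v, bits=8):
    """Map a studio-range code value (16-235 in 8-bit terms) to full range."""
    scale = 2 ** (bits - 8)          # the 16-235 window scales up with bit depth
    lo, hi = 16 * scale, 235 * scale
    return (v - lo) / (hi - lo)      # 0.0 = studio black, 1.0 = studio white

print(studio_to_full(16))    # 0.0   -- studio black becomes full black
print(studio_to_full(235))   # 1.0   -- studio white becomes full white
print(studio_to_full(255))   # ~1.09 -- anything outside 16-235 lands out of bounds
```

So footage that's legal at 16-235 gets stretched to full contrast, which matches what I'm seeing.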
The question is: should I start in that mode with this new/better footage, or stay in 8-bit for grading?
This footage is already 4:2:2 or better, which is more than I'm used to.
And I realize it's easier to stay in 8-bit until rendering, but if I grade in 8-bit and then punch up to 32-bit full range, the contrast goes way over the top and out of bounds, etc.
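Here's a toy illustration of that double expansion (my own sketch with numpy, not anything Vegas does literally):

```python
import numpy as np

# Toy illustration: a grade that already stretched studio levels (16-235)
# to full contrast, followed by a second 16-235 -> 0-255 expansion.
studio = np.array([16, 64, 128, 200, 235], dtype=np.float32)

graded_full = (studio - 16) * 255.0 / 219.0          # grade fills 0-255
expanded_again = (graded_full - 16) * 255.0 / 219.0  # full-range mode expands again

print(expanded_again)
# approx [-18.6  46.4  133.2  230.8  278.3] -- blacks below 0, whites past 255

print(np.clip(expanded_again, 0, 255).astype(np.uint8))
# In 8-bit those out-of-bounds values clip permanently;
# in 32-bit float they survive and can be pulled back into range.
```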
So, I guess I am confused.
What's the best way to grade and render 10-bit 4K video?
Any tips on making the preview faster would also be nice. I've got it tweaked, but 4K only gets 13 fps consistently, like there's a switch I can flip somewhere.