I badly need to replace my PC displays, maybe for something like this 43" UHD monitor (or even a UHD TV). To drive it I'll need to replace my ancient AMD Radeon HD6970. In view of the NVDEC support in VP17, I'm thinking of getting an NVIDIA GPU rather than AMD, perhaps an RTX2070 or RTX2080. But NVENC rendering might be useful to me too.
So has anyone studied the quality of NVENC AVC renders compared with any of these alternatives?
- CPU-only MAGIX AVC
- AMD VCE
- Intel QSV
- x264
RTX (Turing)-generation comparisons would of course be most relevant, but earlier generations are also of interest. And I'll only be rendering 8-bit for the foreseeable future.
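For an objective comparison between encoders, a per-frame metric such as PSNR is a common starting point. A minimal sketch in Python/NumPy, assuming the reference and encoded frames have already been decoded to 8-bit arrays (getting frames out of the files, e.g. via ffmpeg, is outside this sketch):

```python
import numpy as np

def psnr(reference: np.ndarray, encoded: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit frames (higher = closer)."""
    ref = reference.astype(np.float64)
    enc = encoded.astype(np.float64)
    mse = np.mean((ref - enc) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * np.log10(255.0 ** 2 / mse)

# Toy example: a flat grey frame vs. a slightly perturbed copy
ref = np.full((1080, 1920), 128, dtype=np.uint8)
noisy = ref.copy()
noisy[::2, ::2] = 129  # nudge a quarter of the pixels by one level
print(round(psnr(ref, noisy), 2))
```

PSNR is crude compared with SSIM or VMAF, but it is cheap to compute and enough to rank encoders that differ substantially in quality at the same bitrate.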
Note that the Vegas project frame rate needs to match the frame rate of the NVENC-rendered file to avoid repeated frames. That is crucial when doing comparison testing on the Vegas timeline, whether by eye or with the video scopes. The same forum thread also shows favorable results for VCE rendering, to the extent that I would probably use it over other methods if I had a recent AMD GPU.
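The repeat-frame effect from a frame-rate mismatch is easy to see numerically. A hypothetical sketch (my own illustration, not how Vegas internally conforms footage), assuming simple nearest-lower snapping of timeline frames to source frames, with exact rational frame rates to avoid float rounding:

```python
from fractions import Fraction

def count_repeat_frames(project_fps: Fraction, source_fps: Fraction,
                        seconds: int = 10) -> int:
    """Count timeline frames that redisplay the previous source frame."""
    n = int(project_fps * seconds)
    # Source frame index shown at timeline frame i (floor snapping).
    shown = [int(Fraction(i) / project_fps * source_fps) for i in range(n)]
    return sum(a == b for a, b in zip(shown, shown[1:]))

ntsc = Fraction(30000, 1001)                     # 29.97 fps
print(count_repeat_frames(ntsc, Fraction(25)))   # mismatch: repeated frames
print(count_repeat_frames(ntsc, ntsc))           # matched: no repeats
```

A 25 fps file on a 29.97 fps timeline repeats roughly five frames per second, which would badly skew any by-eye or scope-based quality comparison.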
I do realise that there are at least two other known issues with NVENC rendering:
- NVENC rendering on some systems gives a low-memory error unless Dynamic RAM Preview max is set to 0 (and/or possibly the maximum number of rendering threads preference is changed).
- A lossless UHD AVC or HEVC file rendered with NVENC comes out as zero bytes.
The HOS Render Quality Metrics tool is here, if anyone wants to use it for a test.