Ok, now that I've got Vegas 18 rendering properly, I wanted to run a rendering speed test as a comparison.
Probably not very useful to anyone, but just wanted to share the result with you.
I used the same clips in both versions, for a total of 1 minute on the timeline. They are .mp4, FHD 50 fps, from my Lumix G85 camera.
I did no editing at all - no cuts, crossfades, anything. I just added a LUT - the same one for both, of course - and set its level to 50%.
I ran the test with both Intel QSV and NVDEC as the decoder, in both versions, but this did not seem to make any difference.
The render format is Magix AVC, FHD 50 fps, using the Internet HD 1080p 50 fps preset.
The following figures come from the graphs and values in Windows Task Manager, and I'd say they vary by about +/- 2-3%.
My laptop, as per my signature, has an Intel UHD 630 (GPU 1) and an NVIDIA 1050 Ti (GPU 2).
On Vegas 17:
Playing the timeline: CPU 39%, Intel 17%, 1050 Ti 6%
Rendering with NVENC: CPU 46% average (spikes at 51%), Intel 6%, 1050 Ti 15% (spikes at 19%) - render time 1:54
Rendering without NVENC: CPU 88%, Intel 5%, 1050 Ti 3% - render time 3:36
On Vegas 18:
Playing the timeline: CPU 33%, Intel 20%, 1050 Ti 10%
Rendering with NVENC: CPU 35% average (spikes at 44%), Intel 7%, 1050 Ti 16% (spikes at 19%) - render time 1:15
Rendering without NVENC: CPU 90%, Intel 5%, 1050 Ti 5% - render time 3:24
So, in my little test, rendering to the format I use most, v18 seems to be more efficient for some reason, even though the hardware utilization values are not terribly different from v17. With NVENC, the render time in v18 is better by roughly 34% (1:54 down to 1:15).
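For what it's worth, here is a quick sketch of the arithmetic behind that figure, using just the two NVENC render times from above converted to seconds:

# NVENC render times from the test above, in seconds
v17_time = 1 * 60 + 54   # 1:54 -> 114 s
v18_time = 1 * 60 + 15   # 1:15 -> 75 s

# reduction in render time relative to Vegas 17
reduction = (v17_time - v18_time) / v17_time
print(f"Render time reduced by {reduction:.0%}")   # prints ~34%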
I would say this is a positive thing! :)
More than render times, though, I will be exploring the editing experience - first and foremost stability and smoothness. And of course the accuracy of various processes like stabilization, etc.