Based on these render times, QSV isn't working.
I'd still head over to the NVIDIA website and download the Studio driver, since Vegas doesn't know about your card yet.
I don't think Grazie's CPU has QSV... For Vegas to recognize BOTH my PCIe GPU and onboard iGPU (Intel 630, etc.) I have to install BOTH the AMD/Nvidia drivers AND the Intel 630 drivers. Then, I also have to plug in a monitor on one of the iGPU ports. Afterwards, Vegas shows me the option to use MainConcept (CPU), VCE or NVENC depending on GPU, or QSV rendering...
NOTE - some motherboards & especially laptops do NOT allow both the PCIe GPU and the iGPU to operate at once... My Evoo 17 gaming laptop allows both GPUs to assist Vegas, whereas my son's HP Omen laptop does not... So even though they have the SAME 6-core CPU & similar onboard gaming GPUs, mine renders faster since QSV kicks in...
Based on my own tests, the 1080p VCE renders of my VEGA 64 LQ are faster than NVENC renders of a new Nvidia 3060 Ti... So, on the original Red Car Test, my 4.8 GHz 9900K & VEGA 64 LQ gets 15 seconds compared to 19 seconds with the Nvidia 3060 Ti - both GPUs at stock settings. If I bump up my CPU to 5 GHz and under-volt/overclock my VEGA 64 LQ, I get 13-14 seconds, and have even had it drop to 12 seconds if I manually disable background tasks & prioritize Vegas...
When I ran the 4K to FHD test, they were about the same, but when I ran the 4K to UHD test, the 3060 Ti was faster by 18% (1:20 vs. the VEGA's 1:38). So, IMO, the newer cards are faster at 4K. This will only get better as apps increase their support. For instance, after an update, Adobe Premiere Pro gets a bigger boost from the new 3xxx GPUs, and I expect Vegas to follow. Neat Video, etc. also benefit from the new GPUs, as Grazie has reported...
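For anyone comparing their own render times the same way, here is a minimal sketch of the percentage math, using the two times quoted above (function names are my own, just for illustration):

```python
def to_seconds(t):
    """Convert an m:ss string like '1:20' to total seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

def percent_faster(fast, slow):
    """How much faster the first render is, relative to the slower one."""
    f, s = to_seconds(fast), to_seconds(slow)
    return round((s - f) / s * 100)

# 3060 Ti (1:20) vs. VEGA 64 LQ (1:38) on the 4K-to-UHD render
print(percent_faster("1:20", "1:38"))  # → 18
```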
Since I always build / upgrade my own workstations, it only takes a weekend to upgrade to something better, and I typically do not consider upgrading unless I get a big-enough boost in performance to justify the cost & my time. Like many, I ran my overclocked 6-core Xeons for at least 8 years because I wasn't going to tear apart everything for a mere 20% increase in performance. Rather, I just maintained multiple workstations so that one was rendering while I was editing on the fastest one... For paid work, having more than one workstation is always better than having the latest/greatest but then not being able to work if it goes down...
I don't think Grazie's CPU has QSV... For Vegas to recognize BOTH my PCIe GPU and onboard iGPU (Intel 630, etc.) I have to install BOTH the AMD/Nvidia drivers AND the Intel 630 drivers. Then, I also have to plug in a monitor on one of the iGPU ports.
Crazy! I knew about the 10900KF not having the iGPU, but didn't know the 10900X also didn't have one. I also didn't know modern Intel CPUs still required a monitor attached; I suspect that's another Vegas thing and not true for modern software. The monitor you connect doesn't need to be turned on (in fact, I was using a broken one), and you can also get dongles that plug into the port and simulate a connected monitor. They're probably just a few resistors, but they're cheap enough that you may as well just buy the dummy monitor plug.
There’s no option for a Studio Driver. Interesting, huh 🤔.
There is a Studio Driver
@GerY - Look back through this thread and you'll see that I have since discovered this, and identified both. They have the same number. I recommend you do this, as I went to some lengths to cover it. Thank you.
Strange, I have an i9-10900K, no monitor connected to the iGPU ports, and have full benefit of the QSV render options. VPro 18 recognizes both GPUs, as shown in my screenshot earlier.
@TheRhino can you confirm you need the monitor connected?
It was always my understanding that more modern CPUs no longer had the requirement of a monitor connected. The only thing I can confirm is that Intel 6-series CPUs did require a monitor plugged into the motherboard for Quick Sync decode/encode to work, but I did not think this was still the case on modern hardware.
Look, Gentlemen, there’s so much WE, collectively, just don’t know. I’ve done my best in responding with my detailed accounts of MY state of play. I’m happy to learn what further research turns up on these weighty matters. Is there anything more you Guys need from me? I get that I’ve been closely “questioned” by the Group; may it remain so.
Be that as it may, I’ve got a top-of-the-range RTX 3080. No, I ain’t gonna spend over a Grand on the 3090; the discounted warranty PLUS the additional discount could go toward a lower-specced GPU. I’m happy 😆.
@lenard @3POINT When I built this system, all tests showed that I needed to have a monitor attached to the motherboard iGPU for best performance... But since then, Windows 10, drivers, etc. have been updated...
Today, if I boot-up my (9900K/Asus Z390 WS/VEGA 64 LQ) system without a monitor connected to the motherboard's iGPU, I still lose the ability to choose QSV as one of the "encode mode" choices within render templates for V16 - V18. This is true even if I power-on the 3rd monitor later, Windows 10 recognizes it, and then start-up Vegas... So Vegas still does not give me QSV as an option unless I reboot with a monitor attached to the motherboard... Note that in BIOS I have my iGPU vs. GPU choice set to "auto"...
After booting without the iGPU monitor enabled, V18 still shows increased Intel 630 iGPU usage in Task Manager during VCE rendering, while V16 & V17 do not. Interestingly, my Red Car test and Sample Project 4K tests give about the same results even when Task Manager does not show V16 & V17 Intel 630 usage during UHD & FHD VCE renders... If I uninstall my Intel 630 drivers completely, my render speeds really slow down, so IMO the iGPU is being used in Vegas 16 - 18 even though it is not being reported accurately in Task Manager, etc...
@Grazie "Look, Gentlemen, there’s so much WE, collectively, just don’t know. I’ve done my best in responding with my detailed accounts of MY state of play. I’m happy to learn what further research turns up on these weighty matters. Is there anything more you Guys need from me?"
Well, if you are finished here, perhaps consider entering your results in the Benchmarking Continued thread.
In the Benchmarking Continued thread, don't try QSV, as your CPU doesn't contain an iGPU. This was established quite some time back in this thread by diverG and J-V, and I have also checked it out. You are chasing something that simply doesn't exist as an option for you now. Do the usual Magix AVC CPU and NVENC renders for FHD and UHD; this will give a total of 4 entries.
The reason QSV appears as an option when you don't have the hardware is down to Magix; for some reason we are unaware of, they may be unable to remove its appearance even when the hardware is not there.
FHD is what you previously entered: Full HD, a 1920 x 1080 render. UHD is the 16:9 counterpart of DCI 4K, 3840 x 2160. I used these acronyms for brevity because, looking at your previous conversation with 3POINT, I assumed that you already knew what they meant ... 3POINT: “I see, your cameras are FHD. For FHD you probably will not see a difference between a speedy GPU render and a slower CPU render. My cameras are UHD, and a downscale to FHD gives a big difference in quality between a fast GPU render and a slower CPU render. The latter gives an FHD quality that I can hardly distinguish from the original UHD on my UHD TV.” Grazie: “Now, where can I purchase a UHD Cammi”
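For reference, here is a quick sketch of the two frame sizes by the numbers, which also shows why the UHD render leans so much harder on the GPU (frame sizes are the standard definitions; variable names are my own):

```python
# Standard frame sizes for the two render targets discussed above
FHD = (1920, 1080)  # Full HD, 16:9
UHD = (3840, 2160)  # Ultra HD "4K", 16:9 (DCI 4K proper is 4096 x 2160)

fhd_pixels = FHD[0] * FHD[1]
uhd_pixels = UHD[0] * UHD[1]

# UHD pushes exactly 4x the pixels of FHD per frame
print(uhd_pixels // fhd_pixels)  # → 4
```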
Clear instructions are available for everything in my first post in that thread. Simply click on the benchmarking link in my signature.
This benchmarking project supersedes the older Red Car test and has nothing to do with it. You're wasting your time testing the original Red Car project, as your modern PC will saturate the playback rate; it's good only for render tests. It's also useful for testing playback rates on lower-specced PCs.
For example, if you tested your current PC with the Red Car project, it would demonstrate a full playback rate of 29.97 fps. If you tested your older, but still modern, PC with the Red Car test, it would probably still give 29.97 fps. So it's useless for playback-rate comparisons. However, the Sample Project 4K benchmarking project in my signature hasn't yet had a full playback rate of 25 fps on any user's machine; the max so far is 18.5 fps, so it's very useful.
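One way to put playback results like these on a common footing is to express each as a percentage of the project's full frame rate. A minimal sketch, using the figures quoted above (the 18.5 fps number is the thread's reported max on the 25 fps Sample Project 4K; the function name is my own):

```python
def playback_saturation(measured_fps, project_fps):
    """Percent of the project's full frame rate a machine achieves."""
    return round(measured_fps / project_fps * 100, 1)

# Red Car project: modern PCs hit the full 29.97 fps, so it can't separate them
print(playback_saturation(29.97, 29.97))  # → 100.0
# Sample Project 4K: best result so far is 18.5 fps out of 25
print(playback_saturation(18.5, 25))      # → 74.0
```

A test that every machine saturates tells you nothing; one where even the fastest machine sits at 74% still has headroom to discriminate.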
PHEW... OK... I needed to SAVE as a VP18 Project, it was still in VP11 mode... Then I got busy, but still with a range of only 12 fps to 17 fps.
@Grazie No need to save as VP18; that way you can always use older VP versions to test if you wish.
Don’t make any changes to the project properties, or anything else. Use the render templates provided in the screen shots. One for FHD and one for UHD. You did all this before, so shouldn’t be a problem for you.
Please read carefully the complete screen grab with instructions on how to do the Average FPS test for Region 1 only.
Render tests are for the complete project length, not just Region 1.
You could try the following to simulate a monitor for Windows. It's an old article, not Win10, and the Intel GPU shown was released in 2013, but the guide is still linked from the OBS Help forum as a way to enable Quick Sync.
@Grazie "I needed to SAVE as a VP18 Project, it was still in VP11 mode"
The Sample Project 4K was created probably around 2019. It loads in VP16 and later.
The Red Car project was created in the autumn of 2011.
So I take it you are still testing the Red Car project?
If it's too difficult a task for you to do the Benchmarking Project 4K, that's ok, I understand, we can pass on it.
Can you remember if you inadvertently used the Red Car project also in 2019, in place of the Sample Project 4K? If so I can remove your results as they wouldn't be relevant. Thank you.
No, MagixAVC can be rendered in UHD and FHD with Mainconcept and NVENC. So 4 in total.
All MagixAVC:
UHD (4K) NVENC
UHD (4K) Mainconcept
FHD (1080p) NVENC
FHD (1080p) Mainconcept
Page 1 of the benchmarking thread has screenshots of the exact render settings you can follow: "The data rates of the 2 templates is 24/12 for FHD and 50/28 for UHD"