I came across this in another thread while establishing that decoding does work for me with Nvidia: Intel QSV decoding isn't working on my PC (specs in profile). No harm in highlighting the issue in a new thread in case anyone else comes across it. It isn't a big issue for me, since Nvidia decoding would very likely be better than Intel even if QSV were working. Below is mainly a copy and paste from the other thread. I have the very latest Nvidia Studio drivers and Intel drivers installed.
There was a 0% render-time change using Intel QSV decoding in VP17 versus simply using VP16, so I suspect the "Intel Graphics 630" is not being used for decoding. Note that unchecking QSV under "General" simply makes it unavailable in the render templates for testing.
I have confirmed that Intel QSV HW encoding is working: the Red Car test shows a gain of 5s and this 57s test a gain of 3s.
I can also confirm that, watching the Intel GPU in Task Manager during these renders, it never shows any decoding activity. A quick way to cross-check QSV decoding outside of VEGAS is sketched below.
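For anyone who wants to rule out a driver problem, here is a minimal sketch that exercises the Intel QSV decoder directly with ffmpeg, run from Python for convenience. It assumes an ffmpeg build with QSV support is on the PATH; the clip path is a hypothetical stand-in for your own UHD test file. If QSV decoding is healthy, Task Manager's "Video Decode" graph on the Intel GPU should show activity while it runs.

import subprocess

clip = r"C:\temp\uhd_25p_test.mp4"  # hypothetical path - point this at your own UHD clip

cmd = [
    "ffmpeg", "-hide_banner",
    "-hwaccel", "qsv",      # request Intel Quick Sync hardware decode
    "-c:v", "h264_qsv",     # QSV H.264 decoder (use hevc_qsv for HEVC clips)
    "-i", clip,
    "-f", "null", "-",      # decode only, discard the frames
]

result = subprocess.run(cmd, capture_output=True, text=True)
print("return code:", result.returncode)
print(result.stderr[-500:])  # ffmpeg reports errors and decode stats on stderr

A non-zero return code or QSV initialisation errors in the output would point at the drivers rather than at VEGAS.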
SECTION 1
19s clip x 3 = test project duration of 57s, UHD 25fps.
HW Acc. = Nvidia for all tests. I used an Nvenc render template.
VP17 .. UHD to FHD, using Nvidia in File I/O, render time = 15s ..... 21% faster render (see the arithmetic check after this list)
VP17 .. UHD to FHD, using Intel QSV in File I/O, render time = 19s
VP17 .. UHD to FHD, using no File I/O decoding, render time = 19s
VP16 .. UHD to FHD, using Nvenc render template, render time = 19s
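Just to show where the 21% comes from, a quick arithmetic check against the 19s baseline above:

baseline = 19   # seconds: VP16 / no hardware decoding in File I/O
nvidia_io = 15  # seconds: VP17 with Nvidia selected in File I/O

speedup = (baseline - nvidia_io) / baseline * 100
print(f"{speedup:.0f}% faster")   # prints: 21% faster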
SECTION 2
I tried something else with respect to Intel QSV not decoding on my system: I set HW Acc. to Intel QSV instead of Nvidia, in case decoding requires the Intel GPU to be the one selected under HW Acc.
I then rendered the same test project with Intel QSV selected in the File I/O tab and tested with HW decoding on and off. The result is no change: HW decoding using Intel QSV is not working on my PC, but QSV HW encoding is.
19s clip x 3 = test project duration of 57s, UHD 25fps.
HW Acc. = Intel QSV. I used an Nvenc render template.
File I/O = Intel QSV
VP17 .. UHD to FHD, "Enable Hardware Decoding" in file I/O = ON, render time = 47s.
VP17 .. UHD to FHD, "Enable Hardware Decoding" in file I/O = OFF render time = 47s
I also did a test using an Intel QSV render template (HW Acc. = Nvidia), with File I/O = Intel QSV.
With File I/O Enable hardware decoding = Off, this gave a 44s render time.
With File I/O Enable hardware decoding = On, this also gave a 44s render time (see the quick comparison below).
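The same check applied to the two Section 2 pairs just confirms that toggling hardware decoding makes no measurable difference in either configuration:

# render times in seconds with the File I/O decode option on vs off
runs = {
    "HW Acc = Intel QSV, Nvenc template": (47, 47),
    "HW Acc = Nvidia, Intel QSV template": (44, 44),
}
for label, (on, off) in runs.items():
    change = (off - on) / off * 100
    print(f"{label}: {change:.0f}% change")   # both print 0% change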