I've seen the support for Kaby Lake HEVC encoding hyped in the v15 release notes, so I finally upgraded my V12 to V15, but I cannot find the HEVC/H.265 QuickSync encoder option - can someone point out where it is / how to enable it?
There is no secret; it has worked right from the beginning whenever I use it, and I could improve render time and timeline performance by adding an old AMD R7 250 graphics card.
What Intel graphics driver are you using, Peter_P?
Former user
wrote on 11/16/2017, 4:18 AM
Cornico:
This small extract from one of Nick's FAQs appears to bear that out ... i.e. Sony & MC only ...
The two AVC/AAC encoders available in VEGAS allow GPU-accelerated rendering, which is a different thing from GPU acceleration of video processing explained above in part 1.
The term "legacy" is used because the code was optimized for the GPUs that were available when it was written, a number of years ago. The code is not optimized for currently-available graphics cards.
There is no secret; it has worked right from the beginning whenever I use it, and I could improve render time and timeline performance by adding an old AMD R7 250 graphics card.
What Intel graphics driver are you using, Peter_P?
The properties for the HD530 in Device Manager show: 21.20.16.4542
I had some standby problems with newer drivers around July 2017 and reinstalled the older driver that I had already been using for a while.
I did a render test that you can check on your system. Download the original XAVC-S footage from this RX10 M4 clip and render it in Vp15 B216 with the UHDp24 Intel HEVC 40 Mbps default template. This 12;12-long clip renders in 13 sec on my i7-6700k - very close to realtime. This cannot be reached by HEVC CPU rendering.
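To see why that counts as near-realtime, here is a quick back-of-the-envelope sketch, assuming the 12;12 notation means 12 seconds plus 12 frames at roughly the clip's 24 fps:

```python
# Rough arithmetic behind "very close to realtime".
# Assumption: "12;12" is seconds;frames notation at ~24 fps.
FPS = 24
clip_seconds = 12 + 12 / FPS    # ~12.5 s of source footage
render_seconds = 13             # reported render time on the i7-6700k

ratio = render_seconds / clip_seconds
print(f"render-time ratio: {ratio:.2f}x realtime")  # ~1.04x
```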
Former user
wrote on 11/16/2017, 7:51 AM
I've updated my Red Car HEVC render-time ratios with 10-bit also; see page 2 of this thread. In software it'll usually be several multiples of project length, whereas in hardware it can be closer to real time, as Peter_P demonstrates.
So I believe I found my problem. I have both Intel and Nvidia GPUs and was running the monitor off the Nvidia GPU. I switched the monitor to the Intel GPU and now have hardware rendering via the Intel HEVC codec - and I also get an error when I try 10-bit HEVC encoding. Hopefully that narrows the bug down for the Vegas folks.
Former user
wrote on 11/16/2017, 8:06 AM
Cornico:
“I love the new possibilities” Same here, and eventually having HEVC encoding in HW for NVidia and, I assume, AMD. I have an old GTX 580 but don't have it installed, as the GTX 1080 will eventually cover all H.264 & HEVC encoding. I know, though, that Oldsmoke gets very good speed from using the legacy cards, so there's room for both.
Former user
wrote on 11/16/2017, 8:12 AM
So I believe I found my problem. I have both Intel and Nvidia GPUs and was running the monitor off the Nvidia GPU. I switched the monitor to the Intel GPU and now have hardware rendering via the Intel HEVC codec - and I also get an error when I try 10-bit HEVC encoding. Hopefully that narrows the bug down for the Vegas folks.
I take it that you are still getting error-free 10-bit encoding in software?
I did a render test that you can check on your system. Download the original XAVC-S footage from this RX10 M4 clip and render it in Vp15 B216 with the UHDp24 Intel HEVC 40 Mbps default template. This 12;12-long clip renders in 13 sec on my i7-6700k - very close to realtime. This cannot be reached by HEVC CPU rendering.
Here is the same result on the desktop from my signature:
Former user
wrote on 11/16/2017, 9:01 AM
Yes, it was error-free CPU encoding, just really slow!
Well, that should really narrow it down; over to Magix now, unless someone else with, say, a Kaby Lake CPU can encode 10-bit in HW error-free.
Former user
wrote on 11/16/2017, 9:04 AM
There is no secret; it has worked right from the beginning whenever I use it, and I could improve render time and timeline performance by adding an old AMD R7 250 graphics card.
What Intel graphics driver are you using, Peter_P?
The properties for the HD530 in Device Manager show: 21.20.16.4542
Didn't work. I removed those old drivers: when I cancelled out of the encode, once it was obvious no hardware encoding was happening, Vegas crashed, which doesn't normally happen. So those drivers made Vegas more unstable, and they didn't help.
I believe at this point I will give up on HEVC hardware encoding with Vegas.
I will chime in on the Intel 6700k doing HEVC encoding. Once I figured out the 7700k, I did the same thing for my second system, which is an Intel 6700k with an Nvidia GTX 680. Once I made the Intel GPU the only active display adapter, I was able to HW encode with the Intel HEVC codec.
I've updated my Red Car HEVC render-time ratios with 10-bit also; see page 2 of this thread. In software it'll usually be several multiples of project length, whereas in hardware it can be closer to real time, as Peter_P demonstrates.
@Former user Which thread? There is no link in your post. Are you saying that you now have QSV-speed 10-bit HEVC rendering?
---
ken-dehoff: ...I also get an error when I try 10-bit HEVC encoding.
JN_: I take it that you are still getting error-free 10-bit encoding in software?
ken-dehoff: Yes, it was error-free CPU encoding, just really slow!
@ken-dehoff So can you encode 10-bit HEVC or not (on your i7-7700k system)?
---
@Peter_P @Former user What is your monitor setup? Which device are your monitor(s) plugged into? What OS version?
---
This is the most confusing thread I've ever read on this forum. Please, if anyone has a report of success/failure/fast/slow, spell out all the details so we can try and establish a pattern:
Operating System and exact version (e.g. Windows 10 version 1703)
Vegas version
CPU model
Intel graphics driver version
Other GPU model
GPU driver version
Which device your monitor(s) is plugged into
Status of "GPU acceleration of video processing"
HEVC settings, especially 8-bit or 10-bit
Screen shots of Intel graphics performance if you can
Also, referring to a system in a signature or profile without stating the spec can get really confusing later, when signatures change. That's already caught me out at least once.
My LG UHD monitor is connected to the mainboard Intel graphics, running Win10 Pro 1703 on an i7-6700k with an AMD R7 250. Vp13 B453, Vp14 B270, Vp15 B216. The properties for the HD530 driver in Device Manager show: 21.20.16.4542.
Vegas preference Video: GPU acceleration = AMD (Cape Verde)
Intel HEVC setting:
Working fine with 8-bit. Error message on start with the 10-bit setting.
I assume that the Intel encoder detects that QSV is available but does not differentiate between the i7-6700k and the i7-7700k, which is able to process the 10-bit HEVC rendering with GPU support. If there is no QSV at all, the 10-bit option works fine.
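One way to test that theory outside of VEGAS is to ask the iGPU directly whether it accepts a Main vs. Main 10 HEVC encode. A rough sketch, assuming an ffmpeg build with the QSV encoder is on the PATH (this probes the driver/hardware, not VEGAS itself):

```python
import subprocess

def qsv_hevc_ok(profile: str, pix_fmt: str) -> bool:
    """Try a 1-second synthetic hevc_qsv encode; True if the hardware accepts it."""
    cmd = [
        "ffmpeg", "-v", "error",
        "-f", "lavfi", "-i", "testsrc2=duration=1:size=1920x1080:rate=30",
        "-vf", f"format={pix_fmt}",
        "-c:v", "hevc_qsv", "-profile:v", profile,
        "-f", "null", "-",
    ]
    return subprocess.run(cmd).returncode == 0

# If Peter_P's theory holds, the main10 probe should fail on Skylake
# (i7-6700k / HD 530) but pass on Kaby Lake (i7-7700k).
print("8-bit  (main):  ", qsv_hevc_ok("main", "nv12"))
print("10-bit (main10):", qsv_hevc_ok("main10", "p010le"))
```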
Thanks Peter. I added those details to the spreadsheet.
Still plenty more blanks to fill in to get this issue nailed down.
Former user
wrote on 11/17/2017, 6:17 AM
Hi Nick ... it's the 3rd post on page 2 of this thread. I'm copying it here again ...
Part of the confusion is that everyone is using different methodologies for testing, and until hgla contributed to this thread there was a lot unknown. So the thread itself was part learning curve and part testing. Example: if you manage to enable HW render, then SW render is no longer available with the same set of templates; if true, that's non-intuitive compared to the way the H.264 render templates are separated out.
I used the original Red Car test because it's readily available and adds some consistency. All of my tests are software only.
For all future testing, and there will be plenty when VCE is available, it would be in everyone's interest to settle on just one test sample, new or old.
“Intel HEVC test ...
I used the first 30 seconds of the “Red Car” benchmark (later the full length) to get the render-time ratio.
Rendered in software only, on an Intel 6700HQ CPU. (HW n/a as previously explained; see previous post.)
This gives a 4.73 to 1 ratio. (Also tested a Haswell desktop 4790K ... this gives a 3.83 to 1 ratio. The software HEVC option is the only one available for this chip.)
I rendered the first 30 seconds of the project, output to FHD 29.97 fps, default quality of 4, data rate 12 Mbps (24/12), 8-bit.
I also did the same project, full length, rendered to 10-bit; the ratio was 5.41 to 1 on the laptop in software.
This benchmark is available for others to test; anyone with hardware support for Intel HEVC should get a lot better than this.
Last changed by JN_ on 11/16/2017, 1:46 PM, changed a total of 3 times.“
Former user
wrote on 11/17/2017, 7:00 AM
To summarize my tests ...
Intel HEVC test ...
I used the original “Red Car” benchmark.
I rendered output to FHD 29.97 fps, default quality of 4, data rate 12 Mbps (24/12), 8-bit and 10-bit.
Rendered in software only, on an Intel 6700HQ and a 4790K (HW render n/a).
Time taken ...
6700HQ .. 4.73 times real time, 8-bit.
6700HQ .. 5.41 times real time, 10-bit.
4790K ..... 3.83 times real time, 8-bit.
4790K ..... 4.38 times real time, 10-bit.
This benchmark is available for others to test; anyone with hardware support for Intel HEVC should get a lot better than this.
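To make those ratios concrete: the ratio is simply render time divided by timeline length, so the 30-second test at 4.73 to 1 implies roughly 142 seconds of rendering. A minimal sketch:

```python
def render_ratio(render_seconds: float, project_seconds: float) -> float:
    """Seconds of rendering per second of timeline."""
    return render_seconds / project_seconds

# e.g. the 30 s Red Car excerpt taking ~142 s to render in software:
print(f"{render_ratio(142, 30):.2f} to 1")  # 4.73 to 1, the 6700HQ 8-bit figure
```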
———————————————————————————————————————————
Laptop ...
Win 10 Home ver. 1703, build 15063.674
Vegas Pro 15 (216)
Cpu 6700HQ
Intel graphics driver version .. N/A
GTX 1070 .. driver ver.
Only laptop screen in use
Status of "GPU acceleration of video processing” = Nvidia
PC ...
Win 10 Home ver. 1703, build 15063.674
Vegas Pro 15 (216)
Cpu 4790K
GPU MSI GTX 1080 Z
Intel graphics driver version ..
Nvidia graphics driver version ..
Only 1 monitor in use and connected to GTX 1080
Status of "GPU acceleration of video processing” = Nvidia
Former user
wrote on 11/17/2017, 7:46 AM
When I completed the 10-bit render on the PC, Vegas bombed out on exit; file creation was OK though. I redid the 10-bit file render: creation OK, and this time Vegas exited without error.
@Former user Thanks for the info, which I've added to the spreadsheet. What Nvidia GPU do you have in your desktop, and is your single monitor plugged into it?
Former user
wrote on 11/17/2017, 12:26 PM
MSI GTX 1080 Z. The single monitor is plugged into this GPU, not the mainboard.
With this setup I can still use Magix AVC/AAC Intel QSV HW render if I choose.
Just a thought: for other users who needed the monitor initially connected to the mainboard Intel graphics, I wonder whether, once it has been detected and is in use, with the Intel drivers now installed on the system, the monitor can be reconnected to the main GPU and still keep HEVC HW render capability.
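For anyone trying that experiment, here is a quick way to see which adapters Windows still reports after re-cabling (a rough sketch using the stock WMI command line; whether VEGAS follows suit is exactly the open question):

```python
import subprocess

# List the video controllers Windows currently sees, with driver versions.
# If the Intel iGPU drops off this list after moving the cable back,
# QSV presumably becomes unavailable to VEGAS as well.
out = subprocess.run(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,Status"],
    capture_output=True, text=True,
)
print(out.stdout)
```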
I did a render test that you can check on your system. Download the original XAVC-S footage from this RX10 M4 clip and render it in Vp15 B216 with the UHDp24 Intel HEVC 40 Mbps default template. This 12;12-long clip renders in 13 sec on my i7-6700k - very close to realtime. This cannot be reached by HEVC CPU rendering.
Here is the same result on the desktop from my signature:
There must be something wrong with my new PC regarding QSV and Intel HEVC: with an Intel i7 8700K (UHD 630) I should get around 13 s render time with the above clip and settings, but it takes me 1 minute 36 s. It looks like a pure SW render. Or Intel HEVC performs worse on the Coffee Lake UHD 630 (or it is not properly detected?).
Mind you, this is with the newer Vegas 15 build 261... and the latest Intel drivers (November).
I must also say that my monitor is connected to the Nvidia card. Maybe I should connect it to the mobo once for a proper detect and see if the render speed improves...
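The gap alone rules out hardware assist; the same arithmetic as earlier (assuming Peter_P's ~12.5 s test clip) lands squarely in the software-only range reported above:

```python
clip_seconds = 12 + 12 / 24   # Peter_P's 12;12 test clip, ~12.5 s
render_seconds = 96           # 1 min 36 s on the i7-8700K

print(f"{render_seconds / clip_seconds:.1f}x realtime")  # ~7.7x: software territory
```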
...I must also say that my monitor is connected to the Nvidia card. Maybe I should connect it to the mobo once for a proper detect and see if the render speed improves...
If you can make any sense of this thread, that seems to be the conclusion. If you check my spreadsheet built from the info in this thread (last updated pre-build 261), those who get QSV speeds with 8-bit HEVC have all got their monitors plugged into the motherboard, or have Intel as "the only active display adapter", whatever that means.
I must also say that my monitor is connected to the Nvidia card. Maybe I should connect it to the mobo once for a proper detect and see if the render speed improves...
Definitely! Some days ago I intentionally disabled QSV on my system by connecting the main UHD monitor to my AMD card. In that case I could render to Intel HEVC 10-bit only CPU-based, which is currently not possible with the Intel GPU enabled.