I've seen the support for Kaby Lake HEVC encoding hyped in the v15 release notes, so I finally upgraded my V12 to V15 but cannot find the HEVC/H.265 QuickSync encoder option - can someone point out where it is / how to enable it?
I have those - they aren't hardware accelerated, or don't appear to be hardware accelerated compared to Handbrake's QuickSync H.265 render rate. Also, these peg the CPU and leave the GPU at 0-1% while rendering.
Former user
wrote on 11/8/2017, 6:28 PM
Forum regular Peter_P claims to use HEVC hardware encoding with VEGAS 15.
Doesn't that suck. I had gotten vegas2handbrake working before I bought - guess that's the way to go for now.
Former user
wrote on 11/8/2017, 7:33 PM
I was a big fan of QSV, but I only used it for screen capturing with Bandicam & it looked just fine. However, I was capturing livestreamed video that was low quality to begin with, often with streaming encode speeds of 3500 kbit/s or less. I captured with QSV in both HEVC & H.264 and didn't see a problem.
But with Vegas, where the source is of extremely high quality, QSV H.264 is of very low quality compared to a software encode. Obviously not HEVC with Vegas, but I wouldn't expect software-encoding quality, so why bother.
Hardware encoding, I've decided, is only reasonable where there is no other option, such as when gaming, where the CPU may be heavily loaded. With an i7, though, it's likely you could still use software encoding just fine. Video processing is a different story.
Unfortunately this 'hardware encoding' promotion that Vegas does is aimed at the ignorant. Ask yourself why Premiere Pro does not use hardware encoding. Is it that Adobe was too stupid to work out how to do it, or did they respect their software too much to implement it? There are dodgy third-party hacked NVENC encoders that break with upgrades and require third-party software to make hardware encoding work, but Premiere Pro deliberately doesn't offer hardware encoding as an option, & for good reason.
I think the "Intel HEVC" codec might use integrated Intel graphics, if we have them on our CPUs, without necessarily telling us. There was some evidence of that from Cornico's posts in this thread, at least in terms of decoding. There's also a clue in the name. But I am guessing.
Anyone know if Video Pro X spells out "hardware-accelerated HEVC encoding" as a separate option from software HEVC encoding?
Former user
wrote on 11/8/2017, 9:37 PM
"Intel HEVC" produces very high quality encodes. It doesn't have the low-quality signature of QSV or NVENC. It also uses all my CPU & seems no faster than normal software encoding. But as you say, calling it "Intel HEVC" when HEVC is not owned by Intel is strange.
Also, when I use software encoding, NVENC hardware encoding or the Intel HEVC codec, I use no more than 0.2 watts of Intel CPU graphics. When I use QSV hardware encoding via MAGIX AVC, it uses 5 watts of Intel CPU graphics. Maybe it's supposed to use QSV, but it doesn't.
Former user
wrote on 11/9/2017, 5:19 AM
Just a guess ...
Perhaps "Intel HEVC" is an Intel codec that can run in software and also in hardware, for example using the Kaby Lake hardware encoding/decoding silicon?
If any users have such a CPU then they could check it out for the hardware implementation. If you have, say, a Kaby Lake CPU and see no HEVC hardware encoding benefit, then perhaps MAGIX intend to implement it within the "Intel HEVC" rendering options in the future?
I did a test using a simple 6-second 4K UHD H.264 test file, and output it to a 6 s 4K UHD HEVC file. The "Intel HEVC" render, at highest quality and half the input bitrate (output file size nearly half), took 170 times the length of the 6 s input file: 17 m 4 s. So anyone in need of that workflow, say for archival purposes, could do with hardware encoding for sure 😀
Meant to say that I used a laptop with a 6700HQ chip.
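For reference, the "170 times" figure is just render time divided by clip length; here is a minimal sketch of that arithmetic (the helper name is my own, not anything from VEGAS):

```python
# Render-time ratio: seconds of encoding per second of source footage.
def render_ratio(render_seconds: float, clip_seconds: float) -> float:
    return render_seconds / clip_seconds

# The 6 s clip above took 17 m 4 s (1024 s) to encode in software:
print(f"{render_ratio(17 * 60 + 4, 6):.1f} : 1")  # about 170.7 : 1
```

The same arithmetic applies to the "Red Car" benchmark figures later in the thread: ratio = render time / clip length.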
Former user
wrote on 11/9/2017, 6:03 AM
I did a test using a simple 6-second 4K UHD H.264 test file, and output it to a 6 s 4K UHD HEVC file. The "Intel HEVC" render, at highest quality and half the input bitrate (output file size nearly half), took 170 times the length of the 6 s input file: 17 m 4 s. So anyone in need of that workflow, say for archival purposes, could do with hardware encoding for sure 😀
You most certainly wouldn't use hardware encoding for archival, as it delivers low quality compared to software encoding. I encode with Intel QSV HEVC hardware in other applications. It doesn't work in Vegas 15.
Newer Intel processors (Skylake and newer) have better quality now for Intel QuickSync; just make sure the quality slider is all the way to the best setting.
Provided the bitrate is reasonable, the quality is very good, hard to tell apart from a software encoder at the same settings, and considerably faster to encode. If you are going for the smallest possible file size then QuickSync just isn't designed for that, and software is better. However, the days of needing to compress HD down to a couple of thousand kbps or less are long gone for the majority of us.
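A rough rule of thumb behind that point: for a given average bitrate, file size is fixed by bitrate and duration regardless of which encoder produced it, so at reasonable bitrates the only real question is quality per bit. A quick back-of-envelope sketch (ignoring audio and container overhead):

```python
def estimated_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    # Megabits per second * seconds = megabits; divide by 8 for megabytes.
    return bitrate_mbps * duration_s / 8

# A 10-minute HD clip at 12 Mbps (a typical target) is roughly 900 MB:
print(f"{estimated_size_mb(12, 10 * 60):.0f} MB")  # 900 MB
```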
Where something like QuickSync excels is in reliability and speed. For example, after hours and hours of encoding using x264 (before QuickSync was really a thing) I kept finding various glitches on playback. Eventually an x264 developer picked up these issues for me, found the bugs using my sample footage, and fixed them. Then a few months later an update would come out for the x264 encoder and I'd come across another glitch or issue.
Now that we have 4K footage to encode, software encoders are just too slow to be practical.
I am one of the developers working on VEGAS, and would like to clarify a few things:
1) "Intel HEVC" will use the QSV hardware, if available, or else use Intel's software implementation. The QSV hardware is not part of the Intel GPU, but a separate dedicated piece of silicon sharing the processor die, so measuring GPU activity will not let you track its usage accurately. Assuming you have a Kaby Lake processor, you should be able to use the QSV hardware for both 8-bit and 10-bit HEVC encoding/decoding. For Skylake processors, only 8-bit HEVC is supported in hardware; 10-bit is done via software (also currently provided by Intel) and is much slower. VEGAS rendering templates allow you to specify the bit depth you want to use. Right now there is no way to turn off the hardware encoding and use only the software encoder for Intel HEVC.
2) VEGAS's support for HEVC should be on-par with what Video Pro X offers, so if you see otherwise, please let us know, or file a support ticket.
Former user
wrote on 11/13/2017, 10:45 AM
Thanks for this clarification, hgala. The reason my 6 s test only runs in software, even though it's a Skylake processor, is that the manufacturer "locked out" the Intel option.
It has an onboard GTX 1070 which is always on - no choice in the matter.
Acer Predator G9-793-77AC, i7-6700HQ Skylake-H.
Is there an HEVC-to-HEVC option in NVENC available? I know that the graphics card supports it in the ASIC.
Yes, we have also seen PCs with discrete graphics, such as yours, having issues using QSV out of the box; it typically requires playing around with BIOS settings to get it turned on. As this varies from system to system, it's hard to give generalized advice about how to achieve this. But in general, if you see Handbrake or other software offer the QSV option, then VEGAS should also do so (and a good way to judge that is to see if you get QSV presets via the MAGIX AVC/AAC renderer).
NVIDIA also supports hardware HEVC rendering via the NVENC API, and we plan to offer this via one of our upcoming updates for VP15 (we are also working on offering AMD support for their VCE encoder via a similar mechanism).
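One hedged way to cross-check QSV availability outside of Handbrake or VEGAS is to ask ffmpeg, assuming you have it installed (`hevc_qsv` is ffmpeg's name for its QuickSync HEVC encoder; note this only shows the encoder was compiled in, not that the hardware will actually initialize on your system):

```python
import shutil
import subprocess

def lists_qsv_hevc(encoders_output: str) -> bool:
    """Return True if ffmpeg's encoder listing mentions hevc_qsv."""
    return any("hevc_qsv" in line for line in encoders_output.splitlines())

ffmpeg = shutil.which("ffmpeg")
if ffmpeg:
    out = subprocess.run([ffmpeg, "-hide_banner", "-encoders"],
                         capture_output=True, text=True).stdout
    print("QSV HEVC encoder listed:", lists_qsv_hevc(out))
else:
    print("ffmpeg not found on PATH; skipping the check")
```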
I have a Kaby Lake processor but get the error others have reported when trying to render 10-bit HEVC. Can you confirm whether you have replicated this, and when the fix might arrive?
It's pretty well locked out: nothing to see in the BIOS to change/enable, and it's not in the MAGIX AVC/AAC encoder. I knew of this limitation before the laptop purchase, so it's not something that I'm too bothered about, especially now that it'll be available via the NVENC API in a future update - really great news.
I mainly export from 4K to FHD, in either H.264 or HEVC - always nice to have options though.
I am one of the developers working on Vegas, and would like to clarify a few things:
1) "Intel HEVC" will use the QSV hardware, if available, or else use Intel's software implementation. The QSV hardware is not part of the Intel GPU, but a separate dedicated piece of silicon sharing the processor die, so measuring GPU activity will not let you track its usage accurately. Assuming you have a Kaby Lake processor, you should be able to use the QSV hardware for both 8-bit and 10-bit HEVC encoding/decoding. For Skylake processors, only 8-bit HEVC is supported in hardware; 10-bit is done via software (also currently provided by Intel) and is much slower. Vegas rendering templates allow you to specify the bit depth you want to use. Right now there is no way to turn off the hardware encoding and use only the software encoder for Intel HEVC.
Just a question: you say that the QSV hardware is not part of the Intel GPU. Is it possible to disable the GPU in the motherboard BIOS and still be able to use QSV? I ask because I usually disable hardware I don't use in the BIOS, to save energy (less heat) and resources. Naturally I have a separate video card...
hgala, this is great to hear. You are saying that since I have QuickSync HEVC via Handbrake, and I have the QuickSync H.264 option in Vegas (MAGIX codecs), using the Intel HEVC codec should already be using the hardware encoder. I will try again - I was expecting a separate QuickSync option to show up, like in the H.264 renderer, and you are saying that's not the case.
I just realised also that, since it's either/or, you may not realise that you're using HW rather than SW render; maybe there's no change to the list of render templates like the doubling/trebling in the MAGIX AVC ones. If the render times are substantially quicker than, say, what I got for the 6-second clip (170 to 1, see above), then you're using HW render anyway!
Former user
wrote on 11/14/2017, 1:00 AM
In my case I have an Intel 6700, which contains the Intel QSV hardware encoding for H.264 and HEVC. When I use capture software, QSV HEVC is an option that appears and I use it. I checked the files and they are HEVC files, & I can tell from my monitoring software that the 'Intel CPU graphics' part of the CPU is being heavily used. Also, the CPU is only at about 25%.
With Vegas and the Intel HEVC codec, the first thing I notice is there is no QSV option in the template, but there is the QSV quality selector (1-7). I render, get 100% CPU, and next to no 'Intel CPU graphics' load (in watts, ranging between 0.0 W & 0.2 W).
The term 'Intel CPU graphics' indicates any use of onboard GPU processing or QSV decoding or encoding in my hardware monitoring software.
There is something wrong with this codec. It's not using Intel QSV, yet QSV works fine with the MAGIX AVC encoder.
Former user
wrote on 11/14/2017, 4:29 AM
Well bob-h, if you have a Skylake processor, if you're only selecting 8-bit and not 10-bit in the Intel HEVC render template, if QSV appears in the MAGIX AVC/AAC render templates, and if you're still getting 100% CPU usage, then maybe take hgala's advice and open a ticket.
You should be able to render 10-bit HEVC on your i7-7700 system. That looks like a bug, and I have entered it in our system.
bitman:
>is it possible to disable the GPU in the motherboard BIOS
I don't think so. In fact, in order to turn on the QSV functionality on one of our systems with a separate discrete graphics card, we had to turn on something which actually referenced the Intel Graphics adapter. From that perspective, you could think of it as part of their GPU.
I ran some tests and can confirm what I now think is a bug - the Intel HEVC codec does not appear to be using QuickSync even though it is available. I also have a Kaby Lake 7700K CPU with 16 GB of DDR4-3600. Thanks for entering this as a bug. Note the performance charts: 1) Intel HEVC has the GPU load at 3%, while 2) vegas2handbrake using H.265 QuickSync has the GPU load over 80%.
A simple render test of a 12-second 1920x1080 59.94 fps file - no filters, all render settings default.
Thanks for the detailed tests. Are you also trying to render 10-bit HEVC, or just 8-bit (the default)?
Former user
wrote on 11/14/2017, 10:55 AM
Intel HEVC test ...
I used the first 30 seconds of the "Red Car" benchmark (later the full length) to get the render-time ratio.
Rendered in software only, on an Intel 6700HQ CPU. (HW N/A, as previously explained - see previous post.)
I rendered the first 30 seconds of the project, output to FHD 29.97 fps, default quality of 4, data rate 12 Mbps (24/12), 8-bit. This gives a 4.73 to 1 ratio. (Also tested a Haswell desktop 4790K, which gives a 3.83 to 1 ratio; the software HEVC option is the only one available for that chip.)
I also did the same project, full length, rendered to 10-bit; the ratio was 5.41 to 1 on the laptop in software.
This benchmark is available for others to test; anyone with hardware support for Intel HEVC should get much better ratios than these.