The Pro cards are available with more RAM, but I don't think Vegas benefits tremendously from that. I've been hearing they're more stable with multi-monitor setups. They seem to be in stock at both B&H and Newegg.
I saw the Techgage review a couple of days ago and was trying to get some AMD 6800 XTs over the Black Friday / Cyber Monday weekend, but without success... IMO they are probably the best bang/buck for Vegas if you can find them at standard retail. Best Buy is one of the few retailers not charging over retail, and they have a couple for $650 US. They use a manual CAPTCHA login process to deter bots, but you still have to place your order fast before the scalpers get them. I got a 3060 Ti earlier this year from Best Buy just by checking their website every day at noon ET. My plan is to resell my aging VEGA 56 & 64 while prices are high. Miners are still paying big bucks for VEGAs because, although more power-hungry, they have some good mining numbers...
Is Vegas using some flavor of OpenCL? It would be odd for it to skew so hard towards AMD otherwise.
Former user
wrote on 11/30/2021, 8:00 PM
Who wants to analyze what's going on here?
Why the 60-80 second difference in encoding time between the 6000-series AMDs and the 3000-series Nvidias?
Because the 3060 Ti and 3090 have basically the same encode speed, we know very little of the GPU is being used, and you wouldn't expect it to be with a basic transcode. It's a similar story with the 6000-series AMDs, with a bit of strangeness in the figures, but they're roughly similar.
Is it that the AMD hardware encoder is faster, or is Vegas doing some terribly inefficient translation to use CUDA on Nvidia GPUs? If any of you have the AMD cards shown here, do you know what your maximum GPU encode speed is for 4K HEVC?
My Nvidia will encode 1 minute of 4K 59.94 fps AVC to 4K 59.94 fps HEVC in about 41 seconds using Shutter Encoder with NVENC. If the AMD encoders are not faster, then I guess it's latency that Vegas introduces when using Nvidia GPUs?
Comparison: 1 minute of 4K AVC 59.94 fps to 4K HEVC 59.94 fps (hardware encoding) using an Nvidia RTX 3070
Shutter Encoder: 41 s
DaVinci Resolve: 60 s
VP18: 1m 11s (Voukoder)
VP18: 1m 55s (MAGIX HEVC)
VP19: 1m 15s (Voukoder)
VP19: 1m 55s (MAGIX HEVC)
If anyone with an AMD card shown on the list wants to do any comparisons, I would be especially interested to see whether AMD users get the same huge time difference between MAGIX HEVC and Voukoder.
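For anyone who wants to reproduce the raw NVENC baseline outside any NLE: Shutter Encoder is an FFmpeg front-end, so something like this rough sketch should get you in the same ballpark (file names and the 40M target are placeholders of mine, not anyone's exact settings):

import subprocess
import time

cmd = [
    "ffmpeg", "-y",
    "-i", "1min_avc_5994.mp4",   # placeholder source clip
    "-c:v", "hevc_nvenc",        # NVIDIA hardware HEVC encoder
    "-rc", "vbr", "-b:v", "40M", # roughly match the source bitrate
    "-c:a", "copy",              # leave audio untouched
    "out_hevc_5994.mp4",
]

start = time.perf_counter()
subprocess.run(cmd, check=True)
print(f"Encode took {time.perf_counter() - start:.1f} s")

If the wall-clock time here is close to Shutter Encoder's 41 s, any extra time an NLE adds is overhead on top of the hardware encoder itself.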
Former user
wrote on 11/30/2021, 8:44 PM
Is Vegas using some flavor of OpenCL? It would be odd for it to skew so hard towards AMD otherwise.
Unless things have changed, Vegas is optimized for the OpenCL 2.0 framework, a standard from 2013. Maybe the newer Nvidia GPUs that work so well with OpenCL need a newer version of OpenCL to work efficiently, and Vegas is too outdated, which gives modern AMD GPUs a big lead.
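If you want to check which OpenCL version your drivers actually expose, a quick sketch using pyopencl (assumes pip install pyopencl):

import pyopencl as cl

# Each vendor driver registers its own platform; the version strings show
# what the runtime supports, e.g. "OpenCL 3.0 CUDA" on recent NVIDIA drivers.
for platform in cl.get_platforms():
    print(platform.name, "-", platform.version)
    for device in platform.get_devices():
        print("  ", device.name, "-", device.version,
              "(driver", device.driver_version + ")")

Note this only tells you what the driver offers, not which version Vegas was actually built against.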
This shows Nvidia doing very well with modern OpenCL software like Blender @zaubermac
Nvidia has worked to include better OpenCL support in its drivers, so as not to get spanked too hard in these discussions! Adobe products produced more balanced results, I'm guessing because they have the resources to develop both CUDA and OpenCL engines. I'm quite fine, though, not paying a bijillion dollars a year to rent software in exchange for slightly slower benchmarks on my Nvidia card. 😀
That was a great article, btw. I've been out of the loop on Vegas for the last couple of years. Nice to see it still gets some love beyond the users' keyboards!
Why the 60-80 second difference in encoding time between the 6000-series AMDs and the 3000-series Nvidias?
I agree that this is odd behavior, considering NVIDIA's encoder has been revered for a while, but the same thing is seen in Premiere Pro's AVC > HEVC test:
I believe VEGAS uses CUDA for NVIDIA, not OpenCL, but I could be wrong. It is using NVENC, in any case. I gave Voukoder a quick test, but it isn't working properly for AMD. Here are the outputs:
I might be overlooking something, but I can't see where you'd set the bitrate in Voukoder. Using the default settings, the AMD bitrate is atrociously low, whereas NVIDIA's is not. But... the AMD dialog box doesn't even have a preset available like the NVIDIA one does:
The encode time for Voukoder with AMD was essentially identical to MAGIX (but again, the output was junk). Here's the NVIDIA result:
NVIDIA GeForce RTX 3070
MAGIX AVC: 143 s
MAGIX HEVC: 181 s
Voukoder AVC: 97 s
Voukoder HEVC: 97 s
Former user
wrote on 12/1/2021, 6:18 PM
@Deathspawner Thanks for trying Voukoder and for those results. Nvidia is starting to be left in the dust as far as encoding technology goes: the AMD 6000 series has a faster HEVC encoder, the Apple M1 can do 140 fps 4K HEVC, and the UHD 770 iGPU on 12th-gen Intel CPUs has the HEVC 4:2:2 10-bit decoder plus an upgraded encoding engine. I'm not sure how that will be realized in applications, though, as it looks to offer higher parallel encoding capacity rather than the straight speed increase the M1 delivered.
Do you recall if your 4K HEVC hardware encoding with the 6000 series in Vegas had the same sawtooth pattern as in this picture? It will most likely be a sawtooth, but I'm thinking more condensed, with the teeth closer together. It would be hard to compare. In Task Manager I have View > Update speed set to High.
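Task Manager's High setting still only samples about twice a second, so the teeth blur together. If anyone wants finer resolution on the sawtooth, a rough sketch that polls nvidia-smi instead (NVIDIA only; the field names are from nvidia-smi --help-query-gpu and I'm assuming a reasonably recent driver):

import subprocess
import time

# utilization.gpu is overall GPU load; encoder.stats.averageFps reports the
# average throughput of active NVENC sessions (shows N/A when nothing encodes).
FIELDS = "utilization.gpu,encoder.stats.averageFps"

for _ in range(100):  # ~10 seconds of samples at 0.1 s intervals
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        text=True,
    ).strip()
    print(f"{time.monotonic():9.2f}  {out}")
    time.sleep(0.1)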
Former user
wrote on 12/1/2021, 8:48 PM
Hi, I'm just trying this. I have a 4K 1-minute clip, and it says Frame rate mode: Variable.
General
Complete name : D:\desk stuff\Nvidia screen capture\Desktop\1min avc 59.94.mp4
Format : MPEG-4
Format profile : Base Media / Version 2
Codec ID : mp42 (isom/mp42)
File size : 288 MiB
Duration : 1 min 0 s
Overall bit rate mode : Variable
Overall bit rate : 40.2 Mb/s
Encoded date : UTC 2021-12-02 02:27:40
Tagged date : UTC 2021-12-02 02:27:40

Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L5.2
Format settings : CABAC / 3 Ref Frames
Format settings, CABAC : Yes
Format settings, Reference frames : 3 frames
Format settings, GOP : M=1, N=30
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 1 min 0 s
Bit rate : 40.0 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Variable
Frame rate : 59.940 (59940/1000) FPS
Minimum frame rate : 59.940 FPS
Maximum frame rate : 60.000 FPS
Original frame rate : 59.940 (60000/1001) FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.081
Stream size : 286 MiB (100%)
Language : English
Encoded date : UTC 2021-12-02 02:27:42
Tagged date : UTC 2021-12-02 02:27:42
Color range : Limited
Codec configuration box : avcC

Audio
ID : 2
Format : AAC LC
Format/Info : Advanced Audio Codec Low Complexity
Codec ID : mp4a-40-2
Duration : 59 s 989 ms
Bit rate mode : Variable
Bit rate : 192 kb/s
Maximum bit rate : 333 kb/s
Channel(s) : 2 channels
Channel layout : L R
Sampling rate : 48.0 kHz
Frame rate : 46.875 FPS (1024 SPF)
Compression mode : Lossy
Stream size : 1.35 MiB (0%)
Language : English
Encoded date : UTC 2021-12-02 02:27:42
Tagged date : UTC 2021-12-02 02:27:42
Former user
wrote on 12/1/2021, 8:54 PM
I wanted to try it at a constant frame rate, but I don't think I'm doing it right, because it still says Variable?
I don't normally ever use this HandBrake thingy.
General
Complete name : D:\desk stuff\Nvidia screen capture\Desktop\1Min Avc 59.94-1.mp4
Format : MPEG-4
Format profile : Base Media / Version 2
Codec ID : mp42 (mp42/iso2/avc1/mp41)
File size : 145 MiB
Duration : 1 min 0 s
Overall bit rate : 20.3 Mb/s
Encoded date : UTC 2021-12-02 02:50:14
Tagged date : UTC 2021-12-02 02:50:14
Writing application : HandBrake 1.4.2 2021100300

Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : Main@L4
Format settings : CABAC / 1 Ref Frames
Format settings, CABAC : Yes
Format settings, Reference frames : 1 frame
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 1 min 0 s
Bit rate : 20.1 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Variable
Frame rate : 59.940 (60000/1001) FPS
Minimum frame rate : 59.920 FPS
Maximum frame rate : 59.960 FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.040
Stream size : 144 MiB (99%)
Writing library : x264 core 163 r3059 b684ebe0
Encoding settings : cabac=1 / ref=1 / deblock=1:0:0 / analyse=0x1:0x111 / me=hex / subme=6 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=0 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=0 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=67 / lookahead_threads=11 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=0 / weightp=1 / keyint=600 / keyint_min=60 / scenecut=40 / intra_refresh=0 / rc_lookahead=30 / rc=crf / mbtree=1 / crf=22.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / vbv_maxrate=20000 / vbv_bufsize=25000 / crf_max=0.0 / nal_hrd=none / filler=0 / ip_ratio=1.40 / aq=1:1.00
Encoded date : UTC 2021-12-02 02:50:14
Tagged date : UTC 2021-12-02 02:50:14
Color range : Limited
Color primaries : BT.709
Transfer characteristics : BT.709
Matrix coefficients : BT.709
Codec configuration box : avcC

Audio
ID : 2
Format : AAC LC
Format/Info : Advanced Audio Codec Low Complexity
Codec ID : mp4a-40-2
Duration : 59 s 990 ms
Source duration : 1 min 0 s
Bit rate mode : Constant
Bit rate : 161 kb/s
Channel(s) : 2 channels
Channel layout : L R
Sampling rate : 48.0 kHz
Frame rate : 46.875 FPS (1024 SPF)
Compression mode : Lossy
Stream size : 1.15 MiB (1%)
Source stream size : 1.15 MiB (1%)
Title : Stereo
Language : English
Default : Yes
Alternate group : 1
Encoded date : UTC 2021-12-02 02:50:14
Tagged date : UTC 2021-12-02 02:50:14
mdhd_Duration : 59989
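For what it's worth, the "Variable" tag may not be HandBrake's fault: 59.94 fps needs 1001-based frame durations, and an MP4 timescale that can't represent those exactly produces alternating durations that MediaInfo reports as variable (note your min/max are only 59.920-59.960). If you want to try forcing CFR without HandBrake, a rough FFmpeg sketch (file names are my placeholders; newer FFmpeg builds spell -vsync cfr as -fps_mode cfr):

import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-i", "1min_avc_5994.mp4",    # placeholder source name
    "-r", "60000/1001",           # exact NTSC 59.94 rate
    "-vsync", "cfr",              # duplicate/drop frames onto a fixed grid
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "1min_avc_5994_cfr.mp4",
], check=True)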
Former user
wrote on 12/1/2021, 9:28 PM
The 1-minute AVC clip I rendered from Vegas at the default settings for MAGIX AVC/AAC.
MAGIX HEVC default settings: 1m 30s render time.
MEP Premium, just out of curiosity (I can change any settings if you think it's worth it): just over 0:51 (the export box goes off, so probably about 0:53).
Do you recall if your 4K HEVC hardware encoding with the 6000 series in Vegas had the same sawtooth pattern as in this picture? It will most likely be a sawtooth, but I'm thinking more condensed, with the teeth closer together. It would be hard to compare. In Task Manager I have View > Update speed set to High.
@Former user I sanity checked with NVIDIA first to see if it matched your usage:
It looks the same for the most part, but my encode uses the 3D engine a lot more, it seems. That's despite the project having no filters applied at all.
Here's AMD:
Both appear to encode using the hardware the same way, which almost makes it seem like Voukoder doesn't support this particular AMD architecture and instead defaults to the built-in MAGIX HEVC engine. I could be wrong. Still, the problem with Voukoder remains: (a) performance is identical, and (b) the resulting bitrate is unusable.
AMD GPUs appear differently in Task Manager. There were options for Video Decode and Video Encode, but they didn't budge while the encode was running; only "Video Codec" showed usage.
I think I might need to reach out to AMD and NVIDIA to inquire about this a bit more.
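One quick check before reaching out: as far as I know, Voukoder is FFmpeg-based, so it's worth seeing whether an FFmpeg build even exposes AMD's AMF encoders. A rough sketch (assumes an ffmpeg binary on PATH; Voukoder ships its own build, so this is only an indirect indication):

import subprocess

# List all encoders and keep only the hardware-accelerated ones. If hevc_amf
# is missing from a build, AMD hardware HEVC simply isn't available to it.
out = subprocess.check_output(
    ["ffmpeg", "-hide_banner", "-encoders"], text=True)

for line in out.splitlines():
    if any(tag in line for tag in ("nvenc", "amf", "qsv")):
        print(line.strip())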
Former user
wrote on 12/1/2021, 9:52 PM
AMD GPUs appear differently in Task Manager. There were options for Video Decode and Video Encode, but they didn't budge while the encode was running; only "Video Codec" showed usage.
Hmm, in my pic above only Decode shows for the RTX 3090, but is that because I have an AMD CPU?
NVIDIA cards show both Encode and Decode in Task Manager. If it's not showing Decode, the card isn't being used for decoding, probably due to the media type or the settings in File I/O.
Why the 60-80 second difference in encoding time between the 6000-series AMDs and the 3000-series Nvidias?
I agree that this is odd behavior, considering NVIDIA's encoder has been revered for a while, but the same thing is seen in Premiere Pro's AVC > HEVC test:
I believe VEGAS uses CUDA for NVIDIA, not OpenCL, but I could be wrong. It is using NVENC, in any case. I gave Voukoder a quick test, but it isn't working properly for AMD. Here are the outputs:
I might be overlooking something, but I can't see where you'd set the bitrate in Voukoder. Using the default settings, the AMD bitrate is atrociously low, whereas NVIDIA's is not. But... the AMD dialog box doesn't even have a preset available like the NVIDIA one does:
The encode time for Voukoder with AMD was essentially identical to MAGIX (but again, the output was junk). Here's the NVIDIA result:
NVIDIA GeForce RTX 3070
MAGIX AVC: 143 s
MAGIX HEVC: 181 s
Voukoder AVC: 97 s
Voukoder HEVC: 97 s
In Voukoder you don't set a bitrate, you set a rate factor. Try the recommended settings for x264 rather than the defaults.
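For anyone unfamiliar with rate factor: instead of targeting a bitrate, you pick a quality level and the encoder spends whatever bitrate that quality requires. The same idea in FFmpeg terms, as a sketch (CRF 20 is just my illustrative value, not Voukoder's recommended setting; file names are placeholders):

import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-i", "1min_avc_5994.mp4",   # placeholder source name
    "-c:v", "libx264",
    "-crf", "20",                # lower = higher quality; typical range 18-28
    "-preset", "medium",
    "-c:a", "copy",
    "out_crf20.mp4",
], check=True)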
MEP Premium, just out of curiosity (I can change any settings if you think it's worth it): just over 0:51 (the export box goes off, so probably about 0:53).
MEP is impressive, 10 seconds faster than Resolve Studio could do the transcode for me.
With your Vegas HEVC render, I think your result is so much faster because of the high amount of CPU the HEVC NVENC render was using for me, which I wasn't expecting: peaks around 70% from what I could see, and in my experience when you see 70% you can already be CPU-bound (it's similar with GPUs). Your Threadripper would have breezed through it. It's interesting how the shapes of the encoding spikes differ in the Vegas renders: mine are entire shark teeth, whereas yours is 50% gums, 50% teeth, where you seemingly have constant 50% GPU encoder activity, although maybe there isn't enough resolution to see.
Former user
wrote on 12/2/2021, 3:02 AM
@Former user I sanity checked with NVIDIA first to see if it matched your usage:
It looks the same for the most part, but my encode uses the 3D engine a lot more, it seems. That's despite the project having no filters applied at all.
Here's AMD:
Both appear to encode using the hardware the same way, which almost makes it seem like Voukoder doesn't support this particular AMD architecture and instead defaults to the built-in MAGIX HEVC engine. I could be wrong. Still, the problem with Voukoder remains: (a) performance is identical, and (b) the resulting bitrate is unusable.
AMD GPUs appear differently in Task Manager. There were options for Video Decode and Video Encode, but they didn't budge while the encode was running; only "Video Codec" showed usage.
I think I might need to reach out to AMD and NVIDIA to inquire about this a bit more.
It looks to me like AMD hardware encoding works fine in Vegas, but Nvidia hardware encoding does not, and it's most apparent when you're not actually doing anything with the GPU or CPU that increases latency, just transcoding. It then becomes a test of the hardware encoder, and any bugginess in the hardware encoder becomes obvious.
That creates a big problem for you as a GPU reviewer: is benchmarking Vegas a reasonable test of an Nvidia GPU when we know there is a delay every 60 frames that doesn't exist with the AMD hardware encoder? Using Voukoder for Nvidia cards fixes the internal Vegas hardware encoder problem, but then are you being unfair to AMD GPUs? Worse, you can't even do a separate test with Voukoder because it doesn't work with 6000-series AMD cards, and even if you could, would you bother when it's a review of a product other than Vegas itself?
This shows the delay every 60 frames, where it appears Vegas stops everything (CPU and GPU processing, encoding and decoding), creating the spikes that don't happen with AMD hardware encoding. All of that extra latency gives you results where AMD appears to be the better GPU. @Deathspawner
@Former user That second paragraph makes me feel like you went straight into my head and pulled my thoughts out. There's a possibility Voukoder would be a worthwhile benchmark on its own, but obviously not right now if it's not using the AMD GPU correctly.
Out of curiosity, is it common to have to re-set Voukoder's chosen encoder every time you launch VEGAS? On each launch, the default is x264 (CPU), even though I had it set to one of the GPU encoders. And is every encoder in Voukoder unique to Voukoder, or does it source other projects (like FFmpeg) to implement them?
I decided to do this same test with Premiere Pro, and despite NVIDIA still falling behind AMD in the AVC > HEVC test, there's no sawtooth pattern like what was seen with MAGIX's encoder:
There's a lot that's strange here. I might ping VEGAS and NVIDIA both to inquire more about this. It'd be nice to see improved NVIDIA performance with what's basically a default encoder with VEGAS.