XAVC S opened my eyes to how poor my GPU acceleration is

megabit wrote on 3/22/2015, 5:17 AM
This is something I couldn't believe at first, but I checked it enough times to know it's a hard fact. Please take a look at my system specs; you can see that, relatively speaking, both my editing platforms have slow CPUs and "not the worst" GPUs (according to the consensus in this forum, and my own belief so far). So I tried OldSmoke's 2 short XAVC S 100 Mbps 4K clips from the AX100 he posted, as well as their XAVC 10-bit 4:2:2 MXF counterparts (created using Catalyst Browse). I observed the consumer "S" version barely touching 23.976 fps on my i7 system with what is considered "the last really working CUDA" card - my Fermi GTX 580; the same system only achieved some 15 fps with the MXFs... Well, I thought - time for a new system :( But then - out of curiosity - I turned my GPU card off in video preferences (I had never done that before with my 1080p material, as I was totally satisfied with my editing experience on this PC, having believed the full fps I was getting was in large part the result of my GTX 580's acceleration)...

Guys, was I shocked! Without any help (or should I say - interference) from the GPU, I got rock solid 24p at Best/Full with the original "S" clips (even with 3 FXs), and almost the same fps (only sometimes slower) with their mxf counterparts!

So, I tried the same with my System #2 (the same-generation Sandy Bridge, a 4-year-old laptop) - and... almost the same results after switching my Quadro 4000M off! Of course this was a bit slower - especially the full XAVC MXFs, which only played at some 20-21 fps - but this is just a laptop after all, and its CPU speed is rated 4x 2.7 GHz vs. 4x 4.4 GHz on my slightly OC'ed desktop!

So really - I'm not sure about other formats, as with the 1080p/25fps I mainly used for years I had no complaints - but with 4K XAVC (S) at 24p, both my CPUs are enough for quite comfortable editing of a simple project (no MC, of course), while the mere presence of what I thought were competent GPU accelerators slows it all down to a crawl...

Now my question is: where is all that power going, and why does it slow down pure-CPU performance so badly? After all, when the GPUs are used, the GPU-Z utility shows a solid 80% GPU load!

This makes me think some magic toggle must exist within the internal settings, as this test defies all logic :(

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

farss wrote on 3/22/2015, 6:11 AM
Why do you think it defies all logic?
There's no "magic" in a GPU, and getting data to and from it uses bus bandwidth, so if you've already got a fast CPU then forcing the code to use the GPU will slow the processing down. Remember that with video you're shifting data from RAM through the CPU to the GPU and back to the CPU. Some dedicated systems do send the data from the CPU to the GPU and then directly to the display, but that's not how Vegas does it as far as I know.
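The bus-bandwidth point can be put into rough numbers. A back-of-the-envelope sketch in Python - the pixel format, the practical PCIe 2.0 x16 bandwidth, and the assumption that each frame crosses the bus twice are my own illustrative figures, not Vegas internals:

```python
# Rough cost of shuttling uncompressed 4K frames across the PCIe bus.
# All constants are assumptions for illustration, not measured Vegas behavior.

FRAME_W, FRAME_H = 3840, 2160
BYTES_PER_PIXEL = 16          # 4 channels x 32-bit float, a common GPU pipeline format
FPS = 23.976
PCIE2_X16_GBPS = 8.0          # ~8 GB/s practical for a PCIe 2.0 x16 slot (GTX 580 era)

frame_bytes = FRAME_W * FRAME_H * BYTES_PER_PIXEL
# upload to the GPU plus readback to the CPU, as Bob describes
per_frame_transfer_ms = 2 * frame_bytes / (PCIE2_X16_GBPS * 1e9) * 1e3

print(f"one frame: {frame_bytes / 1e6:.0f} MB")
print(f"transfer alone: {per_frame_transfer_ms:.1f} ms/frame")
print(f"budget at {FPS} fps: {1000 / FPS:.1f} ms/frame")
```

With these assumed numbers, transfers alone eat roughly 33 ms of the ~42 ms frame budget before any decoding or effects are done - which is the shape of the problem even if the real figures differ.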

Bob.
megabit wrote on 3/22/2015, 6:31 AM
Sure thing, Bob - I've always been aware of the overhead of wrangling data - but I thought (judging from good experience with 1080p) that my particular cards (especially the GTX 580) were just right for the i7-4600K CPU, and would accelerate the whole machine rather than slow it down...

Piotr


OldSmoke wrote on 3/22/2015, 7:37 AM
Piotr

If you get better performance with the XAVC S files without the GPU, then there is something wrong with your hardware or with Vegas. Also, is your CPU a 4700K, or the 2600K your specs state?

Also, the MXF files play much better on my system than the XAVC S files, and that is both with and without the GPU.

You may want to try resetting Vegas to factory settings; careful, though, as it resets any customization you may have done.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

megabit wrote on 3/22/2015, 7:54 AM
This is exactly what I suspect - but the situation is similar on my two totally different systems, so I don't think some Vegas internal setting is the culprit here.

Oh, and it's just the i7-2600K, so the GTX 580 should really accelerate it efficiently...

Piotr


megabit wrote on 3/22/2015, 8:17 AM
OldSmoke,

If you have 2 GPU cards not in SLI mode, how do you make Vegas use BOTH? You have to select ONE in the Properties, and only that one will be used - enlighten me if I'm wrong :)

Piotr


OldSmoke wrote on 3/22/2015, 8:38 AM
It is a bit tricky, especially if both cards are the same make and model. The trick is to select in preferences the card that is in the logically second slot. Even SCS doesn't know why it works, but it does. With the R9 290, however, it doesn't seem to matter; both cards always show load.

What driver are you using with the GTX 580? I only trusted two drivers, 296.10 and later 334.89. 334.89 also worked under Win 8.1 when I briefly used that OS, together with my OpenCL memory tweak.


megabit wrote on 3/23/2015, 1:29 AM
Thanks OldSmoke - so it's an undocumented and unsupported feature, and I reckon it may work with one mobo/GPU combination but not with another...

But really - I spent the whole day yesterday trying to figure out what can cause a GPU to actually slow timeline playback down, and I'm out of ideas. I never participated in any of the numerous threads in this forum on how GPUs later than the original Fermi cards fail to accelerate Vegas, and are often switched off in Preferences by many VP users - simply because with my prevailing format so far (XDCAM EX 1080/25p), I clearly saw both my Fermi cards working (the Quadro in my Dell laptop as well as the GTX 580 in my i7-2600K desktop). Now, with the 4K XAVC S 24p files, I guess I've hit a threshold of resolution and compression beyond which both cards actually slow playback down! I wrote "resolution" and not bitrate, as even with 220 Mbps nanoFlash files (1080/25p, like the EX1), I always felt a positive difference with the GPU on...

Something I don't get, though, is that even people with faster CPUs than my i7 (currently at 4.5 GHz) reported their Fermi cards accelerating timeline playback - even with the 4K material many of us have started testing lately. Does anyone have any idea what setting I should try changing to get some acceleration?

It's important to me, as - with my plans to get rid of all my HD rig for the reasons I mentioned here, and to use VP for my own single-camera hobbyist editing of the consumer XAVC S codec - I only need ever so slight an acceleration for a great editing experience (I'm getting the full 23.976 fps with CPU only, but without any serious effects, grading, CC or the like)...

Piotr

PS. What really bugs me the most is that while my i7 @ 4.5 GHz - not superfast by today's standards - is not a slow CPU either, I'd never have suspected that my M6600 laptop's i7 @ 2.7 GHz (OK, it almost always runs in Turbo mode, which for all 4 cores is 3.2 GHz) would be enough for full-speed playback of XAVC S in Vegas! And the funniest thing is that, with the same test clips, upon switching GPU "acceleration" on, Vegas slows down by exactly the same amount (from 24 down to 15-ish fps) on both systems, even though the GPUs are also rather different...


OldSmoke wrote on 3/23/2015, 6:53 AM
Piotr

I think it's the way the XAVC codec has been implemented in Vegas. XDCAM is still an MPEG codec that has been around for many years and is very well implemented in Vegas. XAVC support first came with VP12, but as you know, to this day you can't use PXW-X70 files natively in Vegas. I am, however, surprised that the larger MXF files perform worse on your system than on mine; maybe it's the better OpenCL support of the R9 290 in my system.


farss wrote on 3/23/2015, 8:58 AM
[I]"Sure thing, Bob - I've always been aware of the overhead of wrangling data - but I thought (judging from good experience with 1080p) that my particular cards (especially the GTX 580) were just right for the i7-4600K CPU, and would accelerate the whole machine rather than slow it down..."[/I]

I think you're missing my point.
Your assumption that the GPU will simply accelerate the machine is too general.
Using the GPU has costs and benefits.
In some scenarios the benefits outweigh the costs, and using the GPU will accelerate the process.
In a different scenario, e.g. 4K with a different codec, the cost/benefit ratio could easily change, and the effect of using the GPU would be to slow the process down.

Without detailed knowledge of what's imposing the costs and what the benefits are, it's impossible to know what the outcome will be for every scenario; even a faster GPU, or one with more RAM, may not change the cost/benefit ratio in all scenarios.
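This cost/benefit trade-off can be sketched as a toy model - all numbers below are invented for illustration, not measurements of Vegas or these GPUs:

```python
# Toy cost/benefit model: offloading a frame to the GPU only helps when the
# compute time saved exceeds the transfer overhead added. Numbers are invented.

def gpu_helps(cpu_ms: float, gpu_speedup: float, transfer_ms: float) -> bool:
    """True if processing a frame on the GPU is a net win over CPU-only."""
    gpu_ms = cpu_ms / gpu_speedup + transfer_ms
    return gpu_ms < cpu_ms

# 1080p-like case: small frames, cheap transfers -> GPU wins
print(gpu_helps(cpu_ms=40.0, gpu_speedup=4.0, transfer_ms=8.0))   # True
# 4K-like case: four times the pixels to move, same speedup -> transfer swamps the gain
print(gpu_helps(cpu_ms=40.0, gpu_speedup=4.0, transfer_ms=33.0))  # False
```

The same card with the same speedup flips from a win to a loss purely because the per-frame transfer cost grew with resolution - which matches what Piotr is seeing.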


Bob.
megabit wrote on 3/23/2015, 9:36 AM
Your "it may be one way or another" statement is true, Bob - so you're right again :)

In fact, after lots of trial and error, I finally did find one scenario where my GTX 580 actually does accelerate my oldish machine - and it's Preview/Half mode with a bunch of FXes that slow CPU-only playback down to a 1-3 fps crawl. Turn GTX 580 acceleration on, and boom! - it's back up at the full 23.976 fps.

Quite different from the other codecs, where the acceleration effect of the same GPU was most pronounced at Best/Full!

Piotr

PS. If only my 580 supported full 4K output resolution (or my mobo were good for two 16x PCIe cards), I would be set for some basic XAVC S edits...

PPS. When I think of it, my "discovery" could suggest that at Best/Full resolution there is not enough VRAM on the card (but it has 3 GB, and GPU-Z only shows some 300+ MB of dedicated memory load). This makes me think there must be some "resolution memory aperture" setting somewhere in VP internal settings that should be upped for 4K... I tried the "OpenCL memory size filter" of 384 MB, but raising it didn't have any influence - maybe it only works with AMD?
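The VRAM theory is easy to put rough numbers on. A quick sketch - the 4-channel float pixel format is my assumption about how a GPU pipeline might hold frames, not a documented Vegas detail:

```python
# How much memory uncompressed 4K frame buffers need, assuming a
# 4 x float32 channel layout (an assumption, not Vegas internals).

def frame_buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one uncompressed frame buffer in megabytes."""
    return width * height * bytes_per_pixel / 1e6

uhd_float = frame_buffer_mb(3840, 2160, 16)
print(f"one 4K float frame:   {uhd_float:.0f} MB")
print(f"three frames in flight: {3 * uhd_float:.0f} MB")
```

Under these assumptions a single 4K float frame is about 133 MB, and just three buffers in flight already exceed a 384 MB cap - in the same ballpark as the 300+ MB of dedicated memory load GPU-Z reports, so the theory isn't absurd even if the real buffering scheme differs.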

PPPS: Preview/Half - isn't that a resolution of 1920x1080, i.e. exactly where the GTX 580 always shone? Hmm...


OldSmoke wrote on 3/23/2015, 10:43 AM
If only my 580 supported full 4K output resolution (or my mobo were good for two 16x PCIe cards), I would be set for some basic XAVC S edits...

I wouldn't let that stop me from editing 4K. Previewing 4K as HD or 2K is sufficient to make all the necessary edits.

By the way, have you thought of how you will be viewing 4K? I mean how you will be delivering it?


megabit wrote on 3/23/2015, 10:55 AM
Good question, the last one.

But with my intended usage, I will be delivering it only to myself, so no problem at all (other than the lack of a 4K-resolution card) :) Disk space is so cheap these days.

Piotr

PS: Of course I'm still going to "professionally" edit 1-2 MC classical music DVDs per year, for which my PC is good with up to 8 HD cameras, and those actually sell quite well. It's only 4K that I'm considering a hobby, and trying to get into without any serious investment (a 4K monitor/UHDTV and the AX100 camera being the only purchases I'm planning, and selling off the HD equipment I can't use anymore should buy me these, more or less).


NormanPCN wrote on 3/23/2015, 3:03 PM
I wouldn't let that stop me from editing 4K. Previewing 4K as HD or 2K is sufficient to make all the necessary edits.

Very true. I once posted a link here to a YT video about the production of the film The Girl with the Dragon Tattoo, a film done soup-to-nuts in digital 4K.

The film was fully cut from 1080 media rendered from the RED source. After the cut, grading and effects were done on DPX image sequences.

videoITguy wrote on 3/23/2015, 3:19 PM
"Deliver to myself" is really not an answer to the question. How do you expect to view 4K productions, or even just footage, in a 4K space - certainly not just on the camera viewfinder, not possible in the Vegas timeline preview window, and most problematically on scaled burns via the Blu-ray channel?

What is your response, megabit?

megabit wrote on 3/24/2015, 2:50 AM
Well - I guess VLC is my response (for a while)?

Piotr
