VP13 - External preview crash

Frans Meijer wrote on 5/26/2014, 7:44 AM
Vegas Pro 13 crashes when I try to activate the preview on the second monitor. This may be related to the setup, which is slightly non-typical: my primary monitor is attached to a (Crossfire) AMD 7750 GPU, while the secondary is attached to the on-board, Intel i3-based GPU. This setup was chosen to enable Intel QuickSync capture.

Comments

ushere wrote on 5/26/2014, 8:27 PM
similar set up but with gtx650 and hd4600. no problem at all.

however, i'm using them the other way around, i.e. prim=gtx, sec=on board.

fuller explanation: system boots on internal 4600, windows then sets prim as gtx.
Steve Mann wrote on 5/26/2014, 10:23 PM
Try turning off GPU support.
Frans Meijer wrote on 5/27/2014, 4:28 AM
Thanks for the replies.

Turning off GPU support (in the preview settings) did not solve the problem.

Unless I understand you wrong, I have it set up like you: the two discrete AMD GPUs as primary and the onboard Intel HD as secondary for preview. The system/BIOS does boot on the primary (the AMD).

If I find one, I will try to attach a third monitor to the AMD card and use that as the preview monitor.
ushere wrote on 5/27/2014, 9:16 AM
i know it sounds like the usual nonsense, but it might help to know a bit more about your system...

I have the two discrete AMD GPU's

you're running an sli configuration?
OldSmoke wrote on 5/27/2014, 9:35 AM
AMD/ATI runs Crossfire, Nvidia does SLI.

Have you tried the Catalyst 13.12 driver? I use it with great success with an HD6970. Also, disable Crossfire.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Steve Mann wrote on 5/28/2014, 11:01 PM
"AMD/ATI runs Crossfire, Nvidia does SLI."

And Vegas uses neither. Only one GPU is ever used.
Frans Meijer wrote on 5/29/2014, 5:53 AM
Thanks for your attention.

Yes, I am running two AMD cards in Crossfire. The Intel i3's HD graphics is also active (and attached to a monitor) to enable Intel QuickSync for video capture.

I have attached a third monitor, so now two are attached to the discrete AMD card(s) and one to the motherboard connector for the Intel HD. Setting the preview to the second AMD monitor works just fine. Changing the preview to the Intel HD-attached monitor crashes VP13.

So, somehow it doesn't like to use the Intel, or possibly can't handle mixed drivers, or something.
OldSmoke wrote on 5/29/2014, 8:38 AM
And Vegas uses neither. Only one GPU is ever used.

Sorry, but that is not true. Vegas does use both, and both are used for acceleration. I have 2x GTX580 in my system and 1x HD6970. I can select the HD6970 for general GPU acceleration, and when rendering to MC AVC I can select CUDA. In this case Vegas renders the fastest possible on my system: 27 sec. for the SCS benchmark project, and I can clearly see 50-80% load on both cards. I can also select one of the GTX580s for general acceleration and then CUDA for MC AVC. Again, both GTX580s are used with a load of 50-80%, with a render time of 29 sec.; that is my runner-up in render times. If I select OpenCL in MC AVC it is another 1 sec. behind.

However, I suspect that the slower render time with the HD6970 comes from the fact that this card is in a PCIe x8 slot, whereas the two GTX580s are in PCIe x16 slots. My motherboard can run 2x PCIe x16, but with three cards it becomes 2x PCIe x16 plus 1x PCIe x8. It is important to understand that only socket 2011 and socket 1366 boards can handle more than 1x PCIe x16; all other boards will fall back to x8 when a second card is in a PCIe slot.

In neither of the above do I enable SLI. Vegas really doesn't like it, and it should always be disabled or set to "Activate All Displays" in the Nvidia Control Panel. I don't have a CrossFire setup, but I would assume the same applies; if I were the OP, I would disable it even if it meant physically removing the CrossFire bridge.

Frans Meijer wrote on 5/29/2014, 8:55 AM
There is no bridge; I am using AMD. Globally disabling Crossfire would not be an option (I didn't put in two GPUs to only use one), but I can tune it on a per-application basis if necessary, and turn it off for Vegas.
OldSmoke wrote on 5/29/2014, 9:02 AM
That is exactly what I do with my 2x GTX580. I use SLI for other 3D applications, but I switch it off for Vegas. Aside from the fact that Vegas doesn't like SLI, full SLI only allows one monitor per card. Crossfire requires a bridge, same as Nvidia's SLI, to be full CrossFire. Or are you using Hybrid CrossFireX?

Frans Meijer wrote on 5/29/2014, 9:12 AM
I suppose so; the more recent cards don't need bridges, which is one of the reasons I tried it as an upgrade, and it works quite well for games.

I don't see more than 20% GPU usage when rendering (Sony AVC), and I find it quite slow, to be honest; roughly realtime is not what I expected given the fabulous results I've seen with QuickSync (which it cannot find, by the way). There is still a lot of work for Sony to do.
OldSmoke wrote on 5/29/2014, 9:26 AM
I think you have to look deeper into your system. Sony AVC is certainly slower than MC AVC, and you should not leave it on automatic either. In the Sony AVC template, select "Render using GPU if available". Same for MC AVC: you have to select OpenCL and not leave it on the default "Automatic" setting.

You can read up on CrossFireX here: http://en.wikipedia.org/wiki/AMD_CrossFireX. As you can see, you can use the cards without a bridge, but for full performance you need one. I personally never liked the Intel QuickSync thing; it never worked to my satisfaction, and the rendered results were horrible. If I remember from my past playing around with QuickSync, it required additional software to be installed to actually work; that software came with the motherboard, and it may also require booting into the iGPU rather than the discrete one.

On another note: GPU acceleration can only get you so far. A fast CPU (quad or hex core; 4K requires eight cores) is a must too, or else it becomes a bottleneck.
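A quick back-of-the-envelope sketch (Python, just pixel arithmetic; 4K here assumed to mean UHD 3840x2160) of why 4K leans so hard on the CPU:

```python
# Pixel-throughput comparison: 1080p vs 4K UHD.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_4k = 3840 * 2160      # 8,294,400 pixels per frame

ratio = pixels_4k / pixels_1080p
print(ratio)                 # 4.0 -- every effect touches 4x the data

# At 30 fps, pixels the pipeline must push per second:
print(pixels_1080p * 30)     # 62,208,000
print(pixels_4k * 30)        # 248,832,000
```

So anything the CPU still does per pixel costs four times as much at 4K, which is roughly why the core-count recommendation doubles.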

Frans Meijer wrote on 5/29/2014, 9:56 AM
I've seen no connectors for a bridge; as far as I know these cards can only be used without one. It works OK. At least I got nearly double the framerate in Tomb Raider after the upgrade compared to before, which ran its benchmark at about 60 FPS on Ultra. For gaming this is good enough.

QuickSync works well for me: with OBS I can capture at full resolution (1080) with very little effect on game framerates. It certainly beats trying to encode with the CPU. All that was needed was attaching a monitor and installing the Intel HD drivers.

As far as NLE hardware demands go: considering that a system like mine can render complex scenes at 1080p30 in better than realtime and simultaneously encode to H.264, with all the data passing between GPUs and from GPU to CPU and to another GPU, I am quite certain the NLE and codec developers have more than a little optimizing to do. Rather than point at the hardware on the user side, they should be doing a better job programming, because quite frankly I think their implementations are quite crappy at the moment.
OldSmoke wrote on 5/29/2014, 12:56 PM
From your post I assume you are using software to record your games. That is a totally different scenario from working with camera footage. Frame-recorded video carries far less information than actual camera video and is far easier to process.

SCS has done an excellent job with GPU acceleration, but a lousy job specifying what is needed to get it going. I can render 1080 60p video footage to MC AVC faster than realtime with GPU acceleration. The first time SCS came forward with more elaborate recommendations was with regard to 4K footage.

Frans Meijer wrote on 5/29/2014, 2:36 PM
Yeah, I am using software running on the same system on which the game is rendering the 3D images, and it compresses them with H.264. So why can't these codecs, which don't even have to render the scenes? It's not the GPU pipe/encoder that is stressed with the Sony and MC codecs. And if the processor is the bottleneck, it means they have trouble processing 1080p30, or transferring data from main memory to the GPU, and that means they are doing something not quite optimally.
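To put a rough number on the transfer argument, here's a sketch (Python; assumes uncompressed 32-bit RGBA frames, so treat it as an order-of-magnitude estimate) of how much data 1080p30 actually moves, compared with PCIe bandwidth:

```python
# How much raw data does 1080p30 move between CPU and GPU?
# Assumes uncompressed 4-byte pixels; real pipelines may use other
# formats (e.g. YUV), so this is only an order-of-magnitude sketch.
width, height, fps = 1920, 1080, 30
bytes_per_frame = width * height * 4        # 8,294,400 bytes
bytes_per_second = bytes_per_frame * fps    # ~249 MB/s
print(bytes_per_second)                     # 248832000

# A PCIe 2.0 x8 link offers roughly 4 GB/s each way, so raw 1080p30
# consumes only a few percent of it:
pcie2_x8_bytes = 4_000_000_000
print(bytes_per_second / pcie2_x8_bytes)    # ~0.06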

The problem I had with the preview, as well as the fact that it knows QuickSync is available but cannot properly initialize it for encoding, also shows they still have some work to do. A discrete GPU on an Intel platform is not an uncommon occurrence.
OldSmoke wrote on 5/29/2014, 3:06 PM
As I mentioned earlier, NLEs are hardware intensive, and in a different way from video games. There are more graphics cards suited to gaming than there are suited to NLEs. Your 2x HD7750, while good for gaming, are not sufficient for proper GPU acceleration. As for QuickSync, I put the blame on the OS and motherboard manufacturers. Which i3 CPU do you have? They come with different types of iGPU. I had a 3770K with an HD4000, and when I got it working in Vegas 11 it was nowhere near as fast or as good as my GTX570.

Frans Meijer wrote on 5/30/2014, 12:05 AM
Rendering millions of vertices in a 3-dimensional grid in real time is hardly less hardware intensive than processing millions of points in a 2D plane. But GPGPU programming is apparently still a relatively new field outside of gaming and research. The non-gaming cards have the same GPUs and use the same tech as those used in gaming; the cards just come with better support.

Odd about your QuickSync; everything I read about it says it is faster than the Nvidia and AMD equivalents.