Is it possible to allow my graphics card to help "render" the PREVIEW?

darnell-j wrote on 1/20/2020, 11:24 PM

The Problem.

Maybe "render" isn't the right word, since it's not rendering yet, but what I'm trying to say is that my graphics card isn't being used at all until I start rendering.

I'm aware of the "GPU acceleration of video processing" setting, and I have it enabled. Yet the setting only has an effect while I am rendering a video; it has no effect while I'm previewing/editing. Is this intentional? My GPU sits at around 1% utilization while previewing. It's doing nothing until I start rendering, which seems like a waste of computing power to me.

Also, if you could, what is the difference between "Optimal - Advanced Micro Devices, Inc. (gfx902)" and "Optimal - Advanced Micro Devices, Inc. (Ellesmere)"? I believe one is my integrated graphics and the other is my discrete graphics card, but I'm not sure.

Version:

Vegas Pro 17

Windows 10

PC build/parts:

-Ryzen 2400G

-AMD Radeon RX 580

-16GB RAM 2400MHz

Thank you.

Comments

fr0sty wrote on 1/21/2020, 1:14 AM

Timeline GPU acceleration only accelerates effects applied to the video, and only effects that support GPU acceleration (which most of them do).

Unfortunately, AMD cards cannot decode video for timeline playback in Vegas, at least not yet (apparently there are multiple NLEs struggling to get AMD decode working right). Nvidia cards, however, can, and Intel CPUs have "Quick Sync" decoding that is also supported in Vegas. Unfortunately, again, your system is all AMD (like mine), so we don't get the benefit of video decode acceleration in Vegas yet.

If you want to see how much your GPU is accelerating things, add some text or another effect to the video, then play it back with GPU acceleration turned on, then off. Note the framerate the video plays back at during both tests; you'll see a significant improvement with your GPU enabled (assuming it isn't slower than your CPU at the same task).
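The on/off comparison above can be made a bit more objective by averaging the preview framerate from per-frame timestamps instead of eyeballing it. A minimal sketch in Python (the timestamp values below are hypothetical examples, not real measurements from Vegas):

```python
def average_fps(timestamps):
    """Average frames per second from a sorted list of frame display times (seconds)."""
    if len(timestamps) < 2:
        raise ValueError("need at least two timestamps")
    elapsed = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / elapsed

# Hypothetical readings from the two test runs:
gpu_off = [0.000, 0.066, 0.133, 0.200]  # sluggish preview
gpu_on = [0.000, 0.033, 0.066, 0.100]   # smoother preview

print(round(average_fps(gpu_off)))  # 15
print(round(average_fps(gpu_on)))   # 30
```

If the two averages come out close to each other, the GPU isn't buying you much on that particular effect chain.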

If you get an Nvidia card or Intel CPU, you can go into the File I/O tab of preferences and set your card to be used for video decoding. Keep in mind that hardware decoding, by its nature (not a Vegas limit), can only decode HEVC or H.264 video streams (two of the most common formats captured by cameras today), so if your camera shoots a format other than those, you won't see a difference over CPU decoding.
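The decoder-coverage point above boils down to a simple check: given a clip's codec name (as a tool like ffprobe would report it), decide which decode path it can take. The two-codec list mirrors the claim in this post and should be treated as an assumption, since actual support varies by GPU generation:

```python
# Per the note above: hardware decoders cover H.264/HEVC; anything else
# falls back to CPU decoding. (Assumption: lowercase codec names as
# ffprobe reports them, e.g. "h264", "hevc", "prores".)
HW_DECODABLE = {"h264", "hevc"}

def decode_path(codec_name: str) -> str:
    """Return which decode path a clip with this codec would take."""
    return "GPU" if codec_name.lower() in HW_DECODABLE else "CPU"

print(decode_path("HEVC"))    # GPU
print(decode_path("prores"))  # CPU
```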

Last changed by fr0sty on 1/21/2020, 1:16 AM, changed a total of 2 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000MHz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Former user wrote on 1/21/2020, 1:18 AM

Advanced Micro Devices, Inc. (Ellesmere) is your external graphics card. Make sure that is the one selected under GPU acceleration.

darnell-j wrote on 1/21/2020, 4:57 AM

Big thanks to: lenard-p and fr0sty! Thank you both very much for your responses, they both have been great help!

StGemma wrote on 5/19/2020, 3:14 AM

Hello, where is the I/O tab in preferences, and the Advanced Micro Devices, Inc. (Ellesmere)? I do not see these. I am on Vegas Pro 16.

 

Thank you.

Marco. wrote on 5/19/2020, 3:30 AM

This is Vegas Pro 17 only.

Reyfox wrote on 5/19/2020, 7:01 AM

Hello, where is the I/O tab in preferences, and the Advanced Micro Devices, Inc. (Ellesmere)? I do not see these. I am on Vegas Pro 16.

In the Video tab > GPU acceleration of video processing.

Newbie😁

Vegas Pro 22 B250 (VP18-21 also installed)

Win 11 Pro

AMD Ryzen 9 5950X 16 cores / 32 threads

32GB DDR4 3200

Sapphire RX6700XT 12GB Driver: 25.5.1

Gigabyte X570 Elite Motherboard

Panasonic G9, G7, FZ300

Boris FX Continuum Complete 2025.5.1, Newblue FX Total FX360, Ignite Pro V5

Former user wrote on 5/19/2020, 7:28 AM

Of the editors I have used, only DaVinci Resolve accelerates timeline render/playback of default, unaltered video using GPU 3D acceleration. It allows smooth playback that isn't possible at a given resolution/quality in other editors. So it can be done, but for some reason it isn't common.

alifftudm95 wrote on 5/19/2020, 2:12 PM

Timeline GPU acceleration only accelerates effects applied to the video, and only on the effects that support GPU acceleration (which is most of them).

Unfortunately, AMD cards cannot decode video for timeline playback in Vegas, at least not yet (apparently there are multiple NLE's struggling to get AMD decode working right). Nvidia cards, however, can, and Intel CPUs have "quicksync" decoding that is also supported in Vegas. Unfortunately, again, your system is all AMD (like mine), so we don't get the benefit of video decode acceleration in Vegas yet.

If you want to see how much your GPU is accelerating things, add some text or some other effect to the video, then play it back with the GPU acceleration turned on, then off. Note the framerate the video is playing back during both tests, you'll see a significant improvement by having your GPU enabled (assuming it isn't slower than your CPU at the same task).

If you get an Nvidia card or Intel CPU, you can go into the file I/O tab of preferences and set your card to be used for video decoding. Keep in mind that hardware decoding, by its nature (not a vegas limit) can only decode HEVC or H.264 video streams (2 of the most common formats captured by cameras today), so if you camera shoots another format than that, you won't see a difference over CPU decoding.

GPU acceleration works best with GPU-based OFX, right? (Example: use the GPU-based color curves instead of the normal color curves.)

But with this feature on, I have to set my Dynamic RAM Preview to 0 MB (yes, zero) in order to get rid of the flickering/black preview. (Someone said this in the forum before, and yes, it works.)

I can't have GPU acceleration and Dynamic RAM Preview at the same time. Is it possible to have both features working simultaneously without that weird black/flickering preview?

Last changed by alifftudm95 on 5/19/2020, 2:14 PM, changed a total of 1 times.

Editor and Colorist (Kinda) from Malaysia

MYPOST Member

Laptop

MacBook Pro M4 Max

16 Core CPU and 40 Core GPU

64GB Memory

2TB Internal SSD Storage

Anti-Glare 4K HDR Screen

 

PC DESKTOP

CPU: Ryzen 9 5900x

GPU: RTX3090 24GB

RAM: 64GB 3200MHZ

MOBO: X570-E

Storage:

C DRIVE NVME M.2 1TB SSD GEN 4

D DRIVE NVME M.2 2TB SSD GEN 4

E DRIVE SATA SSD 2TB

F DRIVE SATA SSD 2TB

G DRIVE HDD 1TB

Monitor: Asus ProArt PA279CV 4K HDR (Bought on 30 August 2023)

Monitor: BenQ PD2700U 4K HDR (RIP on 30 August 2023)


j-v wrote on 5/19/2020, 2:41 PM

I cant have GPU acceleration & Dynamic RAM Preview in the same time.

Read in the manual what Dynamic RAM Preview does, when you should use it, and when you need little or none of it.

With kind regards,
Marten

Camera : Pan X900, GoPro Hero7 Hero Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7GHz, 16GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 566.14 and Intel HD Graphics 630 with driver 31.0.101.2130
Laptop  :Asus ROG Str G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studiodriver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10,12 to 17.
TV      :LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)