I don't think it uses the gpu for decoding. None of the performance monitors show it using the gpu for playback and decoding. Not unless I use plugins that use the gpu.
Decoding is done using the CPU unless you have Quick Sync, which us AMD Ryzen users do not have.
QSV is just the GPU built into the CPU... so I don't understand how you come to the conclusion that only Quick Sync would be used but not a standalone GPU. If you see the Intel GPU's usage go higher than, say, a GTX 1060's when QSV is active, that's normal, because most Intel HD GPUs barely scratch the performance of midrange and high-end GPUs from AMD or NVIDIA (unless you have one of the new Intel chips with a Vega GPU; those are quite decent).
As an example, my desktop has an Intel 6850K but no built-in GPU, therefore I can't choose it in the hardware acceleration list. On the other hand, my notebook has an Intel 6700HQ, and that one has a GPU in it, so I can choose between my HD 530 and my GTX 960M in the list.
So I don't see any indication that a GPU other than Intel's would not be used for decoding.
My understanding is that Vegas only has QSV-accelerated decoding. No QSV means all decoding is then done on the CPU.
Now, I'm talking about decoding. Timeline playback needs to decode a video AND render any effects for playback. That is the part where the GPU gets involved.
Hardware acceleration
Take advantage of hardware acceleration from modern NVIDIA and AMD graphics cards and
Intel's emerging QSV (Quick Sync Video) technologies to enhance real-time performance and
shorten rendering times for certain formats.
I believe that is for decoding streams for playback.
Vegas cannot work with interframe streams, nor does it decode in real time. It decompresses each and every frame to raw uncompressed bits and stores them in memory when the file is introduced to the timeline.
Therefore, hardware-accelerated file decoding is not a very usable feature in an NLE, even if it could be employed somehow. It would not in itself improve preview performance, which I suspect is what you are after.
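To put "raw uncompressed bits" into perspective, here is a rough back-of-the-envelope sketch; the resolution and pixel format are my own assumptions for illustration, not Vegas's documented internal format:

# Rough size of fully decompressed video frames.
# Resolution and pixel format are assumptions, purely for illustration.
width, height = 3840, 2160            # UHD
bytes_per_pixel = 4                   # e.g. 8-bit RGBA; float formats are larger
frame_mib = width * height * bytes_per_pixel / 2**20
print(f"One raw frame: {frame_mib:.1f} MiB")              # ~31.6 MiB
print(f"One second at 25 fps: {frame_mib * 25:.0f} MiB")  # ~791 MiB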
I believe that is for decoding streams for playback.
Vegas cannot work with interframe streams, nor does it decode in real time. It decompresses each and every frame to raw uncompressed bits and stores them in memory when the file is introduced to the timeline.
Therefore, hardware-accelerated file decoding is not a very usable feature, even if it could be employed somehow.
Not really. Otherwise the project settings for resampling (activated, deactivated, or smart resampling) would make no sense. At least I think that clearly plays a part in it for me, and not only for frame rate conversion.
But anyway, I at least clearly have way better performance if I activate any GPU in the HW acceleration setting instead of none, no matter if QSV or NVIDIA. So at least I can assume it uses either CUDA or OpenCL to do the work of even just playback.
Resampling does not occur during decoding, but later in the encoding pipeline.
The "Project" is just a set of instructions. No actual video lives there.
PROOF: You can change the resampling mode in the project, media, or event properties after the media is loaded. It doesn't get "decoded" again. Think on it...
Vegas is not a player. Vegas must decode every pixel to raw bits in case you would like to edit them.
Your favorite player does not do that. Players, and presumably NVDEC, decode the stream, which is usually on the order of one actual video frame (i-frame) for every 300-400 frames of bidirectional predictive instructions, called "p," "b," and "b-pyramid" frames respectively, the latter being fake i-frames. Imagine trying to edit air.
You can search for "interframe compression" and "video transport stream" if you would like to learn more, and there is plenty to learn.
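If anyone wants to see that structure for themselves, something like this counts the frame types in a clip (a sketch; it assumes ffprobe is installed and "clip.mp4" is a stand-in for your own file):

# Count I/P/B frame types in the first video stream with ffprobe.
# Assumes ffprobe is on PATH; "clip.mp4" is a hypothetical filename.
import subprocess
from collections import Counter

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "frame=pict_type", "-of", "csv=p=0", "clip.mp4"],
    capture_output=True, text=True, check=True,
).stdout.split()

print(Counter(out))   # e.g. Counter({'P': 220, 'B': 70, 'I': 10}) for a long-GOP file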
Resampling does not occur during decoding, but later in the encoding pipeline.
Sorry for my rudeness, but that is nonsense. Research the performance issues around resampling.
Every time bits and bytes are played back, no matter what codec, color format, color space, etc. they are in, they have by definition to be decoded so that you can understand what the bits and bytes are.
PROOF: You can change the resampling mode in the project, media, or event properties after the media is loaded. It doesn't get "decoded" again. Think on it...
That is no proof. You make it sound like decoding is only possible if something is loaded completely into RAM, or something like that, in one session.
Vegas is not a player.
Well, not a "traditional" one for sure, but it still has a built-in player, otherwise you would not be able to play back and see what you are doing. But sure, I would never play my 4K movies with it to enjoy them =)
I believe that is for decoding streams for playback.
Vegas cannot work with interframe streams, nor does it decode in real time. It decompresses each and every frame to raw uncompressed bits and stores them in memory when the file is introduced to the timeline.
Therefore, hardware-accelerated file decoding is not a very usable feature in an NLE, even if it could be employed somehow. It would not in itself improve preview performance, which I suspect is what you are after.
Other NLEs (Resolve, to mention one) use NVDEC to decode AVC and HEVC files. It works beautifully. So the Vegas team should be able to implement it...
Former user
wrote on 4/12/2019, 6:39 PM
But anyway, I at least clearly have way better performance if I activate any GPU in the HW acceleration setting instead of none, no matter if QSV or NVIDIA. So at least I can assume it uses either CUDA or OpenCL to do the work of even just playback.
That's not what the poster was asking. Vegas Pro only does hardware video decoding via Quick Sync; it cannot use a discrete GPU. You will sometimes see a strange occurrence where it does appear your GPU is doing video decode, but it isn't. Remove the Intel GPU drivers & your video card's decoding ceases to exist.
Most video editors and transcoders use your graphics card's hardware decoding. Examples are DaVinci Resolve and HandBrake, but not Vegas Pro.
Sounds more like a feature request for native ffmpeg support in Vegas. Guess I won't follow there.
Not at all. That's a request for NVDEC support, which is an NVIDIA API.
It seems that Vegas already uses hardware acceleration via Quick Sync to decode video streams. Other hardware decoding APIs could be used, like NVDEC or its AMD equivalent.
It would be a great step forward for Vegas. Not everyone has an Intel CPU with Quick Sync.
Remove the Intel GPU drivers & your video card's decoding ceases to exist.
As I wrote, my Intel 6850K has no GPU, therefore no drivers. Still, if I activate hardware acceleration with the RTX 2070, decoding is clearly done by the GPU. I did not say I am sure it is NVDEC, but it certainly does something with either CUDA or OpenCL.
Also, again, where exactly is it written that only QSV is enabled for decoding in Vegas? I tried some searching in the 600+ page manual and found nothing about that.
Former user
wrote on 4/12/2019, 8:46 PM
Also, again, where exactly is it written that only QSV is enabled for decoding in Vegas? I tried some searching in the 600+ page manual and found nothing about that.
Task Manager will show you if hardware video decoding is being utilised by Vegas Pro. For my video card, 'Video Encode' is NVENC and 'Video Decode' is NVDEC.
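For anyone without those Task Manager graphs handy, the same thing can be cross-checked from the command line (a sketch; it assumes an NVIDIA card with nvidia-smi available; the "dec" column is the dedicated decoder block):

# Sample GPU utilisation once per second for 10 samples; the "dec" column
# goes above 0 only while something is actually using the hardware decoder.
import subprocess

subprocess.run(["nvidia-smi", "dmon", "-s", "u", "-c", "10"], check=True)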
If I activate hardware acceleration with the RTX 2070, decoding is clearly done by the GPU. I did not say I am sure it is NVDEC, but it certainly does something with either CUDA or OpenCL.
You need a CPU to do encoding & decoding; a GPU doesn't have the complexity for such a task. It excels at video processing, where massively parallel execution of simple instructions can greatly speed up operations. Vegas must be written to access the specific dedicated decoding cores, and it seems they stopped after Quick Sync support.
I have had the internal GPU of my CPU disabled in the BIOS for quite some time now, just to avoid possible conflicts with the NVIDIA card. Maybe I should re-enable it to get hardware-accelerated QSV, although I am not quite sure whether the QSV silicon in the CPU needs the CPU's GPU to be enabled in the BIOS for it to work; probably it does, but I am not sure.
Vegas does NOT use your NVIDIA or AMD card for playback of plain old footage. However, GPU acceleration IS used if you add GPU-supported Vegas effects or 3rd-party ones like Magic Bullet.
If you have a Quick Sync enabled CPU, then you can enable this and the QSV setting in Vegas, and Vegas will then do video decoding using the CPU's integrated GPU instead of the CPU alone. If you run an AMD CPU, then all video decoding is done via the CPU. All this does is leave the CPU cores a little more free for other stuff.
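Just to illustrate what that QSV decode path looks like outside of Vegas, ffmpeg exposes it roughly like this (a sketch; it assumes an ffmpeg build with QSV support, working Intel graphics drivers, and a hypothetical filename):

# Decode-only run through Quick Sync, discarding the output.
# Assumes ffmpeg was built with QSV support; "clip.mp4" is a hypothetical filename.
import subprocess

subprocess.run(
    ["ffmpeg", "-hwaccel", "qsv", "-i", "clip.mp4", "-f", "null", "-"],
    check=True,
)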
I enabled the 630 in my CPU via the BIOS again, but I always see 0% in the Video Decode graph of Windows Task Manager while the Vegas timeline is playing, whether Intel or NVIDIA is selected.
This is with all sorts of unedited video, HD (50 fps) and 4K (25 fps), in different formats from "regular" (from the RX10 III) to HEVC (from the GoPro).
Only when using a video player do I see up to 50% decode usage on the NVIDIA card with HEVC from the GoPro.
@Former user weird that you can see any decode via Vegas; maybe I have driver issues, with drivers that are too recent for both Intel and NVIDIA with regard to the older compiled code of Vegas.
Former user
wrote on 4/13/2019, 7:42 AM
@bitman I just decided to check again after a Windows update a few minutes ago, and now all the HW acc. settings are playing back at the full 24 fps. I'll delete my post so as not to mislead. I may need something else that will slow down playback.
Not at all. That's a request for NVDEC support, which is an NVIDIA API.
... which must be supported by the codec library in order to be leveraged.
ffmpeg's H.264 and HEVC libraries support NVDEC. HandBrake and Resolve, the two mentioned, have ffmpeg. Vegas does not, ostensibly due to longstanding commercial licensing constraints. What other DLLs besides ffmpeg do you know of that support the NVIDIA decoding architecture and "could" be used in Vegas?
Again, this sounds like a codec feature request, not primarily a software design request, as it first seemed from the question. Perhaps if the question were framed that way, it might get noticed and wind up on a developer's wish list someday.
Now, if Vegas "could" build GPU decoding into one of its existing libraries that "could" feed an RGB bitstream to the Vegas internal pipeline, is it really faster? If so, how much faster would it really be? Fast enough to justify development and licensing costs, which are passed on to the consumer? What are the quality implications of NVDEC? Remember, the road to GPL coexistence in Vegas is historically fraught with nettles and snakes, and Magix has taken only some tentative first steps toward clearing that path since the acquisition.
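As a very rough way of answering the "is it really faster" question for the decode step alone, ffmpeg can be timed with and without NVDEC (a sketch under those assumptions; the filename is hypothetical, and it says nothing about what Vegas's own pipeline would gain):

# Time a pure decode run on the CPU vs. through NVDEC, discarding the output.
# Assumes ffmpeg built with NVDEC support and an NVIDIA card; "clip.mp4" is hypothetical.
import subprocess, time

def decode_time(extra_args):
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-v", "error", *extra_args,
                    "-i", "clip.mp4", "-f", "null", "-"], check=True)
    return time.perf_counter() - start

print(f"CPU decode:   {decode_time([]):.1f} s")
print(f"NVDEC decode: {decode_time(['-hwaccel', 'nvdec']):.1f} s")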
I redid the test using OldSmoke's test clips. By changing the project frame rate to 25 from 29.97, all playback was slower, which is what I wanted. Proxies were not used. Three pieces were used, with an overlapping crossfade.
I observed the peak values as it played in a loop and also did a screen grab for each of 3 different HW Acc. settings … None, Intel and Nvidia.
HW Acc.   Max playback fps   CPU max   630 max   Nvidia max
None      17                 100%      55%       21%
Intel     13                 100%      52%       21%
Nvidia    14                 100%      41%       13%
[Screen grabs attached for the None, Intel, and Nvidia settings]
Since no VFX were added, it may be of limited use.