Comments

Former user wrote on 10/13/2017, 11:07 PM

@ringsgeek Take a look at this thread: https://www.vegascreativesoftware.info/us/forum/faq-how-can-i-stop-vegas-pro-hanging-or-crashing-during-rendering--104786/

I'll check it out... thanks! But I'm still looking for recommendations on the best card, honestly. For SVP15, is an Nvidia GTX 1070 Ti a good fit?

I'm no expert, but AMD cards are known to deliver a higher percentage of their processing power to Vegas for its encoding needs. Still, the GTX 1070 Ti is such a powerful card, and so reasonably priced (if coin miners don't inflate the retail price), that it should be an excellent choice. The catch is that I don't know exactly how Vegas works: does it need to recognize your exact card? I don't think the 1070 Ti has been released yet, so that could be a problem, and it may not work until the next Vegas update.

astar wrote on 10/14/2017, 7:35 PM

@ringsgeek

I would solve the BSOD first. Any machine that BSODs has a serious low-level issue, such as memory channel errors from running too fast, bad memory blocks, storage failures/errors, or driver issues. The BSOD will offer details as to the type of issue that was encountered.

I would list the "media info" on the source media and the device type that originated it. Also list the details of the project settings and the render profile.

ringsgeek wrote on 10/14/2017, 9:22 PM

Found the issue: a bad wireless card driver. I don't need wireless on my desktop since I'm on the LAN, so we disabled it and it hasn't crashed yet...

AVsupport wrote on 10/15/2017, 4:27 PM

Can anyone tell me what actual OpenCL version VP14/15 uses?

I'm also considering a future-proofed hardware upgrade, and AFAIK, AMD 5xx cards are OpenCL 2.0, Nvidia 10xx (Pascal) is 1.2, whilst Intel Kaby Lake is 2.1. What about Ryzen?

https://www.khronos.org/conformance/adopters/conformant-products

What benefit is to be expected with which combo, and what's the best-guess recommendation?
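One way to narrow this down is to check what OpenCL version each device's driver actually exposes, which is separate from whatever version Vegas itself requests. A minimal sketch, assuming the third-party pyopencl package is installed; the `parse_opencl_version` helper is hypothetical, not anything Vegas ships:

```python
import re

def parse_opencl_version(version_string):
    """Extract (major, minor) from an OpenCL version string such as
    'OpenCL 1.2 CUDA 9.0.194' or 'OpenCL 2.0 AMD-APP (2442.8)'."""
    match = re.match(r"OpenCL (\d+)\.(\d+)", version_string)
    if match is None:
        raise ValueError("not an OpenCL version string: %r" % version_string)
    return int(match.group(1)), int(match.group(2))

# With pyopencl installed, each device reports the version string its
# driver exposes (this enumeration needs real OpenCL drivers present):
#
#   import pyopencl as cl
#   for platform in cl.get_platforms():
#       for device in platform.get_devices():
#           print(device.name, parse_opencl_version(device.version))
```

This only tells you the driver's ceiling; the host app still decides which version it targets at runtime.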

 

my current Win10/64 system (latest drivers, water cooled) :

Intel Coffee Lake i5 Hexacore (unlocked, but not overclocked) 4.0 GHz on Z370 chipset board,

32GB (4x8GB Corsair Dual Channel DDR4-2133) XMP-3000 RAM,

Intel 600series 512GB M.2 SSD system drive running Win10/64 home automatic driver updates,

Crucial BX500 1TB EDIT 3D NAND SATA 2.5-inch SSD

2x 4TB 7200RPM NAS HGST data drive,

Intel HD630 iGPU - currently disabled in Bios,

nVidia GTX1060 6GB, always on latest [creator] drivers. nVidia HW acceleration enabled.

main screen 4K/50p 1ms scaled @175%, second screen 1920x1080/50p 1ms.

NormanPCN wrote on 10/15/2017, 7:35 PM

VP14/15 runs on Nvidia GPUs for timeline acceleration (video preferences). That should be enough of an answer.

I doubt Vegas will begin requiring an OpenCL version that excludes Nvidia GPUs, or one that excludes a lot of older GPUs from being used. As everyone's drivers update to support a newer spec, older GPUs often get left behind by not supporting it.

Bleeding-edge specs are often not used for some time (video games excluded). An app like Vegas might support a newer spec dynamically at runtime if it detects one, assuming the newer spec gives some benefit to a certain function, all while working against a minimum/baseline spec.

AVsupport wrote on 10/16/2017, 3:47 PM

VP14/15 runs on Nvidia GPUs for timeline acceleration (video preferences). That should be enough of an answer.

I'm not sure that's accurate. From what I see and read in these forum posts, timeline playback still relies heavily on the old OpenCL 1.2, which is why AMD GPUs have traditionally performed better than Nvidia. Things change when adding FX, most of which are based on OpenGL (which is different again); modern graphics cards from both sides easily support the latest 4.2 standard, so there shouldn't be much difference there. Hardware rendering (encoding) is, I believe, a different but somewhat related issue, with some approaches relying on the CPU and GPU supporting each other. Nvidia hadn't seen much love in recent VP editions, hence the new NVENC implementation, but I'm just not sure it's good enough to give Nvidia the upper hand in the future, as OpenCL support is growing everywhere and is here to stay. So it would be good to know exactly what Vegas uses, and when. And, more importantly, where we are headed [implementation roadmap].

Last changed by AVsupport on 10/16/2017, 4:31 PM, changed a total of 1 time.

Former user wrote on 10/17/2017, 9:03 AM

Also, AVsupport, you sound like you know your stuff. Do Nvidia cards even accelerate encoding in any true sense, or is it just fixed-function hardware encoding when you use NVENC? Hardware encoding is known to be bad/inefficient: for the same bitrate, hardware encoding such as NVENC will always be worse. Is that the case with Vegas's use of Nvidia cards for encoding?

 

Could you also comment on how Vegas uses AMD cards? Is it a different scenario, where it's not straight hardware encoding with the low-quality results one might expect, but real acceleration of software encoding?

I am wondering at the moment if everyone's excitement about the NVENC Magix AVC encoder is ill-placed. Yes, it is twice as fast (in the case of my GTX 1070), but for a given quality of encode perhaps you need twice the bitrate, and even then maybe it's not as good.

 

AVsupport wrote on 10/18/2017, 4:08 PM

@Former user, like most people here, I'd just like to find more and better information to base my purchase decisions on.

Re hardware encoding, I think it for the most part has the upper hand with regard to speed and efficiency; that's why we have dedicated processors in our video cameras that do just that, and not just a generic CPU with a software layer. It's like doing the thinking in the neck: faster, but it can't do everything. Hardware is always behind the latest trends, including the latest codecs. Don't forget that the host software also has to be written to use HW acceleration in the first place. So I guess you could have a situation where you end up having to use CPU + software encoding because the GPU doesn't support the format. Newer codecs are usually more complex and CPU-intensive, which in turn will hurt any CPU.

Also, the memory interface between the graphics card and the CPU can sometimes be faster than other protocols interfacing with standard RAM, which could give you a distinct advantage, but that will also depend on your system.

In the case of NVENC, I wish Magix would put out a whitepaper or some info on how it works; in some other rendering options I have seen a choice of different quality settings... go figure?

This is as clear as mud, hence I want to find out.

Former user wrote on 10/18/2017, 10:26 PM

@Former user, like most people here, I'd just like to find more and better information to base my purchase decisions on.

Re hardware encoding, I think it for the most part has the upper hand with regard to speed and efficiency; that's why we have dedicated processors in our video cameras that do just that, and not just a generic CPU with a software layer. It's like doing the thinking in the neck: faster, but it can't do everything.

The reason I ask is that I have some experience with on-the-fly encoding using software such as XSplit and OBS, where video is encoded in real time and often streamed to a gaming platform such as Twitch or YouTube. All the guides recommend x264 software encoding if your CPU is capable, and choosing Intel Quick Sync or NVENC hardware encoding only if you have a slower computer, as hardware encoding will always be inferior for a given bitrate, with NVENC being the lowest-quality/least-efficient encoder and Intel QSV being noticeably higher quality.

 

But this is 'live', real-time, on-the-fly encoding. I wonder if hardware encoding with non-real-time video editors is a different story, due to the hardware being able to examine and process the video for longer, or if the same rule applies: always choose software encoding for maximum quality, but if you have time constraints, hardware encoding may be preferred.

Has anyone done side by side tests at lower encoding bitrates and noticed any differences?
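One way to run that side-by-side test is outside the editor entirely: render the same source at the same fixed bitrate through libx264 (software), h264_nvenc, and h264_qsv with ffmpeg, then let ffmpeg's psnr filter score each encode against the original. A minimal sketch, assuming ffmpeg is on the PATH; the helper names are made up for illustration:

```python
import subprocess

def encode_cmd(src, out, codec, bitrate="10M"):
    """Build an ffmpeg command that re-encodes src at a fixed average
    bitrate, so different codecs are compared at equal cost."""
    return ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", bitrate, out]

def psnr_cmd(original, encoded):
    """Build an ffmpeg command that prints a PSNR score for the encoded
    file measured against the original (higher is closer to source)."""
    return ["ffmpeg", "-i", encoded, "-i", original,
            "-lavfi", "psnr", "-f", "null", "-"]

# Example usage (requires ffmpeg and the relevant hardware encoders):
#   subprocess.run(encode_cmd("src.mp4", "x264.mp4", "libx264"))
#   subprocess.run(encode_cmd("src.mp4", "nvenc.mp4", "h264_nvenc"))
#   subprocess.run(psnr_cmd("src.mp4", "nvenc.mp4"))
```

PSNR is a blunt metric, but at identical bitrates it gives a first objective read on the software-vs-hardware gap being discussed here.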

 

AVsupport wrote on 10/18/2017, 11:13 PM

I would guess that with hardware encoders speed is surely of the utmost importance, whilst with software there are choices and flexibility, like multi-pass, that will definitely have an impact on quality. Also, don't forget there are always cheap built-in encoders and more expensive external (live-stream) encoders like Teradek and Livestream, and I would assume there to be a difference there as well. However, I'd hope that software rendering could work in conjunction with hardware acceleration to provide both speed and quality ultimately ;-)

HasanKhurshid29 wrote on 12/25/2017, 1:13 PM

Hi everyone

I am very impressed by the performance of VP15. I have been using VP since version 8. Until recently I was using VP13, which was better than its previous versions in terms of rendering and previewing using an Nvidia Quadro K4000. But VP15 is on another level.

Previously, rendering at 1080p at 10 Mbps with AAC took 2.5 times the duration of the project, with some brightness, contrast, and color balance applied. Now it takes 0.9 times the length of the project when rendering through NVENC.
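For scale, those two ratios imply roughly a 2.8x speedup, since render time here is just project length multiplied by the realtime ratio. A tiny sketch of the arithmetic (the function name is illustrative):

```python
def render_time(project_minutes, realtime_ratio):
    """Render time when rendering takes `realtime_ratio` times the
    project's own duration (e.g. 2.5x before, 0.9x with NVENC here)."""
    return project_minutes * realtime_ratio

software = render_time(60, 2.5)  # 150 minutes for a 1-hour project
nvenc = render_time(60, 0.9)     # 54 minutes for the same project
speedup = software / nvenc       # about 2.78x faster
```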

I am thankful to MAGIX for the significant improvement in GPU acceleration. My specs are:

Intel Core i7-2600 @ 3.40 GHz

12 GB DDR3 RAM

Nvidia Quadro K4000

1 TB WD Black hard drive

Former user wrote on 12/25/2017, 6:55 PM

Previously, rendering at 1080p at 10 Mbps with AAC took 2.5 times the duration of the project, with some brightness, contrast, and color balance applied. Now it takes 0.9 times the length of the project when rendering through NVENC.

I am thankful to MAGIX for the significant improvement in GPU acceleration.

Magix AVC hardware gives the lowest-quality encoding of any of the codecs; just be aware of that. Most people want the best possible quality. I think a dialog should appear when first using this codec in hardware ASIC mode, warning that the quality will be lower, with the benefit of speed.

 

Former user wrote on 12/26/2017, 3:58 AM

With MAGIX NVENC encoding, it's the noise in dark areas that stands out. I even found that to be the case in the 4K footage you uploaded showing, from memory, a gentleman gardening. It's most easily seen if you look at a dark area and play in slow motion: the noise changes from frame to frame. It is not obvious when just looking at a single frame.

Former user wrote on 12/26/2017, 4:54 AM

I used your original footage and encoded it myself using MainConcept, NVENC, and QSV. I did not limit myself to the lowest bitrate; I used a very high bitrate. The noise always remained. The noise was not due to a limited bitrate but to the NVENC ASIC codec. I did the same high-bitrate encode with MainConcept and the noise was not present; it looked like the original.

That's the main problem. The results of Magix NVENC encoding are 'acceptable', but it creates noise where noise did not exist; it produces an inferior image. I don't think it needs to be proven that current Intel and Nvidia hardware encoding is inferior to software; it's just a fact, as there is only so much you can do with 7 watts of encoding power. It's the trade-off you make for speed over quality. Although you do get interesting discrepancies: I think you said (from memory) that both QSV and NVENC hardware encoding were equally good at high bitrates, but for me QSV was noticeably the worst, NVENC better, and software noticeably superior.

 

NickHope wrote on 12/26/2017, 5:48 AM

I too would like to see some visual proof of this. At least some screen grabs, or better still, short rendered samples to compare (not re-encoded via YouTube etc. or the forum).

Former user wrote on 12/26/2017, 9:20 AM

I too would like to see some visual proof of this. At least some screen grabs, or better still, short rendered samples to compare (not re-encoded via YouTube etc. or the forum).

I don't recall the file server that was recommended, so I uploaded to YT. It's enough to see the difference.
If you look at the lower green foliage and walking path, there is noticeably higher noise. Also note this is the 'high quality' setting for NVENC, not the default. The video is a close-up of the right-hand side of the Cornico source footage played at original, slow, and double speed. Slow is best for seeing the extra noise.

 

NickHope wrote on 12/26/2017, 10:27 AM

Any of these would do: Dropbox, Google Drive, OneDrive, mega.nz, wetransfer.com or mediafire.com.

Best to upload 2 separate short videos so they can be laid on separate tracks above the original for easiest comparison.