Does GPU rendering affect quality?

Marc S wrote on 12/2/2013, 6:37 PM
I usually turn off the GPU acceleration because of weirdness during editing, but I noticed that Sony claims up to 3 times faster rendering of XDCAM EX footage using a GTX 570. Does anyone know if this will affect the quality of the render (scaling, effects, etc.)? Quality is more important to me than speed, and I seem to recall people saying quality suffers.

Thanks, Marc

Comments

Hulk wrote on 12/2/2013, 7:13 PM
Quality only suffers if you use Intel Quick Sync.

Performance increases are directly proportional to the number of GPU-accelerated items in the timeline.
NormanPCN wrote on 12/2/2013, 8:38 PM
GPU accel should not affect visible quality. I say visible since the GPU code and the CPU code are not EXACTLY the same. It is possible that some things will compute to a color value of 193 versus 192 within a range of 0-255. In other words, not really anything.
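
To make that concrete, here is a quick illustrative sketch in Python/NumPy of how two numerically different but equally valid implementations of the same color math can round to adjacent 8-bit values. The gain value is arbitrary, just a stand-in for the kind of precision and ordering differences you might see between the CPU and GPU code paths.

```python
import numpy as np

# Run the same gain operation in float64 and float32, then round to
# 8-bit. The gain of 1.1 is arbitrary and purely illustrative.
vals = np.linspace(0.0, 1.0, 1_000_001)
gain = 1.1

out64 = np.clip(np.rint(vals * gain * 255.0), 0, 255).astype(np.uint8)
out32 = np.clip(
    np.rint(vals.astype(np.float32) * np.float32(gain) * np.float32(255.0)),
    0, 255,
).astype(np.uint8)

# Count samples that round differently between the two precisions:
diff = np.count_nonzero(out64 != out32)
print(f"{diff} of {vals.size} samples round to a different 8-bit level")
```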

Regarding "render as". Different encoders will likely have different quality and I say this since the Mainconcept AVC, OpenCL and CUDA encoders are completely different designs than the CPU encoder and it is known that they (GPU) typically generate a lower quality than the CPU.

Sony AVC with Quick Sync is a different encoder from Sony AVC CPU.

Sony AVC with GPU will be very similar to CPU, since only the motion search seems to be GPU-accelerated. The output files will not be pixel-for-pixel identical, though.

Of course, EXACTLY what does "better" really mean? Encoder comparisons typically use PSNR or SSIM metrics. A better result in these numeric metrics might not actually be visible to the human eye. If you cannot see it, should you really care?
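
PSNR is simple enough to compute yourself if you want to check a render. A minimal sketch in Python/NumPy; the frame loading is hypothetical, and as a rough rule of thumb, PSNR up around 40 dB and above is rarely distinguishable by eye.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two same-sized frames."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are identical
    return 10.0 * np.log10(max_val ** 2 / mse)

# Hypothetical usage: compare the same frame from a CPU render and a GPU render.
# frame_cpu = load_frame("cpu_render.mp4", 100)   # load_frame is a placeholder
# frame_gpu = load_frame("gpu_render.mp4", 100)
# print(f"PSNR: {psnr(frame_cpu, frame_gpu):.2f} dB")
```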

Bottom line, your eye is your best true judge.
Marc S wrote on 12/3/2013, 12:37 AM
Thanks for the responses.
PhillB wrote on 12/4/2013, 3:58 AM
Yes there can be quality issues.
I found when rendering to Mainconcept AVC/AAC MP4 1920 x 1080 @ 25fps you can experience artifacts in transitions. This is noticeable to a degree that you would expect your client to spot it. My source material was 1920 x 1080 @ 25fps from a Canon 5D MkIII.
At the time I did a fair amount of research into this, and the bottom line from the graphics card manufacturers seemed to be that this is a known issue with GPU acceleration. With some experimenting I found it can be improved by increasing the bit rate. My fix for now is to leave the GPU turned off when rendering.
musicvid10 wrote on 12/4/2013, 10:15 AM
Hardware acceleration always affects quality. GPU, CUDA, Quick Sync: there is a tradeoff with any of them. I saw the PSNR/SSIM charts when this stuff was being introduced.

Whether the differences are "visible" or not is entirely up to the observer.

wwjd wrote on 12/4/2013, 10:38 AM
So... the consensus for the safest/BEST quality is to use the CPU ONLY selection in the render box? I can live with that. Why do something that could possibly cause an artifact if you can do it so it doesn't? Except for making quick, low-quality test renders, of course.
Laurence wrote on 12/4/2013, 11:13 AM
Count me as another one who sees a quality hit when using GPU renders.
NormanPCN wrote on 12/4/2013, 11:27 AM
So... the consensus for the safest/BEST quality is to use the CPU ONLY selection in the render box? I can live with that. Why do something that could possibly cause an artifact if you can do it so it doesn't? Except for making quick, low-quality test renders, of course.

Excepting bugs, of course, I doubt you would see a difference with the Vegas video engine's GPU usage. That is on your video prefs page.

As for encoders: yes, the Mainconcept AVC OpenCL, CUDA, and Quick Sync encoders are not as good as the Mainconcept AVC CPU encoder. They are newer and less mature. It has nothing to do with the GPU; their algorithms are just not as good quality-wise.

If "not as good" is the standard, then we should not use the Mainconcept AVC or Sony AVC encoders at all, since neither comes close to x264.

At high bitrates I have seen the MC AVC OpenCL encoder be visually as good as the others, and visually identical to the camera source (GoPro 1080p30 20 Mbps AVC) even when pixel peeping.

At lower bitrates, nothing Vegas has is worth a penny, and I use x264.
OldSmoke wrote on 12/4/2013, 11:34 AM
I don't see much difference in the MC AVC codecs, but I do in MPEG-2. Downsizing an HD project to DVD gives better results with the CPU codec.
I find that the MC AVC GPU codec has improved since VP11 and, in my opinion, now yields better results depending on the bit rate selected.

wwjd wrote on 12/4/2013, 12:28 PM
Has anyone ever compiled THE list of best-to-worst render types in Vegas? Would be handy.
johnmeyer wrote on 12/4/2013, 1:54 PM
Has anyone ever compiled THE list of best-to-worst render types in Vegas? Would be handy.

The reason Vegas and all other programs use so many different codecs is that each one is optimized for different uses. Some are best used for editing, and others for delivery. Some are optimized for very low bitrates (small file sizes) and others are optimized for really good quality without much regard to the size of the file. Some are designed to play well on devices that have very slow CPUs (like phones), and others assume the device has a very capable CPU. Some are designed for specialized CPUs, like those found in DVD and Blu-ray players. Some are designed strictly for streaming.

The concept of "best render type" simply does not exist.

Over at the doom9.org forum, your thread will be removed if you ever ask "what is best?" That rule seems draconian, but it is actually not a bad idea, because it avoids lots of threads where people argue at cross purposes, each claiming that something else is best, or better, without ever realizing that they have not sufficiently defined the use. Even when they attempt to do so, there are always variables that people forget to include, so the answer is never definitive. For example, in the short list of different rendering requirements I provided above, I forgot to include compatibility with a specific platform, like the need to render to MOV when delivering for the Mac.

When reading such threads, you end up realizing that both people are right, and that the whole thread is a wasted exercise.

So, bottom line: even though anyone can post a reply and make some claim, there actually is no answer to your question.

wwjd wrote on 12/4/2013, 2:35 PM
Even just a handy guide that states: xxx is perfect for Blu-ray, yyy is best for small online streams, zzz is best for iPhone8 playback...

I understand they are all different and have their places, but even the info here saying one is young and not as good as this or that is very handy.
larry-peter wrote on 12/4/2013, 3:26 PM
It sounds like you're looking for anecdotal recommendations, which you'll find plenty of if you search the forum. Some you'll get consensus on, others not.
http://en.wikipedia.org/wiki/Comparison_of_video_codecs
is a good place to see what the capabilities of the multitude of codecs can be; e.g., just the various "profiles" for MPEG-2 and MPEG-4 are worth learning about.
wwjd wrote on 12/4/2013, 4:21 PM
Probably. I tend to pick a codec randomly depending on what it is going to be played on, not REALLY knowing the differences. MOV plays on Mac, WMV is Windows (yet both play back both, I think), and MP4 seems good quality and uploads to Vimeo/YouTube, etc.

But knowing which are the CLEANEST would be helpful. I assume the ones that max out at 8,000,000 are not as good as the ones that go to 35,000,000, etc.

I'll check out that info posted.
Hulk wrote on 12/4/2013, 4:24 PM
I assume we are talking about the final delivery format here, right? The final render that someone would use to actually watch the video?

I prefer x264 for the small file size and good quality. I always frameserve to Ripbot using 2-pass encoding. It has been my experience that at a bit rate that shows very, very good quality video in Ripbot, any encoder in Vegas at that same bit rate will show much lower quality.

And even Handbrake will start to break down and show macroblocking before Ripbot. I know they use the same encoder, but Ripbot's default settings are higher quality (and slower to transcode, of course).
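
For anyone who wants to script the same 2-pass idea instead of using a GUI, something like this works from Python driving ffmpeg's libx264. The file names and bitrate are placeholders, and these are generic x264 settings, not Ripbot's exact ones.

```python
import subprocess

SRC = "frameserver.avi"  # placeholder: e.g. the signpost AVI from a frameserver
OUT = "delivery.mp4"     # placeholder output name
BITRATE = "8M"           # illustrative 2-pass target bitrate

# Pass 1: analysis only; statistics go to the x264 log file, output is discarded.
subprocess.run(
    ["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264", "-preset", "slow",
     "-b:v", BITRATE, "-pass", "1", "-an", "-f", "null", "-"],
    check=True,
)

# Pass 2: the real encode, reusing the first-pass statistics.
subprocess.run(
    ["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264", "-preset", "slow",
     "-b:v", BITRATE, "-pass", "2", "-c:a", "aac", OUT],
    check=True,
)
```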

Just my two cents.

Now that I think about it, that is one feature that would be nice in the next version of Vegas: a really high-quality, 2-pass x264 encoder that is well optimized for OpenCL.
NormanPCN wrote on 12/4/2013, 4:33 PM
Now that I think about it, that is one feature that would be nice in the next version of Vegas: a really high-quality, 2-pass x264 encoder that is well optimized for OpenCL.


x264 is open source under the GPL license. Sony (a closed-source commercial product) cannot include x264 due to the GPL. If x264 were LGPL, that would be another matter.

I think there are ways around this. Sony could publish their encoder API and someone, like me, could write the x264 wrapper DLL and make that source GPL. This frees Sony from GPL restrictions. I don't know the GPL well enough, but possibly Sony could write and make available a GPL wrapper encoder DLL and not distribute it with Vegas. Make you download it.

If you want to use AVI files, then there is an x264 setup out there for Video for Windows, but I think that setup is no longer actively developed.

Also interesting is that x264 in its superfast modes is a similar speed to the GPU encoders at similar quality. Then x264 has its high-quality modes, which nothing touches.

As for OpenCL, or any GPU encoder use (render as): long-GOP encoders are not a very applicable target for GPU coding. To get performance from a GPU, your algorithm must allow for massively parallel use. If you want to hear about this from "the man", here is a link.
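
As a toy illustration of why the motion search is the one part that maps well to a GPU: every candidate offset's SAD is independent of every other, so they can all be evaluated at once. A sketch in Python/NumPy, with the nested loop standing in for what a GPU would do in parallel; the block size and search range are arbitrary.

```python
import numpy as np

def full_search_sad(block, ref, search=8):
    """Brute-force motion search: return the (dy, dx) offset with minimal SAD.

    Each candidate SAD below is independent of the others, which is the
    property a GPU needs. By contrast, stages like entropy coding depend
    on previously coded symbols and stay serial.
    """
    h, w = block.shape
    best, best_off = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = ref[search + dy:search + dy + h, search + dx:search + dx + w]
            sad = np.abs(block.astype(np.int32) - cand.astype(np.int32)).sum()
            if sad < best:
                best, best_off = sad, (dy, dx)
    return best_off

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (32, 32), dtype=np.uint8)  # padded reference area
block = ref[10:26, 11:27].copy()                      # 16x16 block, shifted by (2, 3)
print(full_search_sad(block, ref))                    # -> (2, 3)
```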

John_Cline wrote on 12/4/2013, 6:13 PM
The "OpenCL Acceleration of x264" video was FASCINATING, the implication is that any current implementation of encoding using the GPU will likely take a quality hit. I never use the GPU for encoding, I have a six-core processor and I'm much more interested in quality than speed. I'll just continue to encode using the CPU only.