Rendering to 4K produces artefacts and random frame drops

Comments

Wolfgang S. wrote on 3/3/2023, 7:31 AM

So you are no longer looking for a solution to your rendering showing blocking artefacts? Fine by me.

For the rest: generally speaking, a 32-bit workflow delivers a higher-quality result, especially if you base it on 10- or 12-bit footage (or better). So for sure, a 32-bit workflow can help you avoid banding, especially banding introduced by color grading. So yes, that can work.
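To make the banding argument concrete, here is a minimal NumPy sketch (the values are purely illustrative, not taken from any camera): the same flat gradient is graded once through an 8-bit pipeline and once at float precision, and the 8-bit version ends up with far fewer distinct output levels - i.e. visible bands.

```python
import numpy as np

width = 3840
ramp = np.linspace(0.45, 0.50, width)      # a flat, low-contrast area such as a sky or a wall

def grade(x):
    # a simple contrast push, standing in for a color grade
    return np.clip((x - 0.30) * 3.0, 0.0, 1.0)

# 8-bit pipeline: values are quantized before grading and again for output
in_8bit = np.round(ramp * 255) / 255
out_8bit = np.round(grade(in_8bit) * 255).astype(np.uint8)

# float pipeline: grade at full precision, quantize only once at the end
out_float = np.round(grade(ramp) * 255).astype(np.uint8)

print("distinct output levels, 8-bit pipeline:", len(np.unique(out_8bit)))
print("distinct output levels, float pipeline:", len(np.unique(out_float)))
# fewer distinct levels across the same ramp = wider, more visible bands
```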

But it does not come for free, since nothing comes for free. You pay for it with significantly longer render times. You pay for it by using high-quality codecs (long-GOP HEVC is the very low end to start from; most productions run with All-I codecs and high-quality formats like ProRes, or even RAW). So it brings the cost of high-end cameras, and you may end up with larger storage requirements. And you will end up needing more powerful PCs and even faster storage (I have invested in a 10 Gb LAN, fast enough that I can edit RAW footage from my 10 Gb NAS).

And it takes you into the realm of log footage, which will require significantly longer grading times in post-production.

Even if you keep it small, where is the beginning and where is the end? From a commercial perspective, especially for the wedding filmmaker, the real question is whether the market is willing to pay for such a wedding video at the end of the day. Such a wedding video must be more expensive than the majority of wedding videos, shot in 8-bit and produced in the minimum time required to cut them. Since most couples seem unwilling to pay a significant amount of money (at least not here in Europe), my impression is that most wedding filmmakers avoid spending a lot of money on the necessary equipment and workflow. Especially in the States, the situation seems to be different for a high-end, high-priced segment, is my impression. But I would recommend being careful here, and would explore the market you can serve in some detail - instead of becoming technology-driven.

The same is true for HDR - for most projects, a lot of filmmakers try to shoot the footage in a way that lets them use it without spending much time in post-production. As long as the customer is not willing to pay for the additional effort of a 10-bit or 32-bit workflow, they avoid it.

This also leads to the other finding, from the customer's perspective: if you want a wedding video shot in 10-bit and graded to HDR, it was very hard here to find a wedding filmmaker willing to offer that. I found one for my wedding, but for sure it was more expensive and hard to find - and I knew exactly what I wanted. But this seems to be a very small market segment, is my impression.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8 GB * internal HDR preview on the laptop monitor * Blackmagic UltraStudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX) card in an old XP system. Edius 4.6 and other systems

Komaryt wrote on 3/3/2023, 8:02 AM


For a fast response - yes, I think I fixed the artefacts on rendered files 😊 It was a problem with the NVIDIA drivers. I installed an older one and after that everything looks fine 😊 Thank you for your time!

As for the rest, I will read it in the evening, because right now I have an urgent job to finish - you wrote a lot of very interesting things which I need to read in peace 😊

Former user wrote on 3/3/2023, 8:01 PM
 

While I think that it is not true that the 32-bit mode is not supported by the GPU, it is still a mode that requires a lot of calculation power.

@Wolfgang S. I was hoping that wasn't true, because GPU use is how the competition achieves 32-bit float internal processing by default, instead of the 8-bit default in Vegas, with 32-bit float available only if you're willing to tolerate the sluggish performance.

Unfortunately Vegas is using the GPU - CPU use, when the GPU is turned off, doubles for 32-bit float compared to 8-bit. That's really disappointing... so Vegas is stuck in an 8-bit world while its competition is all 32-bit, because even if you work in 8-bit on the timeline and render in 32-bit float, as @Komaryt pointed out, it's too slow to be usable, and he has modern, powerful hardware.

 

Wolfgang S. wrote on 3/4/2023, 2:43 AM

Todd,

some general comments:

We are now talking about general playback performance, which you need in order to edit and grade the footage properly. You need a certain number of fps in the preview to be able to do that.

Fair enough to say that HEVC decoding performance in Vegas may be lower than in Resolve. It also depends on the frame rate you shoot at. For example, with FS7 XAVC-I footage in UHD 50p I have seen that it is hard to achieve the full frame rate in the preview. But with UHD 25p footage, you will get your 25 fps.

Your preview performance will depend on the decoder you use and the GPUs you have in your system. Especially for the GPU, please be aware that you can adjust the settings for decoding and for the GPU used in two places in Vegas, which I will show below.

Important: what you see here in Vegas may depend on the footage you use. Please be aware that HEVC requires dramatically higher decoding performance. Knowing that, you may come to the conclusion that you do not want to shoot in HEVC at all - and switch to H.264.

But it is simply wrong to state that in the 32-bit mode the GPU is not utilized, or that the hardware is not powerful enough.

To give you an impression of what I see here, even on my laptop (for details see my signature), here is what I get with UHD 25p GH4 AVC footage. As shown in the media properties, it is AVC.

Since it was shot on the GH4, I know from the camera settings that the footage is full range (or at least 0..235), since you can set that in the camera. So I have set the media properties to full range for these media files.

Most of the parameters are also confirmed by the tool MediaInfo. What is not confirmed is the full range, but MediaInfo relies on metadata only (and I know what I set in the camera):

You can see here that it is AVC, but with no long-GOP structure. So it is All-I H.264, 8-bit UHD 25p.
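The same two checks can also be done without MediaInfo; a minimal sketch like the following (assuming ffprobe from FFmpeg is installed, with a placeholder filename) prints the declared color range from the stream metadata and estimates the GOP length from the spacing of the I-frames.

```python
import subprocess

clip = "input.mp4"  # placeholder path

# stream-level metadata: codec, pixel format and the *declared* color range
# (metadata only, just like MediaInfo - it cannot tell you what the camera really recorded)
meta = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,pix_fmt,color_range",
     "-of", "default=noprint_wrappers=1", clip],
    capture_output=True, text=True, check=True)
print(meta.stdout)

# picture types of roughly the first 250 frames; the spacing of the I-frames is the GOP length
# (spacing of 1 = All-I, e.g. XAVC-I; 12 or more = long-GOP)
frames = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-read_intervals", "%+#250",
     "-show_entries", "frame=pict_type", "-of", "csv=p=0", clip],
    capture_output=True, text=True, check=True)
types = frames.stdout.split()
i_frames = [i for i, t in enumerate(types) if t == "I"]
if len(i_frames) > 1:
    print("GOP length (I-frame spacing):", i_frames[1] - i_frames[0])
else:
    print("only one I-frame in the sampled range - the GOP is longer than the sample")
```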

You can edit that with the typical 8-bit project settings,

and since I know that the combination of 8-bit video levels and 32-bit video levels works fine here, I use this combination for the desired 10-bit workflow (even though this test footage is 8-bit only, but that is less important).

GPU settings are

and

With the 8-bit project settings and the preview set to Best/Full, I get the full 25 fps for this PAL footage.

If I switch to 32-bit floating point (video levels) project settings, without any ACES transformation - since that is not required for the Rec.709 footage used here (nor for the footage used by the thread starter) - I have the following settings:

In terms of playback, I still see the same 25 fps at preview quality Best/Full. So there is no significant performance drop here from changing from 8-bit to 32-bit.

How is the GPU utilization in both cases? Well, in the 32-bit mode you see the following picture (without grading or any FX applied):

So CPU utilization is about 21%, which is really low for the 32-bit mode. The Intel GPU supports playback, so it is quite a good idea to have a processor with an iGPU - here at 16%. And the laptop RTX 3070 Ti with 8 GB RAM runs at 31%. So nothing at critical levels, and we see the full 25 fps at the best preview quality.

To answer your question about how that changes if you go to the 8-bit mode, here is the picture:

And here I have taken a snapshot where you can see the change in the project settings from 32-bit to 8-bit (marked in red, here the memory utilization). You can also see that the RTX 3070 Ti's GPU utilization goes down from 31% to 23%. It does not look that significant in the picture, but this drop from 31% to 23% shows that the GPU is utilized more heavily with the 32-bit project settings than with the 8-bit project settings.
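For anyone who wants to reproduce these utilization figures outside of Task Manager, a minimal sketch like this (an NVIDIA card and nvidia-smi on the PATH are assumed) logs the GPU load once per second while the timeline is playing:

```python
import subprocess, time

for _ in range(10):  # sample for roughly 10 seconds
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    print(out.stdout.strip())  # e.g. "31, 4215" -> 31 % GPU load, 4215 MiB memory used
    time.sleep(1)
```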

Even with some quick color grading using the color grading tool, I see no significant changes in the 32-bit mode. So there are enough reserves here:

Conclusion: if you use the right footage - here All-I H.264 - you can also work in the 32-bit floating point mode, even with the limited performance that I have here on my laptop. The 32-bit mode is supported by the GPU, but only to a limited extent. Maybe that is the reason why you do not see huge GPU load here.

If I find the time, I can try a similar exercise with H.265 footage on this machine (but this may take a few days, since I do not have the footage available here at the moment).


Former user wrote on 3/4/2023, 10:11 PM

@Wolfgang S. That's a good educational post, but let's talk about this 32-bit float slowdown.

Instead of more resources being used for 32-bit, there's less GPU and CPU use, and therefore lower encoding speed. There's a bottleneck somewhere slowing things down dramatically. Yes, Vegas is using the GPU for this, but its performance gave me the impression it didn't.

Here's a comparison for 1080p AVC 24 fps (8-bit encode):

VP20b326 8-bit - 192 fps

VP20b326 32-bit - 87 fps

Resolve 32-bit - 411 fps (bottlenecked by the encoder)

 

This is something that's going to be fixed eventually, but currently it's not acceptable. There should be no slowdown for 32-bit, as is the case with every other modern NLE - to such a degree that there is no 8-bit option, because there's no need: 32-bit is just as fast. *GPU required

Also, I should mention that although the 32-bit Vegas encode runs at 87 fps, the timeline playback frame rate would drift below the 24 fps of the video.

 

And here's the dramatic difference: 32-bit project, 10-bit encode - 1080p AVC 24 fps

VP20b326 32-bit - 39 fps

Resolve 32-bit - 367 fps
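To put those encoder-fps figures into wall-clock terms, here is a small sketch that converts them into render times for a hypothetical 30-minute 24 fps edit (the project length is an assumption; the fps values are the ones measured above):

```python
clip_minutes = 30
frames = clip_minutes * 60 * 24  # 43,200 frames at 24 fps

benchmarks = [("VP20 b326, 8-bit", 192),
              ("VP20 b326, 32-bit", 87),
              ("VP20 b326, 32-bit -> 10-bit", 39),
              ("Resolve, 32-bit", 367)]

for label, encode_fps in benchmarks:
    minutes = frames / encode_fps / 60
    print(f"{label:30s} about {minutes:5.1f} min to render")
```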

 

Wolfgang S. wrote on 3/5/2023, 2:02 AM

There will always be a slowdown for 32-bit floating point, since this mode requires significantly more calculations. There is nothing for free.


walter-i. wrote on 3/5/2023, 2:29 AM

There is nothing for free.

👍👍👍👍👍👍👍👍👍👍👍👍👍👍

3POINT wrote on 3/5/2023, 3:38 AM

There is nothing for free.

👍👍👍👍👍👍👍👍👍👍👍👍👍👍

There is - DR is for free....😁

Wolfgang S. wrote on 3/5/2023, 4:46 AM

But not the Studio version, which is required for professional work like HDR.


3POINT wrote on 3/5/2023, 5:19 AM

Did I mention the Studio version?

Former user wrote on 3/5/2023, 6:50 PM

There will always be a slowdown for 32-bit floating point, since this mode requires significantly more calculations. There is nothing for free.

@Wolfgang S. Not for free, but almost free with the other NLEs, because - like GPU scaling (which Vegas seems to do well) - GPUs are very efficient at this. Here's another example: Premiere, same 2-minute 1080p24 AVC, 32-bit project to 10-bit encode.

Premiere, not known for speed, is using 90% of the hardware encoder, Resolve almost 100%, Vegas 10%. I'm out of pro NLEs - I could show you the same on consumer NLEs, but by now you must see that Vegas is stuck in an 8-bit world, while the other NLEs have evolved to perform efficiently at 32-bit precision.

This reminds me of the MagixAVC/HEVC hardware encoding. People have made excuses for it since it was released - it glitches - but people never thought of it as a Vegas problem, instead blaming a limitation of hardware encoding. They conflated the lower efficiency, and to a degree lower quality, of hardware encoding with a Vegas hardware-encoding bug. That's what I noticed when I first came here: this hatred and disgust for anyone using hardware encoding. So strange - I knew there would be more to this story.

Excuses were also made for HEVC decode, saying something similar to what you said about 32-bit float. Only it turns out that wasn't true either - it was another Vegas limitation, mostly using a single CPU core to assist GPU decode instead of multiple cores. Thankfully they appear to be working on that problem right now, and next could be the 32-bit float slowdown - but probably not if everyone keeps defending the lack of performance and normalizing it.

You can be a Vegas superfan and still be critical of its limitations.

3POINT wrote on 3/5/2023, 7:47 PM

You can be a Vegas superfan and still be critical of its limitations.

+1

Wolfgang S. wrote on 3/5/2023, 11:22 PM

Oh Todd, I am most critical of Vegas, and know quite well the shortcomings as well as the strengths of this software. And be aware that I work with Resolve Studio too, especially to grade my 6K BRAW footage. I have explored the BRAW workflow in Vegas as well - you can find the results here in the tutorial area - and I have identified a lot of bugs in the BRAW workflow in Vegas.

So please do not tell me anything about being a superfan or not being critical of limitations. The difference is that I am fact-oriented - see the "educational" post above. And it is a fact that I have shown you that the GPUs are utilized at 30% + 16% = 46% on my system, yet you still tell me that the GPUs are utilized at 10% only. I have shown you that this is not true.

And this 46% was for the relatively simple 32-bit floating point calculation, without any ACES transformation. If you apply an ACES transformation, it goes even higher.

It is true that Resolve utilizes the GPUs even better. But as I said, whether you can grade footage also depends on the type of footage. Either you choose a type of footage that can be graded in an acceptable way, or you choose footage like HEVC - especially from some cameras - that is hard to grade. Up to you. Even if I understand your frustration.

I cannot confirm that the development team will improve GPU utilization for the 32-bit workflow. While there are significant improvements on the way that I know of for sure, I do not know what kind of improvements will become available to the public, or when.



john_dennis wrote on 3/6/2023, 2:07 AM

@Former user

I wonder why you have adopted the TODD-AO moniker.

Former user wrote on 3/6/2023, 6:04 PM

@john_dennis I'm a huge fan of the early widescreen technologies like Cinerama and TODD-AO. I even love the optical compression "distortion" caused by Panavision anamorphic lenses when pulling focus.

GPUs are utilized at 30% + 16% = 46% on my system, yet you still tell me that the GPUs are utilized at 10% only. I have shown you that this is not true.

@Wolfgang S. I"m not doubting you, after you suggested it was done on the GPU, I tested myself, you're 100% correct. The reason I gave the GPU encoding percentages is because I couldn't give a FPS for Premiere as it doesn't show that. So gave the relative hardware encoder use figures, close to 100% encode Resolve, 90% encode Premiere, Vegas 10%. With my GPU using HEVC 10bit, a sustained 100% encode looks to be around 400fps.

I cannot confirm that the development team will improve GPU utilization for the 32-bit workflow. While there are significant improvements on the way that I know of for sure

Well you know what's not here anymore? A frosty post that I knew was posted but never got to read. I suspect he said too much, and it's probably good news.

Wolfgang S. wrote on 3/7/2023, 12:57 AM

@Former user

Please understand that it is up to the development team to say something about future development intentions, if they wish to do so. They do a lot in the background, and we all wish to see Vegas improved further.

I have shown you that Vegas utilizes the GPUs in my laptop to the extent that I get full playback during editing. This was shown for AVC-I, which is basically H.264 footage. Frankly speaking, that is the target - playback that is good enough to edit and grade the footage. As long as that is given, it is fine for me.

If our thread starter @Komaryt intends to work as a professional wedding filmmaker, then the best advice we can give here is to use formats that can be edited in our NLEs with good playback behaviour. And in my opinion that is not HEVC. I checked my own wedding video files, shot by another wedding filmmaker, and had the impression that they were shot as HEVC on the Panasonic S1. I was wrong here - only some drone clips were shot as HEVC. All the other videos were shot in 10-bit H.264 long-GOP with V-Log, as I do with my EVA1 too.

And if you shoot H.264 long-GOP with log, the best way to edit that in Vegas is to apply an ACES transformation and transform the footage to HDR10 or even Rec.709. Some people also apply LUTs to do that.
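For readers wondering what "applying a LUT" actually does numerically, here is a minimal NumPy sketch (the identity LUT at the end is only a stand-in for a real camera-supplied V-Log-to-Rec.709 cube): every RGB value is looked up in a small lattice and trilinearly interpolated.

```python
import numpy as np

def apply_3d_lut(img, lut):
    """img: float HxWx3 in 0..1, lut: NxNxNx3 lattice of output RGB values."""
    n = lut.shape[0]
    pos = np.clip(img, 0.0, 1.0) * (n - 1)       # continuous lattice coordinates
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                                  # fractional part per channel
    out = np.zeros_like(img)
    # trilinear interpolation over the 8 surrounding lattice points
    for dr, dg, db in np.ndindex(2, 2, 2):
        r = hi[..., 0] if dr else lo[..., 0]
        g = hi[..., 1] if dg else lo[..., 1]
        b = hi[..., 2] if db else lo[..., 2]
        w = ((f[..., 0] if dr else 1 - f[..., 0]) *
             (f[..., 1] if dg else 1 - f[..., 1]) *
             (f[..., 2] if db else 1 - f[..., 2]))
        out += w[..., None] * lut[r, g, b]
    return out

# identity 17-point LUT (output equals input), just to exercise the code
n = 17
grid = np.linspace(0.0, 1.0, n, dtype=np.float32)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
frame = np.random.rand(4, 4, 3).astype(np.float32)
assert np.allclose(apply_3d_lut(frame, identity_lut), frame, atol=1e-4)
```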

If I find the time, I will post some more details later about grading 10-bit H.264 V-Log long-GOP footage. You will see that GPU utilization goes up there, but that the footage can still be edited very well in Vegas, even on the laptop I use. But for sure, you have fewer reserves here compared to the All-I H.264 footage.

 



Wolfgang S. wrote on 3/7/2023, 6:48 AM

I did a test with EVA1 V-Log 10-bit UHD 25p footage, but I noticed that it was All-I H.264 footage.

So, in MediaInfo you can see the GOP structure N=1:

Since it is V-Log/V-Gamut footage, I adjust the input transformation to this type of footage:

Please be aware that MediaInfo shows this footage as Rec.709. However, I know that it was shot on the EVA1 with V-Log, since I did it myself. This is why I set it to Panasonic V-Log/V-Gamut.

Since I want to transform the log footage to HDR, I use the HDR10 project settings:

And the preferences are unchanged:

The GPU load is higher here, since I apply an ACES transformation to HDR10. This is why I set the preview to Best/Half. For some clips the preview then runs at 25 fps, for some at about 18-20 fps. At Best/Full, it would be about 10-15 fps.

GPU utilization is higher here:

So somewhere around 60% for the RTX 3070 Ti and about 30% for the iGPU. Not so bad, I think.


Wolfgang S. wrote on 3/7/2023, 7:24 AM

I also did that with EVA1 V-Log 10-bit UHD 25p long-GOP footage. For this footage you no longer find N=1, but instead the long-GOP structure M=1 N=12 in MediaInfo:

Again, the input transform for the ACES transformation is set to V-Log/V-Gamut:

and the project properties are set to transform to HDR10, based on 32-bit floating point (full range):

and the other preferences are unchanged:

Since I again expect higher system utilization here, I set the preview quality to Best/Half again.

With these settings, I again get my 25 fps, which is quite stable.

Given the ACES transformation to HDR10 AND, in addition, the long-GOP structure, one would expect the highest GPU utilization in this case of all my examples so far:

And that is true: the RTX 3070 Ti is at about 60-70%, the iGPU at about 20%, and CPU utilization is about 40%.

In this snapshot you can also see the Best/Half preview, the ACES transformation to HDR10 and the 25 fps. In addition, the internal preview was set to HDR, since the laptop used is HDR-capable. Memory utilization is high, at 70% of the 32 GB available.

So, all together, the system can be used for this process without an issue.

I also wanted to see what happens if I apply some test color grading, which you can see in the next snapshot. And that is a funny result - if I enable the color grading, the GPU utilization drops to about 40%, but the preview also drops from 25 fps to 15 fps. So this could be better, since the GPU has enough headroom - it should be possible to maintain full preview performance:


Former user wrote on 3/8/2023, 12:04 AM
 

I also wanted to see what happens if I apply some test color grading, which you can see in the next snapshot. And that is a funny result - if I enable the color grading, the GPU utilization drops to about 40%, but the preview also drops from 25 fps to 15 fps. So this could be better, since the GPU has enough headroom - it should be possible to maintain full preview performance:

@Wolfgang S. I also found ACES does a similar thing. I'm more interested in efficiency rather than how much GPU processing is used (different from GPU encoder processing) - ACES created a considerable slowdown.

I wanted to take AVC/HEVC decoding out of the picture as a possible bottleneck, so I tried some 4K ProRes 422 from an iPhone 13. For technical reasons I had to use a 1440p project and encode: the GPU encoder should act as a 'bit bucket' and should not be responsible for a slowdown, but at 4K CapCut saturated the encoder. I also wanted to use Voukoder so there would be no encoder-side slowdown, and that too required changing the project to 1440p, otherwise it would encode at 4K.

So this is a comparison of HLG HDR encoded to 10-bit HEVC with NVENC, from a 4K ProRes 422 source (1440p 32-bit project).

No color grading is done; it's as default as I could make it. The Vegas encode speed is tolerable, but there's no extra load from color grading, extra tracks and all the other adjustments and FX you'd make in a typical edit. In the comparison, CapCut does what I hope Vegas will be capable of.
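As an external reference point for how fast the hardware encoder itself can go, a sketch along these lines (ffmpeg with the hevc_nvenc encoder is assumed; filenames and bitrate are placeholders) produces a 10-bit, HLG-tagged HEVC file straight from a ProRes master:

```python
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "prores_master.mov",        # placeholder input
    "-c:v", "hevc_nvenc", "-profile:v", "main10",      # NVENC HEVC, 10-bit profile
    "-pix_fmt", "p010le",                              # 10-bit pixel format
    "-b:v", "50M",                                     # placeholder bitrate
    "-color_primaries", "bt2020",                      # HLG (ARIB STD-B67) HDR tagging
    "-color_trc", "arib-std-b67",
    "-colorspace", "bt2020nc",
    "-c:a", "copy",
    "hlg_hevc_nvenc.mp4"],
    check=True)
```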

You do a good job of showing how to edit on the timeline in 32-bit, but the main complaint the user had was the overall slowdown of the encoding, and it's not a limitation of his hardware.

Wolfgang S. wrote on 3/8/2023, 12:21 AM

For sure, the encoding and encoder speed should be improved in Vegas. It is fine with me if you use ProRes. I did that too, but today I tend to use BRAW, and I also use V-Log and S-Log from my different cameras. For BRAW, I have come to the conclusion that the current implementation really must be improved to become workable. For V-Log and S-Log - well, this can be done in Vegas quite well.

The complaint here about encoding was about artefacts caused by NVIDIA drivers.

For a fast response - yes, I think I fixed the artefacts on rendered files 😊 It was a problem with the NVIDIA drivers. I installed an older one and after that everything looks fine 😊 Thank you for your time!

The complaints about slow encoding come mainly from your side, but before you complained about the slow encoder you complained about the ACES workflow and the low GPU utilization. While that is all fine with me, I wonder what you want to achieve. The team is working on improvements, but what more do you expect at the moment?


Komaryt wrote on 3/8/2023, 6:36 AM

Hello,
I have had some very hard days and I lost my mind - a nervous breakdown - because of this...

So here's the thing: the banding I had is not because of Vegas settings - the S5 II simply doesn't have good enough image quality to deliver video without this banding... Official Panasonic expert statement:

"In my opinion the 72 Mbps coding bitrate is not enough for a professional quality 4K resolution 10 bit color depth video data stream. It is very likely that the banding is generated because of this reason. DaVinci Resolve is a fantastic editing software, I believe it has an algorithm that can reduce the banding effect. Anyway I suggest to use a higher coding bitrate.”

To me, "use a bigger bitrate" is just a silly answer, because I was using .mov at 150 Mbps and 200 Mbps and the banding was still there. I tried everything - changing the project pixel depth to 32-bit, rendering HEVC in 10-bit - and it was still there. When the frame was static it was not visible, but when I moved the camera to the left or right it was still there. Some people say that I use the wrong project and render settings, but for me and my job it is not profitable to edit everything at 32-bit depth, because 1 minute takes about 10 minutes to render... And I had a short period with the HEVC format, and my clients were not able to open HEVC H.265 files on their TVs, so that's a dead end for me. The smallest amount of banding is in the C4K format, but shooting weddings in that format is suicide in terms of disk space for me.
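For context on the bitrate numbers in question, here is the bits-per-pixel arithmetic for UHD, assuming 25 fps (adjust the frame rate for your footage):

```python
width, height, fps = 3840, 2160, 25

for mbps in (72, 150, 200):
    bpp = (mbps * 1_000_000) / (width * height * fps)
    print(f"{mbps:3d} Mbps -> {bpp:.2f} bits per pixel per frame")
# raising the bitrate raises bits/pixel, but banding in flat gradients is largely a
# quantization issue, which may be why a higher bitrate alone did not remove it here
```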

I am in contact with a shop - maybe there is a chance they could swap the one-month-old S5 II for a Canon R7 and everything will be alright. On Canon cameras I never had this issue, even when the image was 8-bit. I also had 10-bit footage from the R6/R5 and banding was not visible in it either, and I didn't change the pixel depth in the project settings - I did everything in 8-bit. So for me, Panasonic just did something wrong with this camera and they can't fix it... Even YouTube delivers 8-bit video, so when I render everything in 10-bit and upload to YouTube, there is still banding.

Reyfox wrote on 3/8/2023, 6:44 AM

@Komaryt could you possibly upload a good example of a file that has banding to the cloud?

Wolfgang S. wrote on 3/8/2023, 6:53 AM

@Komaryt

Good to see you back! :)

It seems the S5 II was the wrong camera for you - maybe the S5 IIX would have been better, given that camera's better All-I codecs?

It is hard to overcome banding in post if it is in the original footage. Good to know.

Sure, I understand the commercial limitations of using either the 32-bit floating point workflow or grading in Resolve - and as I said earlier, I totally agree that HEVC is hard to use in a commercial environment.

The Canon R7 is an excellent choice I think.


Komaryt wrote on 3/8/2023, 7:00 AM

Guys, it is really hard to describe what I am feeling right now... When I just think about it, I cry like a child - not because of the banding itself, but it caused many problems, and the S5 II just... It all exploded inside me.

The R7 was my first choice. I never had that issue with Canon cameras in 8-bit video files - of course there was a lot of grain in the shadows, but I never saw any banding there. The biggest problem is that I bought this camera a month ago and saw this on the first day of use, but I thought I had set something wrong because it's a new system, etc. Then my friend confirmed my suspicions, and after that I just broke down... On Saturday I had a small job to do and it just confirmed that the S5 II was a really bad choice. The AF, which was supposed to be so perfect in everyone's tests, was just meh... It can't track any point I want to track, the camera didn't even recognize the point I chose, and other not-so-good things about it.

Thank you all for trying to help me, but there is no fix for this image. I hope I will recover from this breakdown and start thinking normally again, but right now I am a wreck of a man, thinking that I have lost so much money...