Pixel format for Canon R5 CLOG3 10-bit Footage? (8-bit vs 32-bit)

Comments

MarkAnthony121 wrote on 2/17/2022, 10:22 PM

Do you mean 8 and 32 bit pixel format projects?

Yes sorry, changed.

The files open exactly the same in any combination of 8- or 32-bit projects, hardware (QSV) or software decoders.

This render uses the QSV hardware decoder, a 32-bit full-range project, no transform; on the right half the Canon LUT CinemaGamut_CanonLog3-to-BT709_BT709_33_FN_Ver.1.1, Levels, Gamma, and some grading were applied.


Yeah, what a strange situation. I wish I didn't get this anomaly using Intel QSV, so I wouldn't have to make proxies. I'll keep investigating why this might be happening.

Musicvid wrote on 2/17/2022, 10:42 PM

Try updating your QSV drivers.

Help->Check for driver updates.

MarkAnthony121 wrote on 2/17/2022, 10:52 PM

Nah, everything is freshly updated on Windows 11. I wish it were as simple as an outdated driver.

RogerS wrote on 2/17/2022, 11:39 PM

Unless you're testing on a system with an 11th-generation+ Intel CPU with an Intel GPU, you aren't testing the QSV hardware decoder with HEVC 10-bit 4:2:2.

Handy chart that refers to Premiere but should be the same for Vegas:
https://www.pugetsystems.com/labs/articles/What-H-264-and-H-265-Hardware-Decoding-is-Supported-in-Premiere-Pro-2120/

Mark, I wouldn't waste more time trying to figure it out: it's pretty clearly a decoding issue, judging from the type of error seen with QSV. Either file a support request or wait to see if Vegas staff reply to this thread. Not many users are running Vegas on these most recent CPUs with such files, or we'd see more complaints. Though if any are reading this far, it would be good to get them to confirm it fails on their 11th- or 12th-generation Intel CPUs, too.

Last changed by RogerS on 2/17/2022, 11:43 PM, changed a total of 1 times.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD and 2TB Samsung 980 Pro cache drive, Windows 11 Pro 64 bit https://pcpartpicker.com/b/rZ9NnQ

ASUS Zenbook Pro 14 Intel i9-13900H with Intel graphics iGPU with latest ASUS driver, NVIDIA 4060 (8GB) with latest studio driver, 48GB system ram, Windows 11 Home, 1TB Samsung SSD.

VEGAS Pro 21.208
VEGAS Pro 22.239

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

RogerS wrote on 2/18/2022, 12:12 AM

One other idea for a workaround is choosing a 10-bit AVC format in camera for now, assuming your camera has one with the other settings (resolution, framerate, etc.) you need. Vegas has supported those for longer and I remember trying a few out when the R5 and R6 came out.

MarkAnthony121 wrote on 2/18/2022, 12:20 AM

One other idea for a workaround is choosing a 10-bit AVC format in camera for now, assuming your camera has one with the other settings (resolution, framerate, etc.) you need. Vegas has supported those for longer and I remember trying a few out when the R5 and R6 came out.

Yes good point

Former user wrote on 2/18/2022, 1:11 AM

One other idea for a workaround is choosing a 10-bit AVC format in camera for now, assuming your camera has one with the other settings (resolution, framerate, etc.) you need. Vegas has supported those for longer and I remember trying a few out when the R5 and R6 came out.

Yes good point

I don't think your camera does AVC 10-bit; that's why it was so controversial on release. Nothing could edit it.

It looks like you're using the HEVC IPB full format, 4K HQ. You could try the ALL-I version, which will play back more easily without using the GPU decoder and so avoid the decoding bug. The bitrates are much higher: 470 Mbps for HQ, 350 Mbps for standard quality.

MarkAnthony121 wrote on 2/18/2022, 1:33 PM

'Twas a good idea, but ALL-I barely makes a difference. I only create wedding films. I suppose for the time being I can just work with the CLOG3 in an 8-bit project despite not maximizing its potential. I don't do any ultra-high-end work.

Former user wrote on 2/18/2022, 3:06 PM

Confirmed. The ALL-I format actually uses more CPU and is less playable than IPB. It seems like a completely pointless codec compared to ALL-I 4:2:2 AVC as created by other cameras, which is much easier to edit and shuttle through.

I didn't use Vegas for the comparison, as it would be pointless: both clips use a tiny amount of CPU and behave like slideshows.

Howard-Vigorita wrote on 2/18/2022, 3:22 PM

@MarkAnthony121 Been shooting CLOG3 a bit myself lately, but not 4:2:2, as the performance is lower in Vegas than 4:2:0. But I didn't get the messed-up green tint when I tried it on an 11900K, which decoded it OK, just not as smoothly as you describe with your 12900K. I'd suggest going with the latest Intel graphics drivers via the Intel Driver & Support Assistant, since your iGPU is fairly new. This seems to be the latest as of this minute: 30.0.101.1340.

Btw, I've been working with my CLOG3 footage by setting Vegas to 8-bit limited during edit with legacy HEVC unchecked, but switching to 32-bit limited with legacy HEVC enabled for the final render. And for some reason I've been getting a nicer range distribution putting the LUT on the Look side at the far right of the Color Grading panel rather than the Camera side at the far left... the upper end of the distribution looks less crushed to me in the scopes that way.

MarkAnthony121 wrote on 2/18/2022, 3:25 PM

Thanks I'll try that method!

Former user wrote on 2/18/2022, 4:02 PM

@MarkAnthony121 Been shooting CLOG3 a bit myself lately, but not 4:2:2, as the performance is lower in Vegas than 4:2:0. But I didn't get the messed-up green tint when I tried it on an 11900K, which decoded it OK, just not as smoothly as you describe with your 12900K.

@Howard-Vigorita could you try this file (4:2:2 1080p60)? Interested if you can play it at 60 fps in BEST/FULL with your Intel decoder, and report CPU use.

https://www.dropbox.com/s/jrif74cmeqpla3f/R5T_9417_Video_Cut.MP4?dl=0


MarkAnthony121 wrote on 2/18/2022, 4:14 PM

@MarkAnthony121 Been shooting CLOG3 a bit myself lately, but not 4:2:2, as the performance is lower in Vegas than 4:2:0. But I didn't get the messed-up green tint when I tried it on an 11900K, which decoded it OK, just not as smoothly as you describe with your 12900K.

@Howard-Vigorita could you try this file (4:2:2 1080p60)? Interested if you can play it at 60 fps in BEST/FULL with your Intel decoder.

https://www.dropbox.com/s/jrif74cmeqpla3f/R5T_9417_Video_Cut.MP4?dl=0


Exact same situation as with my footage. Buttery smooth, imports correctly. Once I change the project to 32-bit, the clip displays distorted. Then when I turn off QSV the clip displays correctly but is choppy.

Former user wrote on 2/18/2022, 4:22 PM

I'm trying to work out if the new Intel iGPUs somehow fix the HEVC problem where Vegas is limited to using only a small amount of CPU (around 20% on my CPU) and then dumps frames because it gets overloaded. That video plays terribly on my computer, but I don't have the GPU decoding that you guys have, which potentially reduces the CPU use for decoding below the point where it overloads... but maybe it is also able to use more CPU, and 12th-gen CPUs are almost the answer to this problem, except for the 32-bit bug.

Former user wrote on 2/18/2022, 5:09 PM

@MarkAnthony121 One last question: can you play back 60fps 4K 4:2:2 IPB at 60 fps in an 8-bit project?

If it's slideshowing, do the play/pause/play thing at the beginning to see if that helps.

If you haven't tried it, here is a sample: https://www.dropbox.com/s/p3ru2ko05u0bvei/R6_T3402_Cut.MP4?dl=0

MarkAnthony121 wrote on 2/18/2022, 5:30 PM

@MarkAnthony121 One last question: can you play back 60fps 4K 4:2:2 IPB at 60 fps in an 8-bit project?

The clip you provided plays back very poorly, and start/stopping doesn't help much. I recorded a clip on my R5 at 60fps 4K CLOG3 IPB, which should be 4:2:2, and that footage plays through once buttery smooth, THEN starts getting choppy. With the R5 footage, though, re-clicking or start/stopping does get it going more smoothly. Both tests were done in 8-bit full range mode with legacy HEVC/AVC checked, then again with them unchecked and QSV selected. Same results. Vegas makes me sad when you're using a 12900K processor and footage is still clunky. Like, come on.

Former user wrote on 2/18/2022, 6:07 PM

Both tests were done in 8-bit full range mode with legacy HEVC/AVC checked, then again with them unchecked and QSV selected. Same results. Vegas makes me sad when you're using a 12900K processor and footage is still clunky. Like, come on.

My guess is that your 12900K with 4:2:2 GPU decoding via QSV helps to hide the HEVC decode problem rather than fix it. Your CPU is still limited to using a small percentage when decoding HEVC, and if you were to conform that 60fps test file to 30fps or 24fps to play on a project timeline of the same frame rate, it would work fine. At 24fps the GPU decoding is doing such a good job that the CPU decode is kept very low, but at 60fps it overloads the limited number of CPU cores HEVC decoding has access to.
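
If you want to try that conform outside Vegas, one rough way to do it without re-encoding is to rescale the timestamps with ffmpeg. This is just a sketch of the idea, not something tested in this thread; the 2.5x scale turns a 60 fps clip into a 24 fps one, audio is dropped to keep it simple, and the output file name is made up:

# sketch: slow the timestamps 2.5x so the 60fps clip is treated as 24fps, stream copy only (no re-encode)
ffmpeg -itsscale 2.5 -i R6_T3402_Cut.MP4 -an -c copy R6_T3402_conformed_24p.MP4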

Maybe VP20 will be the one

MarkAnthony121 wrote on 2/18/2022, 6:31 PM

Both tests were done in 8-bit full range mode with legacy HEVC/AVC checked, then again with them unchecked and QSV selected. Same results. Vegas makes me sad when you're using a 12900K processor and footage is still clunky. Like, come on.

My guess is that your 12900K with 4:2:2 GPU decoding via QSV helps to hide the HEVC decode problem rather than fix it. Your CPU is still limited to using a small percentage when decoding HEVC, and if you were to conform that 60fps test file to 30fps or 24fps to play on a project timeline of the same frame rate, it would work fine. At 24fps the GPU decoding is doing such a good job that the CPU decode is kept very low, but at 60fps it overloads the limited number of CPU cores HEVC decoding has access to.

Maybe VP20 will be the one

I'm glad you brought this up, because I only tested 24 fps. Otherwise I would have filmed an entire wedding with many portions in 60 fps, wondering what was going on.

Yelandkeil wrote on 2/18/2022, 6:46 PM

Let's focus on the image distortion.
I'm happy that no problem occurs with AMD hardware for any pixel format in VEGAS.
As your hardware, @MarkAnthony121, is from the new generation, the problem can only be
1. the current Intel graphics driver, or
2. its compatibility/interaction with Nvidia.

As for timeline performance with such source material (a deeply compressed codec + high resolution at max frame rate and/or 10-bit 4:2:2), not only VEGAS but also the other NLEs have a long way to go.
Here's my test project in a 32-bit floating point (no ACES) full-range environment; the preview is after Dynamic RAM caching.

-- Hard&Software for 5.1RealHDR10 --

ASUS TUF Gaming B550plus BIOS3202: 
*Thermaltake TOUGHPOWER GF1 850W 
*ADATA XPG GAMMIX S11PRO; 512GB/sys, 2TB/data 
*G.SKILL F4-3200C16Q-64GFX 
*AMD Ryzen9 5950x + LiquidFreezer II-240 
*XFX Speedster-MERC319-RX6900XT <-AdrenalinEdition 24.12.1
Windows11Pro: 24H2-26100.3915; Direct3D: 9.17.11.0272

Samsung 2xLU28R55 HDR10 (300CD/m², 1499Nits/peak) ->2xDPort
ROCCAT Kave 5.1Headset/Mic ->Analog (AAFOptimusPack 6.0.9403.1)
LG DSP7 Surround 5.1Soundbar ->TOSLINK

DC-GH6/H-FS12060E_HLG4k120p: WB=manual, Shutter=125, ISO=auto/manual
HERO5_ProtuneFlat2.7k60pLinear: WB=4800K, Shutter=auto, ISO=800

VEGASPro22 + XMediaRecode/Handbrake + DVDArchi7 
AcidPro10 + SoundForgePro14.0.065 + SpectraLayersPro7 
K-LitecodecPack17.8.0 (MPC Video Renderer for HDR10-Videoplayback on PC) 

MarkAnthony121 wrote on 2/18/2022, 7:28 PM

Let's focus on the image distortion.
I'm happy that no problem occurs with AMD hardware for any pixel format in VEGAS.
As your hardware, @MarkAnthony121, is from the new generation, the problem can only be
1. the current Intel graphics driver, or
2. its compatibility/interaction with Nvidia.

As for timeline performance with such source material (a deeply compressed codec + high resolution at max frame rate and/or 10-bit 4:2:2), not only VEGAS but also the other NLEs have a long way to go.
Here's my test project in a 32-bit floating point (no ACES) full-range environment; the preview is after Dynamic RAM caching.

Yes, maybe I will play with the Dynamic RAM Preview a little.

Howard-Vigorita wrote on 2/18/2022, 9:09 PM
could you try this file (4:2:2 1080p60)? Interested if you can play it at 60 fps in BEST/FULL with your Intel decoder, and report CPU use.

https://www.dropbox.com/s/jrif74cmeqpla3f/R5T_9417_Video_Cut.MP4?dl=0

@Former user I was able to play it with the laptop I have with me in VP19... specs with driver versions are: Dell XPS15-9570; i7-8750H, 32GB (integrated Intel UHD 630 {.8935} & Nvidia GTX 1050 Ti {511.65}), Win10.

Very jumpy trying to play 4:2:2 HD HEVC at 60 fps, and absolutely no hardware decoding happening. Plays at approx 2.5 fps.

But if I transcode it with the ffmpeg NVENC encoder to 4:2:0, keeping all else the same, it plays Best/Full at almost 60 fps throughout, with Intel HD 630 utilization around 96% and CPU about 70%. I saw a slight bit of frame rate jitter, dropping to about 59.5 fps at moments, but the preview screen looks smooth as glass. The file size dropped to 427 MB (216 Mbps) even though I specified a CBR 230 Mbps bitrate. The command I used is:

ffmpeg -i "R5T_9417_Video_Cut.mp4" -c:v hevc_nvenc -pix_fmt yuv420p10le -profile:v main10 -rc cbr -b:v 230M -minrate 230M -maxrate 230M -bufsize 4096k -an -y out_420.mp4

Note that I also tried the libx265 codec. The output file was slightly smaller (412 MB / 209 Mbps) but takes about 5x longer to generate and plays not quite as well in Vegas, probably because my Intel iGPU decoder doesn't like it as much. And this isn't a particularly powerful laptop. With libx265 it plays in Vegas at 60 fps most of the time but drops as low as 14 fps for instants.
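
The exact libx265 flags weren't posted; presumably it was the same command with the encoder swapped, roughly along these lines (a sketch only, with the bitrate settings assumed to mirror the NVENC transcode and the output name made up):

# assumed libx265 equivalent of the NVENC command above (flags not posted; requires a 10-bit-capable libx265 build)
ffmpeg -i "R5T_9417_Video_Cut.mp4" -c:v libx265 -pix_fmt yuv420p10le -b:v 230M -maxrate 230M -bufsize 4096k -an -y out_420_x265.mp4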

Former user wrote on 2/18/2022, 11:41 PM

Very jumpy trying to play 4:2:2 HD HEVC at 60 fps, and absolutely no hardware decoding happening. Plays at approx 2.5 fps.

But if I transcode it with the ffmpeg NVENC encoder to 4:2:0, keeping all else the same, it plays Best/Full at almost 60 fps throughout, with Intel HD 630 utilization around 96% and CPU about 70%. I saw a slight bit of frame rate jitter, dropping to about 59.5 fps at moments, but the preview screen looks smooth as glass. The file size dropped to 427 MB (216 Mbps) even though I specified a CBR 230 Mbps bitrate. The command I used is:

ffmpeg -i "R5T_9417_Video_Cut.mp4" -c:v hevc_nvenc -pix_fmt yuv420p10le -profile:v main10 -rc cbr -b:v 230M -minrate 230M -maxrate 230M -bufsize 4096k -an -y out_420.mp4

@Howard-Vigorita Thanks for trying that. I'm also interested in NVENC transcodes and problems with Vegas. I tried two transcodes, one x265 and one NVENC HEVC. The result in this case is very similar: both are very smooth at full frame rate and recover quite well after edit points, but I also saw the slight frame jitter you describe, though mainly as an NVENC thing.

This led me to believe the NVENC encode was barely holding on to 60 fps, so I changed the project to 120 fps and conformed the transcodes (2x playback). Both transcodes played back smoothly at 120 fps, both showing slight frame-rate jitter, but much more frequently, and with NVENC much more frequently than x265. NVENC also took longer to recover after edit points.

Pleasantly surprised by high-frame-rate 4:2:0 HEVC NVENC at 1080p.


Howard-Vigorita wrote on 2/19/2022, 2:42 PM

@Former user Just tried my laptop's onboard Intel iGPU to encode the 4:2:0 transcode with ffmpeg (-c:v hevc_qsv), and that worked just as well as the Nvidia. I'm a little puzzled that I'm seeing better performance playing NVENC than you. Maybe because I was set to use the Intel iGPU for decoding. I just switched it to use the Nvidia and got spotty playback with both QSV and NVENC encodes until a replay, but much better from x265, more like before. Beats me why that is.
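
The QSV transcode would presumably be the earlier NVENC command with the encoder swapped; a rough sketch (the exact flags weren't posted, p010le is assumed as the 10-bit input format QSV expects, and the output name is made up):

# assumed hevc_qsv variant of the earlier NVENC transcode (flags not posted)
ffmpeg -i "R5T_9417_Video_Cut.mp4" -c:v hevc_qsv -pix_fmt p010le -profile:v main10 -b:v 230M -maxrate 230M -bufsize 4096k -an -y out_420_qsv.mp4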

Also ran timers on the transcodes, and the ffmpeg NVENC encoder was actually 16x faster than x265; QSV was only 13x faster... the same ratio on both my laptop and the quicker NUC8, although GPU and CPU utilization was lower and run times were quicker for QSV and x265. Btw, playback on the NUC8 looked the same as on my laptop if I used the HD 630 iGPU for decoding on both. But if I used the AMD Vega-M iGPU to decode on the NUC, there was no jitter at all in the playback rate... it was rock solid 60p. I wouldn't recommend that in practice, however, if there were any FX, multi-track, or transitions to deal with. AMD encoders cannot do 10-bit, so I cannot compare that.

Former user wrote on 2/19/2022, 5:54 PM

@Former user Just tried my laptop's onboard Intel iGPU to encode the 4:2:0 transcode with ffmpeg (-c:v hevc_qsv), and that worked just as well as the Nvidia. I'm a little puzzled that I'm seeing better performance playing NVENC than you. Maybe because I was set to use the Intel iGPU for decoding. I just switched it to use the Nvidia and got spotty playback with both QSV and NVENC encodes until a replay, but much better from x265, more like before. Beats me why that is.

I've definitely seen better performance using x264 over NVENC H.264 when working with 4K60 AVC, with NVENC playback being largely unplayable after edit points or even when starting to play the timeline; play/pause/play will often get it going. x264 plays a lot better, but can start lagging and recovering at any point. My guess at the reason is that NVENC is not perfectly CFR; there's the tiniest amount of drift, even though MediaInfo doesn't pick it up (could be wrong, though).
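
One way to check that guess, assuming ffprobe is at hand, is to dump the per-packet timestamps and durations from the transcode; a truly constant-frame-rate file should show the same duration on every video packet (file name here taken from the earlier NVENC command, swap in whichever transcode you want to inspect):

# sketch: list pts and duration of each video packet to look for timestamp drift
ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,duration_time -of csv=p=0 out_420.mp4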

In this test case, however, I was talking in absolutes, about the actual frame counter: it jitters more with the NVENC encode. In actual playback they play equally smoothly, except the NVENC transcode takes longer to recover after the edit point. In the video, the first clip is x265, the second NVENC.

EDIT: I mistakenly uploaded 90 fps playback on a 120 fps timeline. This is 120 fps on a 120 fps timeline.