Understanding 32-Bit Hardware Requirements

GarrettMJKD wrote on 5/8/2023, 2:52 PM

So at heart, I've always tended to be an all-around techie, even before diving into film school and IT.

I've been hoping to get a better grasp on why Vegas requires, on the AMD side, a Radeon Pro card for 32-bit/HDR projects, when it's fairly common knowledge that the chips tend to be the same on both sides of the product stack (with Nvidia, of late, having been found to physically disable parts of the same chip between the RTX and A series, on top of software gimping).

Comments

Reyfox wrote on 5/11/2023, 7:14 AM

I'd be curious what the difference is between a Radeon Pro and a "regular" Radeon card.

Newbie😁

Vegas Pro 22 (VP18-21 also installed)

Win 11 Pro always updated

AMD Ryzen 9 5950X 16 cores / 32 threads

32GB DDR4 3200

Sapphire RX6700XT 12GB Driver: 25.3.1

Gigabyte X570 Elite Motherboard

Panasonic G9, G7, FZ300

Wolfgang S. wrote on 5/11/2023, 7:29 AM

Are you referring to the technical requirements of Vegas?

https://www.vegascreativesoftware.com/us/specifications/

If you are referring to the fact that for AMD they list

AMD/ATI® Radeon with 4GB and VCE 3.0 or higher (Radeon Pro series with 8GB for HDR and 32 bit projects)

but for NVIDIA only

NVIDIA® GeForce RTX or GTX 9XX series or higher with 4GB (8GB RTX series recommended for 8K)

then this is misleading. I run HDR projects with NVIDIA cards on both my desktop and my laptop, and it works fine for HDR as well (the OLED in the laptop in particular works great with Vegas, since it can be set to HDR).

https://www.asus.com/de/laptops/for-creators/proart-studiobook/proart-studiobook-16-oled-h7600-12th-gen-intel/

More importantly, you also need an HDR monitor with a true 1000 nits, which costs money.
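
To put that in perspective, here is a small Python sketch (my own back-of-the-envelope illustration, nothing Vegas-specific) using the SMPTE ST 2084 (PQ) inverse EOTF to show how much of the HDR10 signal range a 300-nit panel versus a 1000-nit panel can actually reproduce:

```python
def pq_inverse_eotf(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: absolute luminance (cd/m2) -> normalized PQ signal [0, 1]."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = nits / 10000.0  # PQ is defined against a 10,000-nit reference
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for peak in (300, 1000, 10000):
    print(f"{peak:>5} nits -> {pq_inverse_eotf(peak):.1%} of the PQ signal range")
# roughly: 300 nits ~ 62%, 1000 nits ~ 75%, 10000 nits = 100%
```

Everything above roughly 62% of the signal simply clips on a 300-nit panel, which is why a true 1000-nit monitor matters for grading HDR.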

For more details, see my signature. The desktop has a 3080 Ti with 12 GB of RAM, and in the laptop I use a 3070 Ti with 8 GB of RAM.

Last changed by Wolfgang S. on 5/11/2023, 7:33 AM, changed a total of 2 times.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb Lan * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with i-GPU Iris XE * 32 GB RAM * GeForce RTX 3070 Ti 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

Yelandkeil wrote on 5/12/2023, 1:38 AM

I'd be curious what the difference is between a Radeon Pro and a "regular" Radeon card.

A regular AMD card is tuned for peak performance, as in gaming.
A Pro AMD card is tuned for steady, sustained performance for professional/industrial work: it can drive at least 6-8 4K monitors, roughly double a normal card (which maxes out at four 4K monitors or one 8K monitor), and its 10-bit color output is always available, because professional work requires that.

Apropos peak versus sustained values: my 300 cd/m² HDR monitor claims 1500 nits peak, but a mismatched spec like that only makes for a compromised HDR experience.

Last changed by Yelandkeil on 5/12/2023, 1:47 AM, changed a total of 1 times.

-- Hard&Software for 5.1RealHDR10 --

ASUS TUF Gaming B550plus BIOS3202: 
*Thermaltake TOUGHPOWER GF1 850W 
*ADATA XPG GAMMIX S11PRO; 512GB/sys, 2TB/data 
*G.SKILL F4-3200C16Q-64GFX 
*AMD Ryzen9 5950x + LiquidFreezer II-240 
*XFX Speedster-MERC319-RX6900XT <-AdrenalinEdition 24.12.1
Windows11Pro: 24H2-26100.3915; Direct3D: 9.17.11.0272

Samsung 2xLU28R55 HDR10 (300CD/m², 1499Nits/peak) ->2xDPort
ROCCAT Kave 5.1Headset/Mic ->Analog (AAFOptimusPack 6.0.9403.1)
LG DSP7 Surround 5.1Soundbar ->TOSLINK

DC-GH6/H-FS12060E_HLG4k120p: WB=manual, Shutter=125, ISO=auto/manual
HERO5_ProtuneFlat2.7k60pLinear: WB=4800K, Shutter=auto, ISO=800

VEGASPro22 + XMediaRecode/Handbrake + DVDArchi7 
AcidPro10 + SoundForgePro14.0.065 + SpectraLayersPro7 
K-LitecodecPack17.8.0 (MPC Video Renderer for HDR10-Videoplayback on PC) 

Reyfox wrote on 5/12/2023, 4:50 AM

@Wolfgang S. I don't work with HDR since I don't have a monitor for that. But what puzzles me is needing a Radeon Pro card for 32-bit projects. Why would that really be necessary? Can't the RX series graphics cards work with 32-bit video?


Wolfgang S. wrote on 5/12/2023, 5:02 AM

No, it is not at all necessary to stick to AMD GPUs for 32-bit projects. You can certainly use NVIDIA GPUs as well.

I would only recommend having at least 8 GB of RAM on the GPU, which is also the technical specification for 4K.

I'm not sure where the sweet spot is, given how quickly new GPUs are developed. A year and a half ago I invested in an NVIDIA RTX 3080 Ti with 12 GB of RAM, and in my laptop I also have an RTX 3070 Ti with 8 GB of RAM. Both systems work well for 32-bit projects as well as for HDR projects (both in Vegas and Resolve, and Resolve makes very good use of GPU RAM).

If I were investing today, I would tend to go even higher, given that KI requires more and may be used more and more in our NLEs.


vkmast wrote on 5/12/2023, 6:31 AM

KI = künstliche Intelligenz (Artificial Intelligence, AI). (Just in case someone was wondering.)

Wolfgang S. wrote on 5/12/2023, 6:42 AM

Thank you for this clarification - KI is the German term, and may not have been clear to non-German speakers.


GarrettMJKD wrote on 5/12/2023, 11:36 AM

I'd be curious what the difference is between a Radeon Pro and a "regular" Radeon card.

A regular AMD card is tuned for peak performance, as in gaming.
A Pro AMD card is tuned for steady, sustained performance for professional/industrial work: it can drive at least 6-8 4K monitors, roughly double a normal card (which maxes out at four 4K monitors or one 8K monitor), and its 10-bit color output is always available, because professional work requires that.

Apropos peak versus sustained values: my 300 cd/m² HDR monitor claims 1500 nits peak, but a mismatched spec like that only makes for a compromised HDR experience.

Now we're getting somewhere!

I had suspected it might be something along these lines. More "stable" 10 and 12 bit handling it seems?

Wolfgang S. wrote on 5/12/2023, 12:25 PM

I had suspected it might be something along these lines. More "stable" 10 and 12 bit handling it seems?

Hmm, one possibility is to compare the Radeon Pro with the regular Radeon card. The second possibility is to compare the Radeon Pro with an NVIDIA card.

And here it becomes tricky, simply because very few of us have a Radeon Pro in a similar system (i.e., the same system) as an RTX card, so it is hard to perform a comparison.

My personal impression is that the NVIDIA drivers seem to be better than the AMD drivers. True? Maybe, maybe not. There was a time when developers recommended AMD, and then came a time when developers recommended NVIDIA.

However you assess that, I would not think that 10-bit handling (forget 12-bit in the preview - or do you have a 12-bit monitor?) is better with AMD cards. I assume it is better with NVIDIA, since I think their drivers are more robust (though on the other hand we have had issues with the latest NVIDIA drivers and Vegas, too), and they seem to spend more money on driver development (judging by the number of new NVIDIA drivers published).

So my personal choice was NVIDIA, but also (if possible) a processor with an integrated GPU (which I think is even more important). Vegas works with both NVIDIA and AMD.


Besides all of that, I think it would be even more important to restore the 10-bit preview for Blackmagic I/O devices - the UltraStudio 4K Pro and the DeckLink 4K Extreme (and the smaller cards too). While that seems to work for SDR, it no longer works for HDR in Vegas. That is a problem, because only these I/O devices can keep the GPU and the OS from affecting preview quality. And this matters, because only with such a unit can you use a calibrated monitor properly (the alternative in Vegas today is to use the GPU for the preview, in which case you have to profile your monitor - which can certainly be done too, and is the cheaper solution, but less professional).

I mention this because it is the most important point for any professional grading - and the 32-bit workflow, based on ACES, is what we are talking about here.

If that is too complicated, forget it! :)


Last changed by Wolfgang S. on 5/12/2023, 12:31 PM, changed a total of 2 times.


GarrettMJKD wrote on 5/12/2023, 10:20 PM


Complicated isn't the issue.

Especially when other NLEs don't seem to have this odd "requirement."

Nor was the question about choosing one brand over the other, so much as why the asterisk is there for the one.

I could certainly see drivers being an issue, especially for AMD/Radeon at one point - a little "less care and craft??" in the game-targeted drivers versus their pro drivers. The problem is, when the same chip is used in both the "Pro" and "Gamer" variants, that becomes questionable.

Add in how Nvidia does the same - sharing the same chip between the GTX/RTX and Quadro/A-series/Tesla lines, while being quite open about gimping drivers on both sides as well as physically chopping off bits of the chip depending on the target audience - and it becomes hard to swallow in either direction.

It's not like Radeons aren't being used in professional settings. Those software houses were on stage with them for the last few shows lol.

It seems more to me that Magix has some work ahead of them, as not every vendor requires Pro variants on the Radeon side for 32-bit projects.

Last changed by GarrettMJKD on 5/12/2023, 10:37 PM, changed a total of 1 times.

Dominions:

  • i7-4930K
  • Dual Radeon VIIs
  • 32GB Corsair Vengeance Pro
  • Asus ProArt PS278Q x2 + PA329CV

Vegas Pro, Premiere Pro, After Effects, DaVinci Resolve, Studio One

RogerS wrote on 5/12/2023, 10:45 PM

I don't see these as hard-and-fast requirements - while the title says "requirements," beneath it they are characterized as "recommendations," and they may reflect the environment the software has been tested in.

For NVIDIA there were actually problems with Quadro vs. RTX in VEGAS in past years, and no real benefit from the pro cards anyway.

Yelandkeil wrote on 5/13/2023, 1:00 AM

So at heart, I've always tended to be an all-around techie, even before diving into film school and IT.

I've been hoping to get a better grasp on why Vegas requires, on the AMD side, a Radeon Pro card for 32-bit/HDR projects ...

I'm no pro either. But:

  1. On graphics cards that use the same chip, this or that notable function gets deactivated on the normal models; it's a commercial strategy - in short, money.
  2. "32-bit" is short for 32-bit floating point; see Wikipedia for the details.
  3. An open ACES/HDR10 project based on 32-bit floating point, or a pure 32-bit floating point project, demands much stronger hardware because of all that floating-point calculation (see the sketch below this list)...
  4. As for AMD cards, they are currently concentrating on merging the old and the new cards into one general driver, and that driver has changed in some critical areas that VEGAS has not accounted for to this day.
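
To put point 3 into rough numbers, here is a minimal Python sketch (my own back-of-the-envelope figures; the 4-channel RGBA layout is only an assumption for illustration, not necessarily how VEGAS stores frames internally) comparing one UHD frame in an 8-bit integer pipeline versus a 32-bit float pipeline:

```python
# Back-of-the-envelope: memory for a single UHD frame, 8-bit integer vs. 32-bit float.
WIDTH, HEIGHT, CHANNELS = 3840, 2160, 4   # assume 4 channels (RGBA) for illustration
FPS = 60

bytes_8bit  = WIDTH * HEIGHT * CHANNELS * 1   # 1 byte per channel
bytes_float = WIDTH * HEIGHT * CHANNELS * 4   # 4 bytes per channel (float32)

MiB, GiB = 1024 ** 2, 1024 ** 3
print(f"8-bit frame : {bytes_8bit / MiB:6.1f} MiB ({bytes_8bit * FPS / GiB:.1f} GiB/s at {FPS} fps)")
print(f"float frame : {bytes_float / MiB:6.1f} MiB ({bytes_float * FPS / GiB:.1f} GiB/s at {FPS} fps)")
# roughly 31.6 MiB vs. 126.6 MiB per frame, i.e. ~1.9 vs. ~7.4 GiB/s of raw pixel traffic
```

Four times the data per pixel to move and compute on is the main reason 32-bit float projects feel so much heavier than 8-bit ones.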


Yelandkeil wrote on 5/13/2023, 1:09 AM

@GarrettMJKD please fill your hardware info into your signature so that we can have some concrete discussion.
And set the bullsh*t theory aside.


GarrettMJKD wrote on 5/13/2023, 1:40 AM

@GarrettMJKD please fill your hardware info into your signature so that we can have some concrete discussion.
And set the bullsh*t theory aside.

Well, I suppose it's the most I can hope for.....

Yelandkeil wrote on 5/13/2023, 1:59 AM

No intention to offend - what do you think of your hardware?


Yelandkeil wrote on 5/13/2023, 2:51 AM

If you have no plans to update your hardware in the near future, then concretely, for VEGAS:

  • Stay in the 8-bit full range environment;
  • Stick to Full HD projects, 2K at most;
  • Avoid 4K source material, especially 10-bit 4:2:2 HEVC footage;
  • Abandon any attempt at 32-bit floating point.


GarrettMJKD wrote on 5/13/2023, 6:49 AM

No intention to offend - what do you think of your hardware?

While my inquiry was more out of curiosity (even though I am in the midst of collecting parts), I'm far from new to this stuff.

I edit 4K on the regular, no problem.

I've even dabbled in HDR a few times, for clients as well as for myself. That was painful, though I am missing 2 cores.... The Radeon VII is literally a rebadged MI50 accelerator, so it's something of a compute monster, and it carries the weight those 2 missing cores were meant for in spades.

And 32-bit floating point projects aren't exactly a stretch either.

fr0sty wrote on 5/13/2023, 8:32 AM

The best thing to do is to edit in 8-bit, then switch to 32-bit before you color grade and render. This is how VEGAS recommends doing it for best performance when editing 10-bit HDR content.
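
To see why the grade itself still benefits from switching to 32-bit float, here is a generic NumPy sketch (an illustration of integer quantization, not anything about VEGAS internals): push a gradient down two stops and back up, once re-quantizing to 8-bit in between and once staying in float.

```python
import numpy as np

# A ramp of all 256 possible 8-bit code values (stand-in for a smooth gradient).
ramp = np.arange(256, dtype=np.uint8)

# 8-bit integer pipeline: darken two stops, re-quantize, then brighten back.
dark_8bit = np.clip(np.round(ramp * 0.25), 0, 255).astype(np.uint8)
back_8bit = np.clip(np.round(dark_8bit * 4.0), 0, 255).astype(np.uint8)

# 32-bit float pipeline: same operations, quantized only once at the very end.
ramp_f = ramp.astype(np.float32) / 255.0
back_f = np.clip(np.round(ramp_f * 0.25 * 4.0 * 255.0), 0, 255).astype(np.uint8)

print("distinct levels, 8-bit pipeline:", len(np.unique(back_8bit)))  # ~65 -> visible banding
print("distinct levels, float pipeline:", len(np.unique(back_f)))     # 256 -> gradient intact
```

The 8-bit round trip throws away roughly three quarters of the tonal steps, which is exactly the banding you avoid by doing the color pass in 32-bit float.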

As for Radeon Pro, I had a Radeon Pro VII and a Radeon VII in the exact same system for a while... zero performance difference in VEGAS. I then upgraded to an RTX 3090, and saw a modest improvement in performance in VEGAS... but nothing to write home about (mostly in plugins like Neat Video and timeline acceleration).

Last changed by fr0sty on 5/13/2023, 8:32 AM, changed a total of 1 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Wolfgang S. wrote on 5/13/2023, 1:44 PM

@GarrettMJKD
With a GPU monster like that, I'm not surprised that you can edit 4K footage. Where have you seen the constraints for HDR?


fr0sty wrote on 5/13/2023, 1:48 PM

I have a 3090 and still cannot smoothly play back 4K 10-bit 4:2:2 footage, so I don't doubt that one bit. That is why I use proxies when editing, and edit in 8-bit before coloring and rendering in 32-bit mode.
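
For a sense of scale, here is a rough Python sketch (my own figures, assuming 10-bit samples are held in 16-bit containers, as decoders typically do) of the decoded pixel data rates involved, which is part of why proxies help so much:

```python
# Rough decoded data rates: 4K 10-bit 4:2:2 source vs. a 1080p 8-bit 4:2:0 proxy.
def data_rate_gb_s(width, height, samples_per_pixel, bytes_per_sample, fps):
    return width * height * samples_per_pixel * bytes_per_sample * fps / 1e9

source = data_rate_gb_s(3840, 2160, 2.0, 2, 60)  # 4:2:2 -> 2 samples/pixel, 10-bit in 16-bit containers
proxy  = data_rate_gb_s(1920, 1080, 1.5, 1, 60)  # 4:2:0 -> 1.5 samples/pixel, 8-bit

print(f"4K 60p 10-bit 4:2:2  : ~{source:.1f} GB/s of decoded pixels")
print(f"1080p 60p 8-bit 4:2:0: ~{proxy:.2f} GB/s of decoded pixels")
# roughly 2.0 GB/s vs. 0.19 GB/s -- about a tenfold difference before any effects are applied
```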


Yelandkeil wrote on 5/13/2023, 2:28 PM

My standard project is 4K 60p.
In an 8-bit full range environment for cutting and editing, I have to be able to judge in the preview whether my footage is smooth or not, so that I can then add stabilization/motion blur etc. at the media level.

My final output is HDR10, with color grading done in the CGP (Color Grading Panel).

But with some GoPro 11 samples in the cut/edit stage, the timeline couldn't even play back as expected, so I gave up on buying that camera.


It's amazing that you edit 4K on the regular, @GarrettMJKD!
Could you share your project properties and your VEGAS/I/O settings?



GarrettMJKD wrote on 5/13/2023, 2:29 PM

@GarrettMJKD
With a GPU monster like that, I'm not surprised that you can edit 4K footage. Where have you seen the constraints for HDR?

Well, if we're talking about HDR in particular: outside of footage shot in ProRes or the odd high-bit-depth AVC (or creating HQ proxies to work with), I get a laggy timeline and, once in a blue moon, crashes on render - though that seems to be the case all around, even outside of Vegas.

Yelandkeil wrote on 5/13/2023, 2:31 PM

My video on YouTube:
