So at heart I've always been an all-around techie, even before diving into film school and IT.
I've been hoping to get a better grasp on why Vegas requires a Radeon Pro card on the AMD side for 32-bit/HDR projects, when it's been a fairly open fact that the consumer and professional chips tend to be the same silicon. Nvidia, for instance, has been found to fuse off parts of the very same die to differentiate its consumer RTX cards from its professional A-series lineup, on top of software-level gimping.