best graphics card options for playback

Comments

BONLEV wrote on 1/22/2024, 11:12 AM

@John I forgot to mention these are all 50p media.

DESKTOP:

Intel Core i7-14700K

MSI Z790 Gaming Pro

Lexar 610 2TB SSD

Corsair 2x 32GB DDR5-5600

Sapphire AMD Radeon RX 7800 XT (16GB)

CAMERA:

Sony A7 IV, Sony PXW-Z280, Sony FX6, Sony FS100

Panasonic GH5S, Panasonic DVX200, Panasonic HC-X920

GoPro HERO7 Black

DJI Mini 3 Pro

VEGAS Pro 21 (build 208)

Howard-Vigorita wrote on 1/22/2024, 12:08 PM

This is an old thread from 2020. Subsequent posts indicate the OP ended up getting an AMD 6900 XT with an Arc A770 for decoding. I have that same combination myself. When display performance is my paramount consideration, I get the smoothest, most glassy visuals doing multicam cuts between cameras displayed simultaneously if my monitor is plugged into the Arc, which I temporarily select as the main GPU in video prefs. But that's without any FX of any kind.

After camera cuts are done, with only crossfades between clips, FX and grading come into play. I then switch my monitor and main GPU back to the 6900 XT and keep it that way through final renders because the performance doing everything is so much higher. The less parallel timeline seems much easier for the AMD to handle smoothly, even with heavy FX processing. At that point the Arc is relegated to decoding and QSV rendering.

I wonder if an Nvidia 4090 in place of my AMD might obviate my needing to switch GPUs for multicam... but it's significantly more expensive, harder to fit into a case, throws off a lot more heat, and requires a bigger power supply. The 3080 and newer Nvidias, btw, were tops in the AI Torture Test results.
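If you want to sanity-check raw hardware decode throughput outside of Vegas, a rough sketch like the one below times a decode-only pass on the QSV path versus the default D3D11VA adapter. It assumes ffmpeg is on the PATH and "clip.mp4" is just a placeholder for one of your HEVC camera files; adjust both to your setup.

# Rough sketch: time a decode-only pass of an HEVC clip on two hardware decode paths
# by calling ffmpeg from Python. "clip.mp4" is a placeholder for a real camera file.
import subprocess
import time
HWACCELS = {
    "QSV (Intel iGPU / Arc)": ["-hwaccel", "qsv"],
    "D3D11VA (default Windows adapter)": ["-hwaccel", "d3d11va"],
}
def time_decode(label, hwaccel_args, clip="clip.mp4"):
    # Decode to a null sink so only decode speed is measured, not encoding or disk writes.
    cmd = ["ffmpeg", "-v", "error", *hwaccel_args, "-i", clip, "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    print(f"{label}: {time.perf_counter() - start:.1f} s")
for label, args in HWACCELS.items():
    time_decode(label, args)

Note that D3D11VA lands on whichever adapter Windows treats as the default, so which card it actually exercises depends on how the monitors and main GPU are set up at the time.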

Former user wrote on 1/22/2024, 6:31 PM

I wonder if an Nvidia 4090 in place of my AMD might obviate my needing to switch GPUs for multicam... but it's significantly more expensive, harder to fit into a case, throws off a lot more heat, and requires a bigger power supply.

@Howard-Vigorita Outside the USA, where prices are more flexible (if a card isn't selling, prices go down, possibly rebated by Nvidia, the distributor, or the card manufacturer), you can buy a 4080 for about US$40 more than the upcoming 16GB 4070 Ti Super's MSRP, and 4080 prices may continue to fall. The US seems to be a different market where prices stick close to MSRP or even go above it. The 4090 is expensive everywhere due to the China export ban and server racks of 4090s being used for AI.

There are many 4080s that are between 2.5 and 3 slots, and some 4070 Tis that are 2 slots. My 3080 is 2.5 slots. The problem is that most 4070 Ti, 4080, and 4090 cards are big in all dimensions.

Howard-Vigorita wrote on 1/23/2024, 8:39 AM

@Former user I'm interested in this one at the moment. It's relatively compact and the price isn't too bad, likely because something newer is about to drop. I'll probably be retiring my old Xeon this summer and will have to make some tough replacement choices.

BONLEV wrote on 1/23/2024, 11:25 AM

This is an old thread from 2020. Subsequent posts indicate the OP ended up getting an AMD 6900 XT with an Arc A770 for decoding. I have that same combination myself. ... I then switch my monitor and main GPU back to the 6900 XT and keep it that way through final renders because the performance doing everything is so much higher. ... At that point the Arc is relegated to decoding and QSV rendering.

You have a 6900 XT and I have a 7800 XT; both are about equal, from what I read. But you switch to another GPU, the Arc A770, for decoding? Because my 14700K has an integrated GPU for decoding. I have 2 monitors. I hope I made the right choice.


Howard-Vigorita wrote on 1/23/2024, 1:29 PM

... you switch to another GPU, the Arc A770, for decoding? Because my 14700K has an integrated GPU for decoding. I have 2 monitors. I hope I made the right choice.

@BONLEV Yes, only because I got poorer HEVC decoding performance from the 11900K/UHD 750 using VP19 legacy HEVC decoding than from my 9900K/UHD 630. The 14700K/UHD 770 is a more efficient implementation, and VP21 now also has faster HEVC decoding, so things should be way better. And the Arc should make a smaller performance difference unless they soup that up too... but the new model is not out yet.

Btw, here's a link to the AI-FX benchmarks. Your setup would use the 7800 XT for AI and media generation. I'm curious how it in fact compares to the 6900 XT on those things, not to mention the 4090, which is the leader.

Former user wrote on 1/23/2024, 10:49 PM

@Howard-Vigorita That is a nice card, and a common-sense approach too: keep the majority of the GPU heat out of the case. It's a 2-slot card, so no problem with blocking slots. It also seems like nice silicon, able to boost higher than many other 4090s while staying under 450 watts.
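If you want to verify that kind of behavior yourself during a render, something like the sketch below (just an idea, assuming an Nvidia card with nvidia-smi on the PATH) polls board power, graphics clock, and temperature once a second so you can watch whether the card really stays under its power limit.

# Rough sketch: poll power draw, graphics clock, and GPU temperature once per second
# via nvidia-smi while a render is running. Stop it with Ctrl+C.
import subprocess
import time
FIELDS = "power.draw,clocks.gr,temperature.gpu"
while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "431.20 W, 2640 MHz, 67"
    time.sleep(1)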