Monitoring 4k edits: 4k TV or PC monitor?

megabit wrote on 3/2/2015, 2:34 AM
As some of my friends here on this forum know, due to my disability I had to quit using my 2 HD cameras (I'm in the process of selling them). I've just turned 60, so I'm thinking about getting myself a future-proof (yes, 4K) camera instead - for those occasional beauty shots that wouldn't require carrying it around for long, but would make it possible to continue the "serious hobby" that videography has been in my life...

Of course I'll also need a faster computer to edit 4K, and above all - a 4K monitor. And here is my question: I've always used Vegas in a 3-monitor configuration:

- the main monitor with the program interface and Preview Monitor window
- the 2nd monitor (full screen, configured as the Windows Secondary Display in Vegas) for my MC (multicam) display; 20" has been just enough to enlarge the tiny preview windows so that the action in all 6-10 cameras was visible clearly enough for cutting (take selection)
- the 50" plasma HDTV hanging on the wall in front of and above me, angled slightly down, so that I could see the final HD output with FXes added, for grading etc. The plasma also made it possible to assess the rendered output of my projects and inspect it closely - the viewing distance small enough for true pixel-peeping :) This short viewing distance becomes even more of an argument when it comes to 4K.

Now my main question: assuming I want to keep the price down as much as possible without giving up too much functionality - which would be the better buy for my 4K editing:

- one of those (now not so expensive) 4k LED TVs (like the Samsung UE50HU6900), which could replace the current HD plasma on the wall, or

- one of the 4K monitors, which in this price range usually top out at 32" (at 4K, not enough for pixel-peeping :))?

Interestingly, 4K monitors seem more expensive than 4K TVs - even though in some respects they are less feature-rich (e.g. they usually offer just a 60 Hz refresh rate, while the TVs are rated at 2000 Hz). Yes, I do realize the quoted 2000 Hz is a marketing slogan for some "motion fluency" gimmicks, but no matter what - if I were after a 50" 4K monitor (the size of the Samsung model mentioned above), it would cost many times more than the TV!

So I do realize that for editing, all those gimmicks consumer products like 4K TVs offer are unnecessary (or worse - unwelcome), but - just as I have been doing with my HD plasma so far - I hope they can simply be switched off. So from the price viewpoint the choice is obvious - a TV set like the one I mentioned is the way to go... Or is it, really? Please name a single reason (or several) why I should buy a 4K PC monitor rather than a TV, and let's discuss :)

Piotr

PS. Disclaimer: please keep in mind that - considering my personal situation - price is unfortunately a very important factor in this dilemma; otherwise it's obvious that a professional monitor - not a consumer TV set - would be the way to go...

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

megabit wrote on 3/3/2015, 12:52 AM
Anyone, please? I really need advice :)

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

farss wrote on 3/3/2015, 3:30 AM
You seem to have answered your question in your own post :)

4K is spec'd by Rec. 2020, which calls for 10+ bit video at 100 Hz, amongst other things. The only monitors that'll display 10+ bits would be OLED, and the cost of a 4K OLED monitor would be staggering, if such a beast even existed.
So if you really want a 4K display that makes any sense, then all you can afford is a QHDTV.

How are you going to get QHD to the TV, though?

Bob.
megabit wrote on 3/3/2015, 8:36 AM
Good question, Bob - I'm not good with all those HDMI specs, so I have no idea what type of graphics card, as a minimum, should be used on the computer end to get 4K (and QHDTVs seem to offer only HDMI, in various flavors).

On the other hand, DisplayPort 1.2 is enough to send 4k@60Hz to a 4k monitor, and all of them offer DisplayPort 1.2 connectivity (as do graphics cards).
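
(For reference, the arithmetic behind why DP 1.2 suffices - a rough back-of-envelope sketch in Python, using the published link rates and ignoring blanking overhead, so the real requirement runs a bit higher:)

```python
# Back-of-envelope link budget for 3840x2160 @ 60 Hz, 8-bit RGB (4:4:4).
# Blanking intervals are ignored, so the real requirement is somewhat higher.
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24

required = width * height * fps * bits_per_pixel / 1e9
print(f"pixel data needed: {required:.1f} Gbit/s")      # ~11.9 Gbit/s

# DisplayPort 1.2 (HBR2): 4 lanes x 5.4 Gbit/s, 8b/10b coding -> 80% payload
dp12_payload = 4 * 5.4 * 0.8
print(f"DP 1.2 payload:    {dp12_payload:.2f} Gbit/s")  # 17.28 - plenty
```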

So this is now my main question: how do I send 4k to a QHDTV?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

wwjd wrote on 3/3/2015, 8:57 AM
I went 4K monitor because A) I almost NEVER watch any "TV" programming, and B) it quadrupled my video editing space in Vegas. I tell ya, Vegas with 4K of landscape is a thing of beauty! And I am not dealing with any of the LAG associated with a TV's processing of almost every signal fed to it.

and C) I game on it sometimes too.

I'll get a large 4K TV someday - prices are dropping - but my big HD TV still looks great for reviewing color, sound, and flow. Not worried about 4K detail at that stage, because I can see that on the 4K monitor.

my 5 cents
farss wrote on 3/3/2015, 1:37 PM
HDMI 2.0 supports 4K.

Our one and only 4K monitor, a BMD, requires 12G SDI; it doesn't support the dual-link SDI which the F5 outputs, so we've had to add an SDI "combiner" ahead of the monitor.
That said, the BMD 4K monitor is really only a confidence monitor.

Bob.
megabit wrote on 3/11/2015, 6:18 AM
Well - the current Maxwell series of consumer nVidia cards (GeForce) delivers it all:

Dual-link DVI-I, HDMI 2.0, 3x DisplayPort 1.2 - meaning you can display 4K@60 Hz in full 4:4:4 color resolution on a QHD/UHD monitor using DP 1.2, and/or on the current QHDTVs using HDMI 2.0.

All that is on the more expensive side, the extra caveat being that Maxwell is not as good for CUDA acceleration of Vegas as its predecessor from two generations back - the Fermi architecture. And it looks like owners of the good old Fermi cards (like the GTX 580 I have, which is a really competent Vegas Pro accelerator) are out of luck; the basic support for feeding 4K (at 4:2:0 ONLY - see http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of ) over HDMI 1.4 is only possible, with the proper driver, on a GeForce card of the in-between architecture, i.e. Kepler...
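
(That AnandTech piece boils down to simple link arithmetic; a rough sketch of it below - same caveats as the DP numbers earlier in the thread, blanking ignored and TMDS rates from the published specs:)

```python
# Why 4K60 over HDMI 1.4 only works at 4:2:0 (rough numbers, blanking ignored).
pixels_per_second = 3840 * 2160 * 60

bpp_444 = 24  # 8-bit RGB / YCbCr 4:4:4
bpp_420 = 12  # 4:2:0 subsampling halves the average bits per pixel

hdmi14_payload = 10.2 * 0.8  # 10.2 Gbit/s TMDS, 8b/10b coding -> ~8.16
hdmi20_payload = 18.0 * 0.8  # 18.0 Gbit/s TMDS                -> ~14.4

print(pixels_per_second * bpp_444 / 1e9)  # ~11.9 Gbit/s: too much for HDMI 1.4
print(pixels_per_second * bpp_420 / 1e9)  # ~6.0 Gbit/s: fits, hence the "hack"
```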

Given that, I'm considering leaving my 580 in place as an accelerator and adding the cheapest model of the current Quadro line - the K420. It would be fine for full 4K on a 4K PC monitor using DisplayPort 1.2 - but does anyone know whether its DVI-I port can drive HDMI at at least the 1.4 level, like the Kepler GeForce cards do? Or will it be HDMI 2.0 compatible (it's Maxwell, after all)?

If you know, please chime in and comment - Quadro being the professional division of nVidia, they won't even mention HDMI in the specs... TIA

Piotr

Edit: Unfortunately, I read that the "budget Quadro" way - using the entry-level K420 card - only offers 2560x1600 @ 60 Hz through DVI-I. So no HDMI 2.0 capabilities, and - even if it's probably fully HDMI 1.4-compatible - no 4K 4:2:0 hack either (Quadro being a pro line, it most probably won't allow compromising hacks like those the GeForce allows). The bottom line being that - in order to make your PC compatible with either a 4K PC monitor (via DP 1.2) or a QHDTV (via HDMI 2.0) - you must invest in a Maxwell GeForce card... :(

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

wwjd wrote on 3/11/2015, 9:53 AM
I'm using DisplayPort from my nVidia Titan for 4K 60 Hz.
OldSmoke wrote on 3/11/2015, 11:00 AM
How about an R9 290/290X? Initially I was reluctant to switch from nVidia to AMD, but I am very happy with my cards now, and 4K is also supported.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

megabit wrote on 3/11/2015, 12:45 PM
Does the R9 290 support 4K through DisplayPort, or HDMI 2.0, or both?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/13/2015, 1:42 PM
Well, I checked, and unfortunately the R9 290X would only help with a 4K monitor, as its HDMI is not yet 2.0 (so no QHDTV connectivity).

That said - and considering I'm seriously thinking of hopping onto the 4K wagon in the cheapest way possible - I wonder what you guys think of the following plan (please keep in mind I only want 4K for a strictly hobbyist application - single camera, no MC edits, no compositing - just the ability to produce some beauty shots of landscapes and the like). The details of my plan are as follows:

- the camera: an AX100, which already has the 100 Mbps XAVC S 4K codec (no 50p is not a problem - I'm after semi-static, nostalgic shots with lots of detail, accompanied by my self-created soundtracks); a great "video-retirement" activity for a 60-year-old, physically disabled person still professionally active in a totally different field...

- the editing PC: the most problematic part of the plan, as I'm not in a position for big investments here. Therefore, I'm considering keeping System #1 (as per my profile details here), with the Fermi GTX 580 card replaced by a pair of R9 290Xes driving a 28-32" 4K monitor.

Now: I tested AX100 25p 4K clips on this very system, and with the GTX 580 I'm getting a solid 18 fps at Full/Best. Hence the most important question, for the GPU-for-Vegas gurus on this forum:

How much do you think replacing the 580 with a pair of R9s would improve my preview experience on this machine? Is there a chance it could be elevated to a full 25 fps at Full/Best?

I am thinking of just several "projects" of this sort per year, so rendering time is not a problem at all.

What do you think? Should the plan succeed, my Fermi GTX 580 (the Gigabyte model with 3 GB VRAM) will be up for sale for those looking for this sort of GPU to accelerate their Vegas renders :)

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/14/2015, 4:50 AM
I could edit my previous post to add this, but adding a new post will push the thread higher on the forum list, so catching the attention of someone who could answer my questions (or just express an opinion on the idea) will be more probable :)

So as I said - in my current situation, having spent the necessary cash on the cheapest good 4K camera will make replacing my current i7 2600K-based editing system rather impossible, so I'm trying to make it workable for basic edits of the 25p XAVC S 4K material from the AX100 camera...

Another possibility is to leave the GTX 580 in this system (after all, the common consensus on this forum is that, being the high-end Fermi model, this card is probably the best for CUDA acceleration in VP13) and only add the cheapest possible card that could drive a 4K monitor @ 60 Hz. One such card is the low-end K420 from the current Quadro line (with DP 1.2 providing full 4K resolution). But of course this solution would not increase the preview speeds I'm getting now when testing AX100 clips (ca. 18 fps at Full/Best), so I'd either have to live with that, or use proxies to elevate preview speeds...

Which of the 2 proposed solutions would you consider the better one:

Option 1: replacing the GTX 580 with 2x R9 290X, which would not only give me 4K connectivity via DisplayPort 1.2 but also probably increase preview speeds a little (Q: by how much?)

Option 2: adding the cheap K420 Quadro card, with the same 4K connectivity (i.e. DP 1.2) to a 4K monitor, and living without full preview speeds (or working with proxies)

Would appreciate everyone's opinion; cheers

Piotr

PS. It's obvious that Option 1 is considerably more expensive, so I'd only go with it if you confirmed that the preview speed gain would be high enough to justify the cost involved...

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Wolfgang S. wrote on 3/14/2015, 6:03 AM
I still do not understand why you want to drive the UHD monitor from the DVI-I on the Quadro K420. Why not use the DisplayPort to connect the UHD monitor to the Quadro?

The Quadro would be an asset if you wish to edit S3D with Vegas, or if you wish to use a 10-bit monitor. Given that you use footage from the AX100, which is 8-bit only, maybe you do not want to go for 10-bit anyway. So in your case - why go for a Quadro at all?

The R9 290X has the potential to give you an 8-bit preview in UHD using DisplayPort 1.2, AND this card supports the current Vegas version much better. So looking at preview capabilities, the R9 290X is maybe the better solution.

Desktop: PC AMD 3960X, 24x3,8 Mhz * RTX 3080 Ti (12 GB)* Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb Lan * Resolve Studio 18 * Edius X* Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with i-GPU Iris XE * 32 GB Ram) * Geforce RTX 3070 TI 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

megabit wrote on 3/14/2015, 7:46 AM
Thanks Wolfgang - but you misunderstood my English :) It's exactly the DisplayPort 1.2 connection, with its 4K@60 Hz capability, that is the reason for considering Option 2 (just adding a cheap Quadro K420). But since I don't think it would be faster at accelerating Vegas than the GTX 580 I have, I treat it only as a means of actually connecting a 4K monitor, which my GTX 580 can NOT drive at 4K.

So with that option (hopefully) explained, how about the other option - replacing the GTX 580 with (perhaps two) R9s? It being substantially more expensive, by how much do you think it would improve on the 18 fps at Best/Full I'm getting out of the 580 now (with XAVC S 25p)?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Wolfgang S. wrote on 3/14/2015, 8:57 AM
I am not sure if I understand you right.

As far as I understand the specification of the Quadro K420, the situation is as follows:

"CUDA Cores 192
Graphics APIs Shader Model 5.0, OpenGL 4.4, DirectX 11"

http://www.pny.com/nvidia_quadro_k420

I have here the older Quadro 2000D - also with 192 CUDA cores, and with OpenGL 4.2. I can tell you that my 2000D does not really support Vegas playback in a great way. So I also use a GTX 570 in my system - and that is fine.

So, I would expect that you will not see huge support for the preview capabilities with the K420. The only reason to take that card would be to get the UHD resolution.

"Maximum DP 1.2 Resolution 3840 x 2160 at 60Hz (direct connect)
Maximum DVI-I DL Resolution 2560 x 1600 at 60Hz
Maximum DVI-I SL Resolution 1920 x 1200 at 60Hz
Maximum VGA Resolution 2048 x 1536 at 85Hz"

And as for resolution: UHD will work only when you use DP 1.2.


The other option: for preview reasons - and if I need only 8-bit - I would go for the R9 290X. There you have the UHD resolution, but also support for the preview.

Desktop: PC AMD 3960X, 24x3,8 Mhz * RTX 3080 Ti (12 GB)* Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb Lan * Resolve Studio 18 * Edius X* Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with i-GPU Iris XE * 32 GB Ram) * Geforce RTX 3070 TI 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

OldSmoke wrote on 3/14/2015, 8:59 AM
Piotr

Just a quick one, because I am typing on my phone. Only socket 2011 and higher, or 1366, will give you two full PCIe x16 slots. On a system like yours, putting in a second card will reduce both slots to PCIe x8.

Aside from that, yes, the R9 290/290X will improve your timeline performance quite a bit. Nevertheless, working myself with native AX100 files at 30p on an OC'd 3930K and SSDs, it's still not smooth. It works OK with straight cuts, but add a simple crossfade and the fps drops. As such, I use proxies, which isn't bad at all and something you might want to consider. I use two 27" Dell Ultrasharp 2560x1440 monitors and that works really well. I would spend more money on a good monitor, one with good color representation, and not worry too much about preview in 4K; at least not now.
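
(For what it's worth, a minimal sketch of the kind of proxy workflow I mean, done outside Vegas with ffmpeg - the paths and settings below are illustrative assumptions only, and if I remember right VP13's own "Create Video Proxy" command on the project media does a similar job:)

```python
# proxy_batch.py - illustrative external proxy generator (assumes ffmpeg is
# on the PATH; directory names are made up for the example).
import subprocess
from pathlib import Path

SOURCE_DIR = Path("D:/AX100/4K")      # hypothetical location of the 4K clips
PROXY_DIR = SOURCE_DIR / "proxies"
PROXY_DIR.mkdir(exist_ok=True)

for clip in SOURCE_DIR.glob("*.MP4"):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=1280:720",      # quarter-ish resolution previews smoothly
        "-c:v", "libx264", "-preset", "fast", "-crf", "20",
        "-c:a", "copy",               # leave the audio untouched
        str(PROXY_DIR / clip.name),
    ], check=True)
```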

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

megabit wrote on 3/14/2015, 9:41 AM
This is very sound advice, OldSmoke - thanks. A proper 4K monitor will stay with me longer, while my PC will need replacement sooner rather than later anyway.

So if only my GTX 580 offered some 4K monitor connectivity, I'd leave the PC "as is" for a while and use proxies for a better preview experience. But since it doesn't, I need to replace it with some newer-generation card anyway - and your making me aware that an additional card would cause both slots to fall back to only x8 PCIe speed is the most important thing here - thanks again.

I guess I've made my decision: I will buy the cheapest Quadro with DP 1.2 (which happens to be the K420) and use proxies, as it's going to be even slower for VP preview than my current GTX 580 - and the latter is going on sale NOW. So no GPU addition; replacement is the way to go with an old system like my PC...

So my friends: if anyone is interested in buying a perfect Gigabyte GTX 580 with 3 GB VRAM (GV-N580UD-3GI) - please PM me.

Piotr

PS. Sometimes chiming in, even from a smartphone, can really help a mate on a great forum like this :)

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

OldSmoke wrote on 3/14/2015, 10:09 AM
Piotr

This doesn't make any sense. The K420 will not be able to handle 4K preview from the timeline, nor will it help with any GPU acceleration. You'd end up working with low-resolution proxies and previewing those - which isn't 4K.

The only way you might get 4K out of the PC is playing back the final rendered 4K file with VLC, or another player that can handle it, and outputting that to the TV.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

megabit wrote on 3/14/2015, 10:16 AM
I'm aware of all you said, mate.

But for the final 4K playback of my edited results (e.g. from the VLC you mention) I do need some means of hooking up a 4K monitor, right?

That I'm left with proxies for my edits in Vegas - until I'm in a position to purchase a new killer PC - is something I've sort of settled on already :)

So it all makes perfect sense, even if it is one heck of a compromise for the time being!

Piotr

PS. Unless you have a better idea, of course... I'm all ears, but - planning to spend cash on the cheapest yet competent 4K camera that the AX100 is, plus a (not necessarily the cheapest) 4K monitor - I cannot upgrade my editing system at the moment :(

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/14/2015, 10:53 AM
Oh yes - there is a better (though more expensive) option: replace the GTX 580 with something that has DP 1.2 but better Vegas acceleration capability than the K420...

And here Wolfgang's suggestion of the R9 290X comes to mind naturally, but - does it make sense to invest in a card significantly more expensive than the K420, one which may become obsolete by the time I'm ready for a completely new system? If I could test one in my current system and assess how much faster it is than the GTX 580... But I don't have such an opportunity, so again - a question to those who have actually tried many cards with VP 13:

- what is the most probable, expected fps increase with the 290X vs the 580, using the XAVC S codec as in AX100 4K 25p files? Perhaps it would be worthwhile after all...

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

OldSmoke wrote on 3/14/2015, 11:40 AM
Piotr

I used 2x GTX 580 with the AX100 files before I switched to my current 2x R9 290. The difference is noticeable even in normal preview with no FX applied, and even more so with FX. The R9 290 will serve you well for some time, and you will appreciate the additional performance when it comes to 3rd-party plug-ins, which mostly use OpenCL.

You may want to consider buying a used card off eBay. There are plenty to choose from; I went for the ASUS Radeon R9 290 4GB card because of its lower cost. I then upgraded them to water cooling because I like my PC to be as quiet as possible.

You may also consider keeping the GTX 580 and rather investing in a newer 6-core CPU; since you will be working with proxies anyway, the 580 will be sufficient - even an older 3930K or 4930K will serve you well.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Wolfgang S. wrote on 3/14/2015, 11:55 AM
Vegas still takes most of its preview performance from the CPU; from that side, a better CPU will add a lot of performance. The GPU helps with the preview of effects like color corrections. So a better CPU will help to improve the situation.

I would take the Quadro mainly for a 10-bit preview or for S3D - but not for preview performance.

Desktop: PC AMD 3960X, 24x3,8 Mhz * RTX 3080 Ti (12 GB)* Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb Lan * Resolve Studio 18 * Edius X* Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with i-GPU Iris XE * 32 GB Ram) * Geforce RTX 3070 TI 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

megabit wrote on 3/14/2015, 12:07 PM
Guys - my main problem with the current system is that, with LGA1155 on the mobo, I'm more or less stuck as far as a CPU upgrade is concerned. Hence the need for an entire-system upgrade.

I must look up whether or not my mobo indeed falls back to x8 PCIe speed with two x16 graphics cards (I'm not at the system at the moment, so I cannot just read the model name from HWiNFO) - but if it does, I'm even more hopelessly hosed :(

So the more I think of it, the fewer options I see. OldSmoke - how are you using your 2 graphics cards? Crossfire, I guess - so that Vegas sees a single GPU to use?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

OldSmoke wrote on 3/14/2015, 10:05 PM
No, I don't use them in Crossfire, nor did I use the Nvidia cards in SLI; Vegas is not designed for it. And you can trust me when I say that your MB cannot handle 2x PCIe x16, because the CPUs for that MB can't do it; they don't have enough PCIe lanes. Only sockets 2011, 2011-v3 and maybe 1366 have sufficient PCIe lanes. And don't buy a 5820K either; it is again limited (to 28 lanes).
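
(To put rough numbers on those lane limits - a sketch using per-lane figures from the PCIe specs; the socket lane counts are from memory:)

```python
# Why two GPUs on LGA1155 fall back to x8/x8: Sandy Bridge CPUs expose only
# 16 PCIe 2.0 lanes for graphics; LGA2011-class chips expose 40 lanes.
# Per-lane payload: PCIe 2.0 = 0.5 GB/s (5 GT/s, 8b/10b coding),
#                   PCIe 3.0 ~ 0.985 GB/s (8 GT/s, 128b/130b coding).
def slot_bandwidth_gb_s(lanes: int, gen: int) -> float:
    per_lane = {2: 0.5, 3: 0.985}
    return lanes * per_lane[gen]

print(slot_bandwidth_gb_s(16, 2))  # 8.0 GB/s   - single card at x16 (LGA1155)
print(slot_bandwidth_gb_s(8, 2))   # 4.0 GB/s   - each of two cards at x8
print(slot_bandwidth_gb_s(16, 3))  # ~15.8 GB/s - x16 slot on LGA2011-v3
```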

Vegas uses OpenCL for timeline preview, and a faster CPU alone will not help; been there, done that.

Proper 4K systems are still 6-9 months away, IMHO.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

megabit wrote on 3/15/2015, 3:16 AM
Yes you are correct about my mobo.

If you don't connect your graphics cards, how do you make Vegas use both? This is something I don't understand - you can only pick one of them in the Preferences?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)