Upgrading the editing system for 4K?

megabit wrote on 16.03.2015 at 12:25
I posted recently about my attempts to edit low-end 4K (XAVC-S 25p only, from the AX100 camera) on my i7-based system. As that seems dubious, I thought that once I manage to sell all the HD shooting rig I can no longer use due to spine problems, I'd be able to upgrade my system. So I'd like to seek your advice, guys: does it make sense to go all the way to an LGA2011-v3 based system with an 8-core i7 and DDR4 RAM, or to spend half of that on a 12-core Xeon on an LGA2011 board (I'd reuse the DDR3 RAM from my current system)?

I've always bought only the best stuff, but in my current situation that's not so easy anymore, and I'd like to go somewhat future-proof as well. So, what kind of 4K editing would the 8-core i7-5960X be enough for, and how far could I stretch the 12-core Xeon on the LGA2011 platform? For instance, would it still be good for 10-bit 4:2:0 XAVC in a few years' time?

Of course, either one would be accelerated with a nice AMD R9 GPU.

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

Steve Grisetti wrote on 16.03.2015 at 14:46
When it comes to technology hardware, I always say that there's no such thing as future-proofing -- because pretty much anything you buy today is going to cost half as much in six months.

XAVC-S should be editable in Vegas Pro 13 on pretty much any current quad-core i7. I sure wouldn't go nuts spending a fortune on cutting-edge technology. All you'll do is break your heart a year from now when you see that same hardware selling at Tiger Direct for under $1,000 after you spent $3,000 on it.

You really can't beat this game by rushing to the market, in my not-so-humble opinion. So unless you've got really deep pockets, I'd recommend buying the second-best technology out there -- not the hottest ticket.

Again, just my two cents.

This benchmark chart has been tremendously helpful to me.
http://www.cpubenchmark.net/high_end_cpus.html

I can't imagine spending $4,500 on a processor that will be selling for half that in a year -- and a quarter of that in two!

megabit wrote on 16.03.2015 at 14:55
You're generally right, but the thing is I have access to some very good bargains (especially on the 12-core Xeon of the second-latest generation)... Unfortunately it's not present in the chart you linked to (the E5-2696 v2, for $2,000).

PS: Let me put it this way: which will Vegas like more for preview speed (MUCH more important to me than render speed): 8 cores @ 3.0 GHz on the latest-and-greatest LGA2011-v3 X99 board, or 12 cores @ 2.5 GHz on the previous-generation platform?
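
One rough way to frame that question is Amdahl's law: the serial part of a preview pipeline scales only with clock speed, while the parallel part scales with clock speed times core count. A back-of-the-envelope sketch (the parallel fractions below are pure assumptions, not Vegas measurements):

[code]
# Amdahl's-law toy model: how might 8 cores @ 3.0 GHz compare with
# 12 cores @ 2.5 GHz, depending on how parallel the workload really is?
# The parallel fractions are assumptions, not measured Vegas behavior.

def relative_preview_speed(cores: int, ghz: float, parallel_fraction: float) -> float:
    serial = (1.0 - parallel_fraction) / ghz      # clock-bound part
    parallel = parallel_fraction / (ghz * cores)  # core-count-bound part
    return 1.0 / (serial + parallel)

for p in (0.50, 0.80, 0.95):
    s8 = relative_preview_speed(8, 3.0, p)    # i7-5960X-style config
    s12 = relative_preview_speed(12, 2.5, p)  # 12-core Xeon-style config
    print(f"parallel fraction {p:.0%}: 8c@3.0GHz = {s8:5.2f}, 12c@2.5GHz = {s12:5.2f}")
[/code]

In this toy model the higher-clocked 8-core comes out ahead unless the workload is almost entirely parallel.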

PPS: Oh, and there is plenty of future-proofing with the newest boards; there will be (in the case of LGA2011 and Xeon, there already are) plenty of CPU upgrade possibilities. Whereas with my current LGA1155 I'm in a one-way street already, the i7-4600K being the second-fastest processor it supports, plus I can't even add another graphics card without the board's PCIe link dropping to x8...

OldSmoke wrote on 16.03.2015 at 17:02
Read these threads:

http://www.sonycreativesoftware.com/forums/ShowMessage.asp?ForumID=4&MessageID=915077

and this

http://www.sonycreativesoftware.com/forums/ShowMessage.asp?ForumID=4&MessageID=910003

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
RAM: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800X @ 4.6GHz (custom water-cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" hot-swap bays, 1x 3.5" hot-swap bay, 1x LG Blu-ray burner
PSU: Corsair 1200W
Monitor: 2x Dell UltraSharp U2713HM (2560x1440)

megabit wrote on 16.03.2015 at 18:02
Thanks OldSmoke; so it seems there is a consensus on X99 with the i7-5960X being preferable over the lower-frequency, even if higher-core-count, Xeons... Of course I understand that getting two of those 16-core monsters would change the picture, but a decent 5960X system is already expensive enough for me to think twice... Also strange is the very low fps one of the posters got with the 5960X and a GTX 580 card; could it be a card/system "generation mismatch"?

Piotr

OldSmoke wrote on 16.03.2015 at 18:18
Piotr

I am also on the fence with this. A 5960X system is expensive by itself, and coming from a well-working 6-core 3930K I am not sure the extra money is worth it. I also haven't found anyone confirming that a 12- or 16-core Xeon at a lower clock is faster; I personally doubt it. I wouldn't even mind spending the money on a great dual-Xeon system if that gets me what I want. And what I want is simple: native XAVC-S 4K 30fps files previewing at Best/Full at 1080, 2160 and eventually 4K (whenever that will be).
The current 50Mbps/29.97fps files from my AX100 play "fine" on their own, but add a simple crossfade and the fps drops. Even a simple cut will bring the fps down for a second or so until it builds back up; and that is with all files running from a RAID-0 of SSDs. I just upgraded the camera to 100Mbps 4K, but I need to wait for the U3-class SDXC card to arrive. I hope the 100Mbps files are compressed lightly enough to play better.
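
For what it's worth, the raw data rate shouldn't be the bottleneck; some rough arithmetic (illustrative only):

[code]
# Rough data-rate arithmetic (illustrative, not measured): even 100 Mbps
# is tiny next to an SSD RAID-0's throughput, so the fps drop at a
# crossfade points at decode load (two 4K streams at once, plus the
# blend), not at disk speed.

def mb_per_second(mbps: float) -> float:
    return mbps / 8.0  # megabits per second -> megabytes per second

for mbps in (50, 100):
    one = mb_per_second(mbps)
    print(f"{mbps} Mbps: {one:.2f} MB/s per stream, "
          f"{2 * one:.2f} MB/s for the two streams in a crossfade")
[/code]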

megabit wrote on 16.03.2015 at 18:49
Looks like we're in the same boat, my friend, except that your dilemma is even more difficult, as your current CPU is much faster than mine; otherwise our goals are the same...

On the other hand, my decision is more complicated than it might sound, because I still have no idea which GPU card I should aim for. For various reasons (long story), I'd like to use one of those early UHD TVs (like the 50" Samsung or similar) as my preview monitor. Unfortunately, they all have only HDMI inputs (no DisplayPort), so to watch full 4K at 60 Hz over a direct HDMI link, the card must be capable of an HDMI 2.0 connection. This is currently available on NVIDIA cards for sure (they stress it in the specs), but I'm unable to find reliable information about AMD (particularly the R9, which I'd like to have). All they mention about output resolution is DisplayPort 1.2 being, of course, 4K@60 Hz, but they are vague about HDMI... Yes, it is said to provide 4K as well (with "deep color", whatever that means), but it might just as well turn out to be the HDMI 1.4a standard, which can only display 4K at 30 (24) Hz with reduced chroma resolution (so even though the colors might be "deep", which I guess is a bit-depth thing, the chroma would be only 4:2:0). Of course that last bit is OK for XAVC-S, which is 4:2:0 anyway, but the refresh frequency worries me. Do you happen to know whether the R9's HDMI is 2.0- or only 1.4-compatible?

Thanks,

Piotr

PS: I said 4:2:0 chroma resolution would not be a problem, but who knows, perhaps I'll want to switch to 10-bit 4:2:2 XAVC soon? Anyway, a graphics card with HDMI 2.0 compatibility seems like a must for me, all because I'm going to use a UHD TV instead of a 4K PC monitor (in which case DisplayPort would solve the problem).
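
A quick bandwidth sanity check shows why the HDMI version matters (a rough sketch: 4400x2250 is the standard CEA-861 total 4K raster including blanking, and the link figures are the published TMDS limits after 8b/10b coding):

[code]
# Approximate HDMI bandwidth check (rough numbers, standard 4K timing).

LINK_DATA_GBPS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4}  # usable Gbps after 8b/10b

def needed_gbps(fps: int, bits_per_pixel: int) -> float:
    total_w, total_h = 4400, 2250  # 3840x2160 active + CEA-861 blanking
    return total_w * total_h * fps * bits_per_pixel / 1e9

modes = [("4K 30p 8-bit 4:4:4", 30, 24),
         ("4K 60p 8-bit 4:2:0", 60, 12),  # 4:2:0 averages 12 bits/pixel
         ("4K 60p 8-bit 4:4:4", 60, 24)]

for label, fps, bpp in modes:
    need = needed_gbps(fps, bpp)
    fits = [name for name, cap in LINK_DATA_GBPS.items() if need <= cap]
    print(f"{label}: {need:5.2f} Gbps -> {', '.join(fits) or 'neither'}")

# Caveat: 4K 60p 4:2:0 fits HDMI 1.4's raw rate on paper, but the 4:2:0
# pixel format itself was only introduced with HDMI 2.0.
[/code]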

OldSmoke wrote on 16.03.2015 at 19:10
Piotr

No, I don't know if the R9 has HDMI 2.0, and I'm not sure how to find that out. I do know, however, that an ASUS Radeon R9 is different from an ASUS R9; I guess the same applies to other brands.

Also, don't mix up refresh rate with frame rate; they are different things. Early and even current 4K TVs may have a refresh rate of 60 Hz or higher, but the 4K input frame rate they accept is mostly only 30 (29.97), or 25p for PAL; 4K TVs that take 60 (59.94) or 50p are expensive.

Edit: Read this!

megabit wrote on 16.03.2015 at 21:29
What makes you say I'm mixing up refresh rate with fps? Far from it :)

You're a very nice and knowledgeable person, but please try to be just a little less patronizing...

Thank you,

Piotr

OldSmoke wrote on 16.03.2015 at 22:49
Piotr
Hz refers to refresh rate, not frames per second; two very different things.

megabit wrote on 17.03.2015 at 11:54
It seems like you haven't even noticed my post - good for you :)

OK, so you should know this: if a signal can only go as low as HDMI 1.4 allows (i.e. a 24 Hz or 30 Hz refresh rate, as there is simply not enough bandwidth for a higher refresh rate at 4K), AND your film happens to be the same (24 or 30 fps, respectively), then whether what you get on screen is watchable to your eyes depends on the display type. LED monitors usually don't flicker even at such low refresh rates, but try watching a plasma at, say, 24 Hz; the eyestrain will be unbearable!

So how is it possible to watch 24 fps (or 30 fps in NTSC areas, or 25p in PAL areas) comfortably? Simply by refreshing the screen at an integer multiple of the frame rate. For example, in the PAL area where I live, (HD)TVs usually refresh at 25, 50, 100, 150, 200... and sometimes up to 1000 Hz (in NTSC areas it would be 30, 60, 120 Hz and so on). This way, even on flicker-prone plasmas, you get a rock-steady picture at a low fps (like a 200 Hz refresh-rate screen displaying 25 fps progressive video).

The problem with HDMI versions lower than 2.0 is that, at 4K, such multiples of the frame rate are not available, and you must watch a 25 fps movie at a 25 Hz refresh rate; a 50p movie you cannot display at all.

This all changes with DisplayPort 1.2 and/or HDMI 2.0.
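
The integer-multiple rule in a few lines (the candidate refresh rates are just examples):

[code]
# A refresh rate shows a given frame rate cleanly only if it is an
# exact integer multiple of that frame rate.

def flicker_free(fps: int, candidates=(24, 25, 30, 50, 60, 100, 120, 200)):
    return [hz for hz in candidates if hz % fps == 0]

for fps in (24, 25, 30, 50):
    print(f"{fps} fps -> steady at {flicker_free(fps)} Hz")
[/code]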

Do you understand now? Fine!

Piotr

farss wrote on 17.03.2015 at 13:33
[I]" This all changes with display Port 1.2 and/or HDMI 2.0."[/I]

True, but where does one get a UHD TV that will actually accept an HDMI 2.0 signal at 50 fps?

From another forum: yes, one can get a pretty cheap UHDTV that claims to have HDMI 2.0, but only up to 30 fps at 4:2:2; at 60 fps you have to drop down to 4:2:0, and according to those who've bought these TVs the drop to 4:2:0 is very noticeable. And that's just from the unwashed masses, not the pixel peepers.

Now that I can borrow an F5 with an AXS-R5 recorder and 1TB of media, the question is what to do with all this data, how to process it, and how to preview it in a way that lets us actually see the difference. I've seen it done at the F5/55 product launch here in Sydney, but it wasn't using Vegas, and it was on a top-shelf HP "Z" series PC with everything maxed out and then some. Fed into very large and very expensive OLED displays, it was mind-blowing to look at. All in all, that's a lot of very expensive kit and a very old-school workflow of editing proxies and then conforming and grading on dedicated systems.

Fast-forward to today and, yeah, the F5 can be had for not much more than I paid for my EX1, but there hasn't been a matching price drop in the post-production kit required to justify all this 4K RAW, or even XAVC, goodness.

Bob.

megabit wrote on 17.03.2015 at 14:53
Bob,

You are right, but if you care to look at the information I posted about my needs and plans, I'm not going to deal with anything above XAVC-S, which is 25 fps and 4:2:0, anyway. So with a GeForce 7-series card and up (all of which provide HDMI 2.0 alongside DisplayPort), I could feed any UHD TV via HDMI 2.0 while keeping the refresh rate at 50 Hz (enough to avoid flicker, on an LED panel anyway). So far so good, but SCS's own benchmarks show AMD is much more effective at timeline acceleration with FX... BUT AMD does not offer HDMI 2.0; hence my problem. (Yes, they do offer DisplayPort 1.2 just like GeForce, so there would be no problem with a 4K PC monitor, all of which support it, but for some reason I'd prefer a UHD TV over a monitor.)

I hope I've been clear enough this time :)

Piotr

PS: Of course, when I say that "I'm not going to deal with anything above XAVC-S, which is 25 fps and 4:2:0, anyway", I only mean for the time being, i.e. on my current system with just the graphics card replaced. My GTX 580 is quite effective at timeline acceleration, but it has no 4K output whatsoever...

HyperMedia wrote on 17.03.2015 at 17:54
Just get an iMac with Retina 5K display. Then run Bootcamp, Parallels or VMware Fusion.
You should be okay for about 4 to 5 years.

I'm running a 2012 iMac, editing Full HD 4:2:2 at 50 Mbps and up, with some 4K footage, without any hiccups.

We are producing an internationally broadcast TV series; so far we've completed 71 episodes and are going into the fourth season.

wwjd wrote on 17.03.2015 at 18:15
I'm not tracking all the intricate details in this thread, but it seems like reaching the end goal is much harder here than it should be. I checked my card, bought a 4K monitor, and was up and running in a day or so.

Is this for a business with demanding specifics?

Rich Parry wrote on 17.03.2015 at 18:38
If it helps, I have a dual 12-core Xeon 2.26 GHz system giving me 24 cores. It does nothing for timeline/preview-window playback performance, which in my case is extremely slow, even with simple HD video.

I've been disappointed in my system. Not sure where the bottleneck is.

CPU: Intel i9-13900K Raptor Lake
Heat sink: Noctua NH-D15 chromax.black
MB: ASUS ProArt Z790 Creator WiFi
OS drive: Samsung 990 PRO NVMe M.2 SSD 1TB
Data drive: Samsung 870 EVO SATA 4TB
Backup drive: Samsung 870 EVO SATA 4TB
RAM: Corsair Vengeance DDR5 64GB
GPU: ASUS NVIDIA GeForce GTX 1080 Ti
Case: Fractal Torrent Black E-ATX
PSU: Corsair HX1000i 80 Plus Platinum
OS: Microsoft Windows 11 Pro

Rich in San Diego, CA

john_dennis wrote on 17.03.2015 at 19:24
"[I]Not sure where the bottleneck is.[/I]"

"[I]12 core Xeon 2.26MHz[/I]"

Your core clocks could be much higher.

VideoFreq wrote on 17.03.2015 at 22:12
I didn't have time to read all the replies, so I don't know if this has been addressed yet. The major difference between any standard Intel desktop processor and a Xeon? A Xeon chip runs at 100% capacity all the time. A multi-core "i" chip is an algorithmically demand-based, software-driven product. Your load indicator might say it's running at 100%, but that's an RMS value, and it takes time to ramp up to speed.

The Xeon was designed for servers, so that the random handling of massive amounts of data packets would not slow down the decision-making capability of the chip and affect system speed.