Some 4k Vegas timeline fps with new Dell XPS 15

Peter Siamidis wrote on 12/23/2015, 12:20 PM
Hey guys, I just sold my MacBook Pro and switched to the new Dell XPS 15 9550 with the 4k display. My old MacBook struggled a bit with 4k content even though it was a top-of-the-line model, because it still had an old 750m GPU in it. The new Dell has both an Intel 530 and an NVidia 960m, so I've been testing timeline performance in Vegas Pro 13. Boot Camp wouldn't let me use both the Intel and NVidia GPUs on Windows on my old MacBook because Apple won't support that, but both are supported on the Dell, so I was able to test each easily since they can be selected right from Vegas Pro. Some timeline fps numbers are below. Note that Filmconvert does use the GPU and is very processing intensive. The CPU is a quad-core Skylake i7.

1)
One 30fps 100 Mbps 4k video from a Sony AX100 camera.
Vegas preview window set at BEST/FULL.
Raw footage.
Intel 530: 30fps, ~75% CPU
NVidia 960m: 30fps, ~75% CPU


2)
One 30fps 100 Mbps 4k video from a Sony AX100 camera.
Vegas preview window set at BEST/FULL.
Added watermark and Filmconvert 2.0 effects.
Intel 530: 12fps, ~75% CPU
NVidia 960m: 13fps, ~75% CPU


3)
One 30fps 100 Mbps 4k video from a Sony AX100 camera.
Vegas preview window set at BEST/HALF.
Added watermark and Filmconvert 2.0 effects.
Intel 530: 23fps, ~75% CPU
NVidia 960m: 29fps, ~75% CPU


4)
One 30fps 100 Mbps 4k video from a Sony AX100 camera filling the entire screen.
One 30fps 50 Mbps 1080p video from a Sony A7s camera resized to fit the top left corner.
Vegas preview window set at BEST/FULL.
Added pan/crop, color corrector and sharpen effects to 4k video.
Added pan/crop, Filmconvert 2.0 and sharpen effects to 1080p video.
Intel 530: 3.5fps, 75-85% CPU
NVidia 960m: 7fps, 85-92% CPU


5)
One 30fps 100 Mbps 4k video from a Sony AX100 camera filling the entire screen.
One 30fps 50 Mbps 1080p video from a Sony A7s camera resized to fit the top left corner.
Vegas preview window set at BEST/HALF.
Added pan/crop, color corrector and sharpen effects to 4k video.
Added pan/crop, Filmconvert 2.0 and sharpen effects to 1080p video.
Intel 530: 12fps, 85-90% CPU
NVidia 960m: 15.5fps, 97% CPU


6)
One 30fps 100 Mbps 4k video from a Sony AX100 camera filling the entire screen.
One 30fps 50 Mbps 1080p video from a Sony A7s camera resized to fit the top left corner.
Vegas preview window set at BEST/FULL.
Raw footage.
Intel 530: 8.5fps, 85% CPU
NVidia 960m: 12fps, 95-100% CPU


7)
One 30fps 100 Mbps 4k video from a Sony AX100 camera filling the entire screen.
One 30fps 50 Mbps 1080p video from a Sony A7s camera resized to fit the top left corner.
Vegas preview window set at BEST/HALF.
Raw footage.
Intel 530: 18fps, 90% CPU
NVidia 960m: 21.5fps, 100% CPU


Overall I'm happy with the results, since I tend to work with a single 4k video clip in my footage; setting Vegas to BEST/HALF lets me view the footage at just about full framerate with all my typical effects added. Hopefully these numbers are useful to those wondering whether 4k on a laptop is feasible.
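For anyone who wants to eyeball the GPU difference directly, here's a quick sketch (my own, not part of the original tests) that tabulates the fps numbers from entries 2-7 above and computes the 960m's speedup over the Intel 530. Entry 1 is omitted since both GPUs hit the 30fps playback cap there. The entry labels are just shorthand I made up for the scenarios.

```python
# fps numbers copied from the benchmark entries above:
# (Intel 530 fps, NVidia 960m fps) per test scenario.
benchmarks = {
    "2) BEST/FULL, 1 clip, Filmconvert": (12.0, 13.0),
    "3) BEST/HALF, 1 clip, Filmconvert": (23.0, 29.0),
    "4) BEST/FULL, 2 clips, effects":    (3.5, 7.0),
    "5) BEST/HALF, 2 clips, effects":    (12.0, 15.5),
    "6) BEST/FULL, 2 clips, raw":        (8.5, 12.0),
    "7) BEST/HALF, 2 clips, raw":        (18.0, 21.5),
}

for name, (intel_fps, nvidia_fps) in benchmarks.items():
    # Ratio > 1.0 means the 960m played back faster than the 530.
    speedup = nvidia_fps / intel_fps
    print(f"{name}: 960m is {speedup:.2f}x the Intel 530")
```

The takeaway the numbers suggest: the 960m's advantage is smallest in the Filmconvert-bound single-clip tests and largest in the heavy two-clip BEST/FULL test, where it's roughly twice as fast.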

Comments

astar wrote on 12/23/2015, 3:40 PM
You don't really specify what your codec is. Assuming it's straight from the AX100, you mean XAVC-S content? Long-GOP formats have an overhead to work through; I would run the same tests with XAVC-I or Cineform intermediate conversions.

4K consumer compression settings have more to do with product line-up placement and with using low-cost, low-bandwidth storage media. The high compression not only means reduced image quality, but also a large overhead in decompressing that material when attempting to edit it. The same is actually true for AVCHD material at any bitrate; it's just that processing power has increased to the point of not noticing it as much. A digital intermediate like Cineform, HDCAM, XAVC-I, or XDCAM is the best way to edit on the timeline.

The 980m with 4GB of VRAM would be the only NVidia mobile chipset to get, and even then the mobile chip is pretty weak. The 960m is basically HD 5770-class hardware with 4K abilities.

It would be interesting to see the stats run at PREVIEW/HALF. I rarely run BEST/FULL while editing, and normally only pop into BEST/FULL to set effect levels that require seeing the result. I feel a high frame rate is more important than WYSIWYG in my editing style.

Peter Siamidis wrote on 12/23/2015, 7:06 PM
Yeah, all XAVC-S. I won't use intermediate codecs: they take way too much time and too much storage, they're a huge pain in the ass to archive and to revisit, they're just a colossal hassle. I won't touch them, but to each their own. Likewise I never use "Preview" for the timeline, always "Best", because I take screen grabs from my videos for image galleries on my websites and those look terrible in Preview mode. It's not worth toggling back and forth, because ultimately I'd forget and end up with a set of useless screen grabs that I'd just have to go back and redo, wasting more time. Everyone has their workflow and that's mine; your mileage may vary, as they say. I just showed numbers for how I work in case they're of interest to others. The good news is that laptops have come a long way, and it's pretty easy to get good 4k timeline performance now. I do wonder how the Iris version of Intel's Skylake GPU would perform, but alas I don't have one to test.