Hi everybody. I've been a Vegas user for a few months now, and I'm opening a video-editing lab where people can come in and do their own editing. We're building a number of high-end machines, and I decided to settle the old AMD vs. Intel argument once and for all (at least in my own mind).

I set up one machine with an Intel Pentium 4 3.0 GHz CPU with Hyper-Threading and 1 GB of DDR400 in a dual-channel configuration, and another with an AMD Athlon XP 3000+ and 1 GB of DDR333 on a dual-channel board. I installed Vegas 4 on both machines, updated each to the most recent version, and downloaded the render test from Sundance Media Group. In repeated runs of that render test, the Pentium was the clear winner, by a margin of around 30%. That amazed me; I had no idea the difference would be so large.

Then I moved to a real-world test: several photos imported, panned, cropped, and transitioned with page peels and page loops, then rendered to an AVI file using the NTSC DV template. This is much closer to what we'll be using these machines for on a daily basis. On this test, the AMD came out the clear winner. I ran it several more times to make sure, then ran it on other CPUs as well. Here are the results I came up with; render times are in seconds:
Athlon XP 3000+: 169
Athlon XP 2400+: 205
Intel P4 3.0 GHz HT: 208
Intel P4 3.0 GHz: 219
Athlon XP 2100+: 230
Intel P4 2.5 GHz HT: 253
Intel P4 2.5 GHz: 265
Intel P4 2.4 GHz: 287
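For anyone who wants to double-check the margins, here's a rough Python sketch that just ranks these times and computes each machine's percentage slowdown relative to the fastest. The numbers are the ones from the list above; the script itself is only illustrative:

    # Real-world render times in seconds, copied from the list above.
    times = {
        "Athlon XP 3000+": 169,
        "Athlon XP 2400+": 205,
        "Intel P4 3.0 GHz HT": 208,
        "Intel P4 3.0 GHz": 219,
        "Athlon XP 2100+": 230,
        "Intel P4 2.5 GHz HT": 253,
        "Intel P4 2.5 GHz": 265,
        "Intel P4 2.4 GHz": 287,
    }

    fastest = min(times.values())
    for cpu, t in sorted(times.items(), key=lambda item: item[1]):
        # Percent longer this machine took compared to the fastest one.
        slowdown = (t - fastest) / fastest * 100
        print(f"{cpu}: {t}s ({slowdown:.0f}% slower than the fastest)")

By that math, the P4 3.0 GHz HT took about 23% longer than the Athlon XP 3000+ on this project, which is nearly the opposite of what the Sundance render test showed.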
It seems that, at least for our particular application, the AMD chips handle rendering of these kinds of projects better. Now for the real question: can anyone tell me why the render test from Sundance Media would produce such strikingly different results from a test we set up ourselves? What exactly does it do that taxes these CPUs so differently? And are there any settings I should adjust on the Intel machines to help them fare better in real-world rendering tests like this one? Bear in mind that when I started this project, my sole intent was to decide which chips to use in our lab; I have no inherent preference for either. I'm honestly seeking other opinions before I commit several thousand more dollars to machines for the lab. Can anyone shed some light on this for me? Any help would be appreciated. You can respond here, or email me at dustin@videoscrapbookstudio.com.
Thanks,
Dustin