Why is PC #1 rendering faster than PC #2?

newmediarules wrote on 27.04.2010 at 16:55
Okay, in the right corner I have this:

Intel Core 2 Duo CPU
E8200 @ 2.66 GHz
1.97 GHz, 3.25 GB RAM
XP
V8.0c

And in the left corner, I have this:

AMD Athlon 64 X2
Dual Core 6400+
3.21 GHz, 4 GB RAM
Win7 64-bit
V8.1


I did a side-by-side comparison test render using the exact same Vegas timeline and data. The Intel render was maybe 10% faster.

I was told that the AMD render would blow away the Intel. What am I missing?

As usual, thanks in advance for any info you can provide.

Comments

TheHappyFriar wrote on 27.04.2010 at 17:03
The Core 2s are better than the X2s. They're second-generation multi-core CPUs and the X2s are first generation. A better comparison would be a Pentium 4 dual core or a Core Duo (not Core 2).

At the time of release, though, the X2s were a LOT cheaper than the Core 2s. Measured in dollars per second of render time, the AMDs were more efficient. I'd say they still are, but sometimes the seconds matter.
John_Cline wrote on 27.04.2010 at 18:56
"I was told that the AMD render would blow away the Intel."

Someone either lied to you or didn't know what they were talking about. AMD had the speed advantage for about three weeks five or six years ago; since then, Intel has cleaned their clock (pun intended).

The processor wars are over and Intel won by a knockout. AMD now competes with Intel on the budget end of the market and only on price.
A. Grandt wrote on 27.04.2010 at 19:17
Actually, when the Athlon CPUs first made an appearance, it took Intel quite a while to catch up; it wasn't until they released the Core 2 series that they caught up to AMD, and AMD has been playing catch-up ever since. Until then Intel had been drumming that clock speed was everything with the P4 series, but it just didn't measure up. It wasn't until Intel released the Centrino series that they noticed their old PIII technology (which Centrino was built on) was able to outperform the P4; that made them rethink their approach entirely, and they seemingly scrapped their deep execution pipelines overnight.

However, AMD has a valid selling point: if you calculate the "bang for the buck," AMD is the clear winner. But if you have a decent budget and need the fastest Windows PC you can possibly get, you have to go for Intel.
If I understand it right, the Intel chips have a few things that, when added up, make for a very strong performer. Generally they have more cache built in, and it's faster to access. The cache controller is apparently far better than AMD's, as is the chip's branch predictor. Or at least they were; I haven't looked it up lately.
Also, Hyper-Threading helps keep the chip busy. When HT was first introduced it was a bit of a disappointment, to put it mildly, but today, with more and more applications multi-threaded and able to use multiple cores, that little detail adds quite a bit.
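The multi-core point above can be sketched with a toy example (a hypothetical workload using Python's standard multiprocessing module, nothing to do with Vegas's actual render code): a CPU-bound job is split into chunks and handed to one worker per core, which is roughly why extra cores and HT shorten render times for multi-threaded apps.

```python
# Toy illustration (not Vegas code): a CPU-bound "render" split into
# chunks and farmed out to one worker process per core.
from multiprocessing import Pool, cpu_count

def crunch(n):
    # Stand-in for one CPU-bound chunk of work: sum of squares below n.
    return sum(i * i for i in range(n))

def render_serial(chunks):
    # One core does every chunk in turn.
    return [crunch(n) for n in chunks]

def render_parallel(chunks):
    # One worker per core; chunks run concurrently, which is roughly
    # why more cores (and Hyper-Threading) cut total render time.
    with Pool(processes=cpu_count()) as pool:
        return pool.map(crunch, chunks)

if __name__ == "__main__":
    chunks = [200_000] * 8
    # Same answer either way; the parallel version just finishes sooner
    # on a multi-core CPU.
    assert render_serial(chunks) == render_parallel(chunks)
    print("workers used:", cpu_count())
```

A single-threaded app only ever exercises `render_serial`, which is why a fast dual core can still beat a slower quad core on old software.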

Don't count AMD out yet; they may not make the fastest chips at the moment, but they do have their place since you get a lot of power for less, and I'm quite interested in seeing real-life comparisons of their new 12-core Opteron and 6-core Phenom IIs against Intel.
Former user wrote on 27.04.2010 at 19:43
It has a lot to do with how the CPUs handle data (and how many data streams they can crunch at once, which is the ultimate oversimplification of the issue). I have two i7 desktops and an i5 laptop. They really get the job done (ASUS motherboards built to overclock, though I'm pretty conservative that way).

Don't worry too much. By the time you need to do your next upgrade, everyone (hopefully) will be coding software to use the excess power locked away in GPUs. Once CPUs and GPUs start working together, we'll have a major leap forward in desktop production (it's slowly starting to happen, but it's going to require major recodes for a lot of apps).

Whenever I have the feeling that my hardware is letting me down, I just turn the wayback machine to 2007 and realize that I have four times the power now that I did then. And even then I was doing okay. :-)
A. Grandt wrote on 27.04.2010 at 21:20
Ten years ago, I saw a display at a hardware store. They had two machines in the window, costing the same. Thing is, one was a 1990 model, and its price tag was from 1990; the other was brand new at the time. Even then it really stunned me how primitive PCs were in 1990. I guess a similar display today would knock my socks off.
Thing is, back then you would feel like you were the king of the world with a PC like that on your desk; now my cell phone would be considered a relic if it had that little computing power.
jazzmaster wrote on 05.05.2010 at 21:11
I remember when a 100 MHz computer was the top of the line!! Ran it until 1998! Since I had one of the first Video Toasters (1990), I also had an Amiga and have been missing the Amiga ever since it went bust. It internally synced with a video signal; PCs couldn't touch it.

I remember when a Hollywood editing studio called me up one day and said, "Hey, you spent $50,000 with us last year and nothing this year, what gives?" I told them I had a Video Toaster and could do a page peel on my desktop. He replied, "And it probably looked just as good as ours." All those Hollywood studios either did a big changeover or went out of business, thanks to the Video Toaster.
Steve Mann wrote on 06.05.2010 at 04:24
"I remember when a 100 MHz computer was the top of the line!!"

I must be old. I remember when the 8-bit PC ran a whopping 4.77 MHz. The PC-AT upped that to 6 MHz, and overclocking then was replacing the clock crystal with an 8 MHz one.