Athlon X2 RenderTest Results

TheRhino wrote on 6/25/2005, 12:27 PM
ATHLON X2 REPORT. . .

I just upgraded one of my workstations to an Athlon X2 4400+ on a DFI NF3 Ultra-D. This setup still needs some tweaking and everything is at default/stock settings. [Windows XP Pro installed]

Vegas Video 6.0b Tests [fresh install, no tweaks, no preview window]
Original Rendertest = 0:40 in "best" mode AVI DV Codec
Original Rendertest = 0:25 in "good" mode AVI DV Codec
Original Rendertest = 0:41 conversion to mpeg2 "best" setting

Update: My own test: a 16-minute AVI clip to MPEG-2 - straight cuts, color correction throughout, contrast and brightness adjustments. [Think wedding video - no special effects, just good correction. . .]

My Athlon Mobile 2400+ OC'd to 2.1GHz = 25 minutes
New X2 4400+ at stock speeds = 10 minutes

Today is a good day!

2nd Update: BetaTest_Render_Nu = 2:47:23 in "good" mode, AVI DV codec;
like everyone else, things slowed down at the 80% mark. . .

[Note I updated these numbers after turning off the preview window and switching to different memory sticks so that dual channel would work. . .]

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

Comments

Chanimal wrote on 6/25/2005, 1:21 PM
Hey folks,

Check out this awesome review for the X2 versus Opteron, Intel, Athlon FX-55.

http://techreport.com/reviews/2005q2/athlon64-x2/index.x?pg=1

It SMOKES everything in almost every single test!

This time, the AMD X2 kills Intel on the DivX and the Windows Media render tests.

Also, the 4200+ with 512KB of cache smokes almost everything too, and it's only barely behind the 4800+ with 1MB per core. At $531-$575 it looks like I'll be upgrading this weekend!

***************
Ted Finch
Chanimal.com

Windows 11 Pro, i9-10850K (20 logical cores), Corsair water-cooled, MSI Gaming Plus motherboard, 64 GB Corsair RAM, 4 Samsung Pro SSD drives (1 TB, 2 TB, 2 TB and 4 TB), AMD Radeon RX 580 video card, 4 Dell HD monitors. Canon 80D DSLR camera with Rode mic, Zoom H4 recorder. Vegas Pro 21 Edit (user since Vegas 2.0), Camtasia (latest), JumpBacks, etc.

farss wrote on 6/25/2005, 5:30 PM
I think it was a DMN review that found the Apple AE render test ran twice as fast under Win64, on dual dual-core AMDs. Not exactly a cheap box, but still a little cheaper than the G5 it was up against.
Under Win32 it was only 30% faster. Can it be that even uStuff have got their act together?
TheHappyFriar wrote on 6/25/2005, 7:34 PM
Whoohah... a screamer. I DID find it weird though that the single-core chips outperformed the dual-core chips in the Doom 3 test... especially since Doom 3 supports multiple threads.

The Opterons also seem really wattage-efficient... dual dual-core Opterons use 1.5x as much power as a dual Xeon, but also use less than the dual Xeon when fully loaded.
StormCrow wrote on 6/25/2005, 8:09 PM
OK, I'm looking on Newegg and they don't even carry an X2 4400? Are you sure that's the chip?

They have the:

Athlon Processors

GlennChan wrote on 6/25/2005, 9:07 PM
stormcrow: Please edit your post so the link isn't so huge. You can use html tags.

For example:
<a href="http://www.newegg.com/whatever_your_link_is">Any text here</a>

will look like...
Any text here

The 4400+ doesn't seem to be available at Newegg, but it's basically the 4200+ with twice the cache. Similarly, the 4800+ has twice the cache of the 4600+.
TheRhino wrote on 6/25/2005, 9:34 PM
I purchased the OEM 4400+ on Tuesday night and it arrived on Friday. . . Newegg changes its listings all the time based on what it has in stock, so you may not see it again until it's back in stock.

StormCrow wrote on 6/25/2005, 9:51 PM
Thanks Glenn... I never knew how to do that!

So from what you all are saying, I'm guessing that a Toledo would really fly! Is it faster than a top-of-the-line Opteron?
GlennChan wrote on 6/25/2005, 11:48 PM
Doubling the cache makes a few to several percent difference in the majority of benchmarks out there. For Vegas 5 / rendertest.veg, a few folks have posted results for the FX and EE lines of processors (double the cache, many times the price), and they are a few percent faster from what I remember. A step up in clock speed is typically better, as that means roughly a 6% improvement in performance (dividing clock speeds **within the same processor line** happens to be a very good guesstimate of relative performance; a rough worked example follows the spec list below).

AMD X2 4800+ = 2x 2.4GHz, 2x 1MB cache
4600+ = 2x 2.4GHz, 2x 512KB cache
4400+ = 4800+ minus 200MHz clock speed

Opteron 275 = 2x 2.2GHz, 2x 1MB cache (top-of-the-line dual core; four cores total in a two-socket system)
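As a rough worked example of the divide-the-clock-speeds rule (a sketch only, in Python; it ignores cache and memory effects, and the 40-second baseline is TheRhino's "best" mode rendertest figure from the top of the thread):

# Rough guesstimate only: render time is assumed to scale inversely with core
# clock within the same processor line; cache and memory effects are ignored.
def estimated_render_time(known_time_s, known_clock_ghz, target_clock_ghz):
    return known_time_s * known_clock_ghz / target_clock_ghz

# Scale the X2 4400+ (2.2GHz) 40s result to the 4800+ (2.4GHz):
print(estimated_render_time(40, 2.2, 2.4))  # ~36.7s, i.e. roughly 9% faster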
TheHappyFriar wrote on 6/26/2005, 6:43 AM
I looked at Newegg and they have the 4200+ for ~$100 more than the MSRP the review stated. Is there a reason?
GlennChan wrote on 6/26/2005, 10:01 AM
Stormcrow, Spot has a dual core dual Opteron 275 system with a Tyan motherboard (not sure if it's THAT one).

Bug him to put his rendertest.veg (the old one) results up! :P
See the "holeee smokes!" thread here.
StormCrow wrote on 6/26/2005, 8:07 PM
Yeah, Spot and I have talked a bit about his system and it seems really fast. I was just wondering whether it's as fast as the system being discussed here, or faster. I want my next build to be smokin'.
GlennChan wrote on 6/26/2005, 10:57 PM
I want my next build to be smokin'
You could always overclock it. ;P



Tongue in cheek aside:
A- Overclocking doesn't cause your computer to smoke.
B- Overclocking really does improve performance. Divide the clock speeds to guesstimate the gain, so yeah, your system would be smoking in that regard. Dual-core dual Opterons may not overclock well, depending on the motherboard.
C- It does void your warranty. Overclocking typically does not damage anything in your computer, although there are some reports of "Northwood Death Syndrome", which hits Northwood-core Pentium 4s. Other processors generally take a lot of abuse; if you don't do anything stupid with voltages, you should be fine. It's probably more common for people to kill their CPU while installing it, anyway.
D- With overclocking, the biggest risk is instability. Be sure to check system stability with a program like Prime95. Probably the best strategy is to overclock to the maximum, then back the system off a bit so you have some safety headroom.
TheHappyFriar wrote on 6/26/2005, 11:29 PM
a few other things to note:

*Some motherboards come with overclocking settings, and a nice auto-recovery system if the overclock doesn't boot.

*Overclocking can DECREASE performance if done improperly. A higher CPU clock with a decreased memory speed doesn't really help...

*Increasing the CPU FSB past certain points can mess up AGP cards. You won't really know until you try, though...

And lastly, set up the emergency shutdown temperature in your BIOS. That way you'll be sure you don't accidentally fry the CPU.
GlennChan wrote on 6/27/2005, 8:46 PM
A few notes on TheHappyFriar's notes: ;)

Auto-recovery is definitely nice, but you can also reset the CMOS by moving the jumper.

A higher CPU clock with a decreased memory speed really does help.
In my own tests, which changed the memory divider to 5:4 and 3:2, dropping the divider (to underclock the RAM) only affected performance by a few percent.
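For reference, here is the divider arithmetic as a minimal Python sketch; the 200MHz base memory clock is just an illustrative figure for a typical DDR400/PC3200 setup, not a number from my tests:

# Illustrative only: effective RAM clock for a few common CPU:RAM dividers,
# assuming a nominal 200MHz memory clock at 1:1 (typical DDR400/PC3200).
base_mhz = 200
for cpu, ram in [(1, 1), (5, 4), (3, 2)]:
    print(f"{cpu}:{ram} divider -> RAM at {base_mhz * ram / cpu:.0f} MHz")
# 1:1 -> 200 MHz, 5:4 -> 160 MHz, 3:2 -> 133 MHz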

You can certainly mess up AGP cards, which is why you should choose a platform with a fixed AGP/PCI clock. The AMD64 systems didn't have that when they first came out.

An emergency shutdown temperature isn't really necessary unless you screwed up the heatsink/fan installation, in which case you'd be in trouble anyway.
The latest processors will shut down by themselves. The old Athlon XPs didn't do that, and it was possible to fry them if the heatsink/fan wasn't installed properly. You wouldn't even be able to get into the BIOS to overclock such a chip, because it would have fried first.
MohammeD T wrote on 6/27/2005, 9:39 PM
TheRhino, thanks for the tests. I am also interested in the new X2 Athlons. Would you please give us more info on your build and the part numbers of those parts? Also, what would you change if you had the chance to build it again? It would be much, much appreciated... thanks.
TheHappyFriar wrote on 6/28/2005, 4:37 AM
You didn't see any difference in render times between RAM speeds? That's interesting! I did. :)

I found that my V5 rendering time decreases drastically when I OC my AMD 64 3000+ from 1.8GHz to 2GHz, but past the 2GHz mark my memory speed drops from 200MHz to 166MHz, and my render time only improves by ~1-2 seconds all the way up to 2.2GHz. That's a 1-2 second gain between 2GHz and 2.2GHz, versus a 6-8 second gain between 1.8GHz and 2GHz.

This shows up in Vegas, games, and other benchmarks. I only have single-channel RAM though; maybe that makes a difference. My RAM isn't super great either.
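As a rough back-of-the-envelope check on those numbers (a sketch only, assuming render time scaled purely with CPU clock): both overclock steps should give a similar percentage gain, so the much smaller gain observed past 2GHz points at the 200MHz-to-166MHz RAM drop as the likely bottleneck.

# Back-of-the-envelope: with pure clock scaling, each step should shorten the
# render by a similar percentage; the tiny gain observed past 2GHz suggests the
# RAM dropping from 200MHz to 166MHz is what's holding things back.
for low, high in [(1.8, 2.0), (2.0, 2.2)]:
    print(f"{low}->{high} GHz: ~{(1 - low / high) * 100:.0f}% shorter render expected")
# 1.8->2.0 GHz: ~10% expected (and a ~6-8s gain was observed)
# 2.0->2.2 GHz: ~9% expected (but only a ~1-2s gain was observed)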
TheRhino wrote on 6/28/2005, 9:05 PM
Here are the contents of the system in the OP:

Upgraded Components: $850 Newegg
--Athlon X2 4400+
--DFI NF3 Ultra-D Motherboard
--Thermaltake XP120 Heatsink
--120mm fan

Existing Parts [this workstation]:
--HighPoint RocketRAID 100 [keeps data if you swap motherboards. . .REAL handy]
--Corsair TwinX1024 3200 Memory
--Matrox Parhelia 650 AGP video card
--Arctic Silver paste
--Cooler Master CM Stacker Case
--CM Crossflow Fan [air directly over motherboard for CM Stacker cases]
--CM Multi HD bay [big fan blowing air right over the RAID/system drives]
--NEC ND-3520? Dual Layer
--Two 160GB HDs on IDE RAID [didn’t pay to go bigger because it all has to be transferred to a single external drive. . .]
--One 160GB HD for the OS
--One 160GB on the shelf with an exact Ghost OS image – ready to go. . .
--550W PSU ??? One of the highly recommended expensive ones. . .

Multiple external USB & FireWire drives ranging from 200GB to 400GB.

What would I do differently?
Add a PCI card with more FireWire ports; the DFI motherboard only has two, and that is just enough to get by for now. . .
Stop worrying about how much money I spend annually on upgrades and treat myself to a 24” Dell LCD!

jlafferty wrote on 6/29/2005, 10:36 PM
Forget the render tests -- how fast do these things launch the Media Manager? :D
MohammeD T wrote on 7/4/2005, 12:48 AM
Thanks Rhino, I would certainly take those parts into consideration.
philfort wrote on 7/9/2005, 6:00 PM
I get 74 seconds with my X2 4400+ (1024MB dual-channel PC3200 RAM from OCZ), using the render test here ("best" mode):
http://www.vasst.com/resource.aspx?id=35443070-0b67-4a2e-807c-a7f431ebd02d

In "good" mode, 50seconds.

I checked, and both CPUs are being used almost to their full extent, yet the render time is almost double the result TheRhino is getting. What's up with that?
Even stranger, when I specify that only 1 rendering thread be used, I get the same 74s render time, with 50% CPU usage (each core doing half).

Then I tried an 8-minute DV clip to MPEG-2, with color correction and brightness/contrast. I got 5:38, which is more in line with your 10-minute time for a 16-minute clip.
philfort wrote on 7/10/2005, 7:05 AM
Hmm... if I open two instances of Vegas, specify only 1 render thread in each, and render simultaneously, they each render in about 80s.
philfort wrote on 7/11/2005, 12:55 AM
OK, I finally figured this one out. If I set my RAM preview to a respectable number (100MB), I get times comparable to TheRhino's. I'm getting 39s for rendertest.veg in "best" mode.

My RAM preview was set to 16MB, which is the default. Not sure why that makes a difference - I guess the memory allocated to RAM preview also determines how much memory is available during the rendering process, to be shared by the 2+ threads doing the rendering? If there isn't enough, maybe each thread needs to re-render the same piece of the project.
DJPadre wrote on 7/11/2005, 7:08 AM
I did not know that keeping the preview window open would affect render times??

Can someone confirm this please??