Comments

Bill Ravens wrote on 11/10/2004, 3:35 AM
I think you may have a small misunderstanding. 64-bit apps have a much deeper bit depth to the math that's done, 64 bits rather than 32. As such, video apps in particular can carry much better color depth. While the number of calculations per cycle goes up, so does the total number of calculations, due to the increased bit depth. As a result, more calculations are done, but the actual rendering time stays about the same. At least, that has been my experience with SGI and Sun UNIX boxes.
TheHappyFriar wrote on 11/10/2004, 8:25 AM
I think you're confusing color depth with processor type. 64-bit CPUs have no effect on the color depth something can process (a 32-bit Intel and a 64/128-bit SGI both still display 24/32-bit color, but theoretically you could have 64-bit color on a 64-bit machine at the same speed as 32-bit color on a 32-bit machine).
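A rough C sketch of what I mean (my own toy example, not from any real app): color depth lives in the pixel format, so a 32-bit RGBA pixel is the same four bytes whether the CPU's registers are 32 or 64 bits wide.

#include <stdint.h>
#include <stdio.h>

/* 8 bits each for R, G, B, A = "32-bit color". This is a property of the
   pixel format, not of the CPU's register width. */
typedef uint32_t rgba_pixel;

int main(void) {
    rgba_pixel p = 0xFF8040FFu;  /* R=0xFF, G=0x80, B=0x40, A=0xFF */
    /* sizeof p is 4 bytes whether this is built for a 32-bit or 64-bit CPU */
    printf("one pixel takes %u bytes\n", (unsigned)sizeof p);
    return 0;
}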

I think most "commercial" apps won't come out with 64-bit versions until Win64 comes out of beta. As usual, the computer entertainment industry (aka games) will start doing this first. In fact, they already have: there's a 64-bit version of Unreal Tournament 2004 (Windows 9x/ME/2000/XP, Linux), and one is planned for Doom 3 (which already covered Windows, Linux, & Mac within 4 months).

See the pattern? Games were the first thing to use 3D cards, & then other apps picked up on them (on the PC anyway). Games were used to get DirectX off the ground (and keep it off).
Gamers themselves test out all the new CPUs so that everyone else can get them cheap a couple of months down the road (thanks to all those paranoid gamers out there who have to have the latest thing!)

:)
Bill Ravens wrote on 11/10/2004, 8:27 AM
Happy...

No, my wording may be incorrect, but the general idea is not. It's a great urban legend that 64-bit computers are faster. They are NOT! Even with 64-bit apps. Ask any IT guy. Even better, ask someone who's used UNIX, whether for video or not. At best, it's a misconception fostered by computer manufacturers to boost sales, especially AMD.
Chienworks wrote on 11/10/2004, 9:20 AM
This is true to some extent. A 64-bit processor can grab and manipulate twice as many bits in one operation as a 32-bit processor can. However, there are diminishing returns. Most of the time a piece of software is manipulating small pieces of data. For example, an RGB pixel consists of 24 bits, while one with alpha consists of 32. It's unlikely that, in all situations, a more efficient algorithm will be developed that works on two pixels at once instead of one that works on a single pixel and is simply executed twice. Therefore, when calculating RGB, a 64-bit processor is basically wasting 40 of its bits.

OK, that was a huge oversimplification of the idea, especially if the calculations are done in floating point rather than in integer. But at least it shows the concept that more bits aren't always more efficient. Where the cutoff line falls will be different for different types of data manipulation. 64 bits may very well work better for scientific and pure mathematical functions. 16 bits is probably overkill for text processing. For video work, maybe anything more than 32 ends up being less than useful.
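To make the "two pixels at once" idea concrete, here's a toy C sketch (my own example, nothing to do with how Vegas or anything else actually renders): halving the brightness of two packed 32-bit RGBA pixels with one 64-bit shift-and-mask, versus one pixel at a time. The mask clears the top bit of each byte so a bit shifted out of one channel doesn't bleed into the channel next to it.

#include <stdint.h>
#include <stdio.h>

/* one 32-bit RGBA pixel at a time */
static uint32_t halve32(uint32_t px) {
    return (px >> 1) & 0x7F7F7F7Fu;
}

/* two 32-bit RGBA pixels packed into one 64-bit word */
static uint64_t halve64(uint64_t two_px) {
    return (two_px >> 1) & 0x7F7F7F7F7F7F7F7FULL;
}

int main(void) {
    uint32_t a = 0xFF804020u, b = 0x10203040u;
    uint64_t packed = ((uint64_t)a << 32) | b;
    uint64_t out = halve64(packed);

    /* one at a time */
    printf("%08X %08X\n", (unsigned)halve32(a), (unsigned)halve32(b));
    /* both at once: same results */
    printf("%08X %08X\n", (unsigned)(out >> 32), (unsigned)(uint32_t)out);
    return 0;
}

In practice that kind of packed-pixel trick is what MMX/SSE already does on 32-bit chips, which is part of why a plain 64-bit integer register doesn't buy video work as much as people expect.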
TheHappyFriar wrote on 11/10/2004, 10:52 AM
I think the advantage of 64-bit for video work would be that you could process two pixels at once instead of one (assuming each pixel is 32-bit). That's only if something is written that way, though, and completely rewriting a rendering engine like Vegas's would probably be a pain in the ass (to make it run better, anyway).

And I've used SGIs running IRIX. Maya 1 on a single-processor SGI O2 @ 200 MHz ran just as well as a dual P3-400.

A bit is a bit to a computer, whether it's a weather prediction program (pun intended) or video. If someone writes it correctly, then it will run better (or else we're all just using overclocked 386s anyway, which were 32-bit).