OT: Green Computing and Vegas

Cliff Etzel wrote on 1/8/2008, 11:56 AM
Got to thinking lately about this concept of the hardware requirements for Vegas and building a small form factor desktop for editing that has low power consumption.

True, quad core is the way to go for long-form HDV editing, but many of the projects I shoot aren't more than 5-10 minutes max right now, and since getting Gearshift I haven't felt too compelled to edit native m2t files.

I've been toying with the idea of building a microATX form factor editing station that uses a lot less energy - specifically built around the AMD Athlon X2 BE-2400 Brisbane 2.3GHz AM2 45W Dual-Core Processor (possibly overclocking it by 10% or so).

The idea is to see how low I can go in power consumption while still striking a balance between editing efficiency and the amount of power being consumed. As far as I can tell, the AMD Brisbanes are the only desktop CPUs rated at 45 watts.

Secondly, since I don't do any gaming, are there truly any disadvantages to using integrated video while editing with Vegas, especially if I'm running x64 XP Pro with 4GB or more of RAM? Since VP8 doesn't utilize the GPU on a dedicated graphics card, and I don't work with any compositing apps that might require one, integrated video seems to be a non-issue.

The idea was sparked after seeing this posting on Gizmodo.

Combine this with a widescreen LCD monitor, which is reported to use as little as 1/3 the power of a CRT, and I wonder just how much savings there would be in power usage.

Here's what I have spec'd out for a low power consumption editing machine:

AMD Athlon X2 BE-2400 Brisbane 2.3GHz AM2 45W Dual-Core Processor
GIGABYTE GA-MA69GM-S2H AM2 AMD 690G HDMI Micro ATX AMD Motherboard
4GB Corsair Dual Channel RAM (Already Have)
Seagate 160GB 7200RPM IDE boot drive (Already Have)
NEC CD/DVD Burner (Already Have)
Seagate 320GB SATA editing drive (Already Have)
Antec Power Supply - 480 watts (Already Have)
Case - TBD

The other IDE drives I currently have I can put into enclosures as needed, since this board only supports 1 IDE port (2 IDE devices).

Any thoughts?

Cliff Etzel - Solo Video Journalist
bluprojekt

Comments

Kennymusicman wrote on 1/8/2008, 2:14 PM
My initial response would be: Laptop.....

After all, laptops are generally designed to utilise low power.
farss wrote on 1/8/2008, 2:25 PM
You can end up chasing your tail trying to reduce your carbon footprint. What you need to look at is efficiency, not power consumption. A figure of joules per frame is the only way to make the comparison. A low power machine may use more energy to get the job done than a high powered one... or not. Of course, if you also leave the machine spinning its wheels a lot of the time, things get even more complex.
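
As a rough back-of-the-envelope sketch of that comparison (Python, with completely made-up wattages and render times just to show the arithmetic):

# Back-of-the-envelope comparison of energy used per render job.
# Every number here is invented purely to illustrate the arithmetic.

def render_energy_kwh(system_watts, render_hours):
    # Energy consumed for one render, in kilowatt-hours.
    return system_watts * render_hours / 1000.0

# Hypothetical low-power dual core: draws less at the wall, but renders slower.
low_power = render_energy_kwh(system_watts=90, render_hours=3.0)

# Hypothetical quad core: draws more, but finishes the same render sooner.
quad_core = render_energy_kwh(system_watts=220, render_hours=1.0)

print(f"Low-power box: {low_power:.2f} kWh for the render")  # 0.27 kWh
print(f"Quad-core box: {quad_core:.2f} kWh for the render")  # 0.22 kWh

With those invented numbers the hungrier box actually comes out ahead on energy per job; swap the assumptions and it flips, which is exactly why joules per frame (or kWh per render) is the figure to compare.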

If you dig around some of the BOINCing fora you should find a bit of info on this; one of the parameters people judge their rigs by is how much energy per calc is used. I'm told some of the answers are not in line with what you might expect.

Bob.
Cliff Etzel wrote on 1/8/2008, 3:00 PM
Bob - as always you provide food for thought.

I kind of wondered if going this route was the answer. I read this Information Week article and was trying to determine if much of the same information would apply to how we edit video.

Your point does seem to make sense - leaving a lower powered computer running longer to render could defeat its low carbon footprint compared to a CPU that uses more power but renders a final project out quicker, in which case the low power route actually isn't more energy and time efficient.

Since I leave my machine running 24/7, I was looking at this from the standpoint of overall energy cost savings in my home office. It may be I need to look at things like an LCD monitor, which is reported to use 1/3 the energy of a CRT. The price on a 19" widescreen LCD is very reasonable right now. Or maybe I just need to buck up and go Phenom quad core...

Cliff Etzel - Solo Video Journalist
bluprojekt

johnmeyer wrote on 1/8/2008, 9:48 PM
While well-intentioned, you are wasting your time. Bob has hit the nail on the head, namely that you need to look at watts/project or some other ratio metric. If a computer consumes twice as much power but renders three times faster, guess what? You'll use less energy (at least while rendering).

Also, if you are serious about reducing power consumption (whether you want to be "green" or whether you just want to save money), you need to look at ALL the energy consumed in your home or business and apply your efforts where they will do the most good. People do the silliest things, mostly just to make themselves feel good, without understanding that the only thing that matters for electricity consumption is kilowatt hours.

As an example, I had my refrigerator recharged (and repaired so it didn't leak refrigerant) and cut my monthly electricity bill in half! That's pretty extreme, but it demonstrates what happens when a high consumption item operates for many hours a day, every day. A computer certainly operates long hours, but it draws only a few hundred watts at most (although if you get everything cranking, it can be more).

If you are really serious about reducing energy consumption, you should purchase and learn how to use a Watt Meter. This is the ONLY way to tell how much power your computer is actually consuming (the power supply nameplate tells you nothing about actual power consumption).
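
Once you have real readings from the meter, the arithmetic is simple. Here's a sketch with invented placeholder numbers - your measured wattages, hours, and local rate go in their place:

# Turn watt-meter readings into a monthly kWh figure and a rough dollar cost.
# The wattages, hours, and electricity rate below are invented placeholders.

idle_watts = 120          # measured at the wall while the machine sits idle
render_watts = 250        # measured at the wall while rendering flat out
idle_hours_per_day = 20
render_hours_per_day = 4
rate_per_kwh = 0.12       # dollars per kWh; plug in your local rate

daily_kwh = (idle_watts * idle_hours_per_day
             + render_watts * render_hours_per_day) / 1000.0
monthly_kwh = daily_kwh * 30

print(f"About {monthly_kwh:.0f} kWh per month, "
      f"roughly ${monthly_kwh * rate_per_kwh:.2f} at that rate")

Put that number next to the rest of the bill and you can see why the refrigerator mattered far more than the computer.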

farss wrote on 1/8/2008, 10:46 PM
I'll add another tip. Check out your wall warts / plug packs or whatever those black lumps get called in your area. Almost every bit of video gear seems to have them, and even though you switch off the gear those things are still drawing power and doing nothing but keeping the roaches warm. Some seem to use as much power with the gear off as when it's on, due to poor design. Plugging them into one powerboard with a master on/off switch could save a few dollars over the year and remove another fire hazard when your home is unattended.
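
A quick sketch of what a handful of those lumps can add up to over a year, with guessed figures (meter your own to be sure):

# Rough annual cost of a few wall warts left drawing power around the clock.
# The standby wattage, adapter count, and rate are guesses for illustration.

standby_watts_each = 2    # a poorly designed adapter can idle at a watt or two
adapter_count = 5
rate_per_kwh = 0.12       # dollars per kWh

annual_kwh = standby_watts_each * adapter_count * 24 * 365 / 1000.0
print(f"About {annual_kwh:.0f} kWh per year, roughly ${annual_kwh * rate_per_kwh:.2f}")

Not a fortune, but it adds up for gear that's doing nothing.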

Bob.
Chienworks wrote on 1/9/2008, 4:39 AM
Back when 15" LCD monitors were still costing $400+ we started switching to them. The break-even point due to electricity savings was only about 13 months, and the projected lifespan is about 4 times that of a CRT. It was foolish *not* to switch. Now that you can get a 20" for about $200, it's a no-brainer.
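
The arithmetic behind a break-even estimate looks roughly like this - the draws, hours, rate, and price below are generic guesses rather than our actual figures, and the payback swings a lot with monitor size, run hours, and the local rate (our own situation penciled out much faster):

# Rough payback period for replacing a CRT with an LCD.
# The power draws, run hours, rate, and price are ballpark guesses only.

crt_watts, lcd_watts = 110, 30   # typical-ish CRT vs LCD draws
hours_per_day = 24               # machines left on around the clock
rate_per_kwh = 0.15              # dollars per kWh
lcd_price = 400.0                # what a 15-inch LCD cost when we switched

monthly_saving = (crt_watts - lcd_watts) * hours_per_day * 30 / 1000.0 * rate_per_kwh
print(f"Saves about ${monthly_saving:.2f}/month; "
      f"pays for itself in roughly {lcd_price / monthly_saving:.0f} months")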

Also, turn that monitor off when you're not using it, and turn off the wall wart if it has an external power supply. I was always astounded and dismayed walking around town at night seeing offices and classrooms with dozens of CRTs showing screensavers all night. What an incredible waste. A couple of the local colleges still do that in their libraries and computer labs.