Last weekend, I was discussing server farms with my good friend Glenn. In particular, we were wondering about their energy consumption. During the week he sent me a link about Google's server farms.

Apparently Google have about 450,000 servers, globally scattered but locally clustered. According to the article, that many servers would have a power consumption of roughly 200 megawatts, which works out at something like 440 watts per server.

A watt, of course, is one joule per second. Even the biologists in my readership know that. The point is that it's a rate of energy use rather than a quantity of energy. My lightbulb says 100W on it (you can't use the energy-saving ones with the dimmer switch in my lounge, unfortunately), which by my calculations means Google are costing the planet the equivalent of about two million full-strength lightbulbs.

Is this right? Because it seems like bugger all, amounting to roughly 1/3000 of a lightbulb per human (taking the world's population as six and a half billion), and I'm sure Google must be more evil than that. You'd think it would take that much power just to monitor all the dissidents for the Chinese government. If anyone can fill in my uninformed prejudice with something more considered, this is the place to do it.
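For what it's worth, here's my back-of-the-envelope sanity check in Python. The server count and the 200 MW come from the article; the 100W is my bulb; the population figure is my own guess, not anything the article says:

```python
# Back-of-the-envelope check on the lightbulb arithmetic.
SERVERS = 450_000        # Google's server count, per the article
TOTAL_POWER_W = 200e6    # ~200 megawatts, per the article
BULB_W = 100             # one full-strength lightbulb (mine, anyway)
POPULATION = 6.5e9       # rough world population -- my guess, not the article's

watts_per_server = TOTAL_POWER_W / SERVERS    # ~440 W each
bulbs = TOTAL_POWER_W / BULB_W                # 2,000,000 bulbs
bulbs_per_person = bulbs / POPULATION         # ~1/3250 of a bulb each

print(f"{watts_per_server:.0f} W per server")
print(f"{bulbs:,.0f} lightbulbs")
print(f"1/{1 / bulbs_per_person:.0f} of a lightbulb per person")
```

So unless the article's 200 MW figure is wildly off, the per-person share really is tiny.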

Also, who else runs an equivalent level of server farms? Flickr? Yahoo? Actually, that's the same thing. Which reminds me, Sean says you should all stop using Flickr, which he describes as the McDonald's of photo galleries, and start using Gallery instead.

Also also, is it the case that server farms can basically go anywhere? Because if so, surely they could be built where it's windy, wavy or sunny. With the obvious benefits thereof.

You tell me.