The article is all about the cost of the hardware, neglecting the more significant cost: electricity.
Once you're above the baseline usage of 11 kWh/day (as any geek is), Southern California utilities charge about $0.13/kWh at the margin, once taxes, distribution, etc. are included.
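For a sense of scale, here's a minimal back-of-the-envelope sketch in Python using that $0.13/kWh figure; the 400 W continuous draw and $600 hardware price are illustrative assumptions, not numbers from the article:

    # Rough sketch: at marginal SoCal rates, the electricity bill for a
    # 24/7 mining rig quickly rivals the hardware price.
    # The 400 W draw and $600 hardware cost are assumed for illustration.

    RATE_PER_KWH = 0.13   # marginal Southern California rate quoted above ($/kWh)
    POWER_WATTS = 400     # assumed continuous draw of a GPU mining box
    HARDWARE_COST = 600   # assumed one-time hardware cost ($)

    kwh_per_year = POWER_WATTS / 1000 * 24 * 365
    electricity_per_year = kwh_per_year * RATE_PER_KWH

    print(f"Energy per year:      {kwh_per_year:.0f} kWh")
    print(f"Electricity per year: ${electricity_per_year:.0f}")
    print(f"Hardware (one-time):  ${HARDWARE_COST}")

At those assumed numbers the yearly power bill comes out to roughly $450, in the same ballpark as the hardware itself.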
This is a calculation that depends highly on who you are and where you live. I live in an area that recently had a 10%+ residential electric rate hike, to about 8 cents per kWh. That's only
slightly more expensive per BTU than heating with natural gas through a 90% efficient gas furnace, versus a 100% efficient electric heater. So the price difference for me to run any computer full tilt during the heating season, which is certainly longer than Southern California's, is about half a penny per kilowatt-hour or less. I don't know anyone who bothers to shut down their computers from September to May to save money. There's also something to be said for the soothing white noise of a (good condition) CPU fan as the beast in the corner crunching numbers keeps your bedroom a couple degrees warmer, so you can turn the house thermostat down to 69 degrees at night. I can't prove it, but I would bet that I actually
save energy doing this, because otherwise my wife would insist on turning up the heat.
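A minimal sketch of that comparison in Python: the 8 cents/kWh electric rate and the 90% furnace efficiency come from the comment above, while the gas price is an illustrative assumption chosen so the numbers line up with the stated "about half a penny" difference; plug in your own utility rates.

    # Heating-season comparison: electric resistance heat (100% efficient at
    # the outlet) vs. a 90% efficient natural gas furnace.
    # The gas price below is an assumption for illustration only.

    ELECTRIC_RATE = 0.08        # $/kWh, the rate quoted in the comment
    FURNACE_EFFICIENCY = 0.90   # gas furnace efficiency from the comment
    KWH_PER_THERM = 29.3        # 1 therm = 100,000 BTU, about 29.3 kWh
    GAS_PRICE_PER_THERM = 2.00  # assumed delivered gas price, $/therm

    # Cost to put one kWh of heat into the house by each method
    gas_heat_per_kwh = GAS_PRICE_PER_THERM / (KWH_PER_THERM * FURNACE_EFFICIENCY)
    electric_heat_per_kwh = ELECTRIC_RATE  # resistance heat: 1 kWh in = 1 kWh of heat

    # During the heating season the computer's waste heat displaces furnace
    # heat, so the *net* cost of running it is only the difference.
    net_cost = electric_heat_per_kwh - gas_heat_per_kwh
    print(f"Gas heat:      ${gas_heat_per_kwh:.3f}/kWh")
    print(f"Electric heat: ${electric_heat_per_kwh:.3f}/kWh")
    print(f"Net cost of computer heat in winter: ${net_cost:.3f}/kWh")

Under those assumptions the net cost works out to roughly half a cent per kWh, which is the figure claimed above; with cheaper gas the gap widens.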
That's interesting. If you have no screen on, then all of the energy spent is converted to heat, right? So if you are heating the place anyway, it's essentially costless?
Heh, so bitcoins will be a cold-climate manufactured good, lol.