Has anyone ever heard of a datacentre using anything other than air conditioning for cooling? I haven't.
For peace of mind I run my expensive hardware investment with an air conditioner that always keeps the room at 19°C - I can go to work, go on holidays, watch TV and be assured that my machines are at a steady temperature.
My rigs draw about 5kW and my air con can remove 7kW of heat using just over 2kW of power. Why invest big and then skimp on cooling? Cooling needs to be factored into the cost and is as important as protecting against power surges or dirty power - heat will destroy your hardware just as surely!
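To put some numbers on that: the air con moves roughly three times more heat than the electricity it draws (its coefficient of performance). Here's a quick back-of-the-envelope in Python - the electricity tariff is an assumption since I'm not quoting my bill, plug in your own rate:

```python
# Rough numbers from my setup - the tariff is an assumed placeholder.
RIG_DRAW_KW = 1.2        # per rig, at the wall
NUM_RIGS = 4
AC_COOLING_KW = 7.0      # rated cooling capacity
AC_INPUT_KW = 2.2        # electrical draw ("just over 2kW")
TARIFF = 0.15            # $/kWh - assumed, varies by region

heat_load_kw = RIG_DRAW_KW * NUM_RIGS        # ~4.8kW of heat to remove
cop = AC_COOLING_KW / AC_INPUT_KW            # coefficient of performance, ~3.2
headroom_kw = AC_COOLING_KW - heat_load_kw   # spare capacity for hot days

daily_cost = (heat_load_kw + AC_INPUT_KW) * 24 * TARIFF

print(f"Heat load: {heat_load_kw:.1f}kW, COP: {cop:.1f}, headroom: {headroom_kw:.1f}kW")
print(f"Rigs + cooling per day: ${daily_cost:.2f}")
```

The point of the headroom figure is that the AC isn't running flat out just to break even - there's over 2kW of spare capacity for a heatwave.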
With air conditioning you can't mix in separate cooling systems, as they mess with the airflow. The GPUs on my 6990s run at around 70°C or under continuously. That's with 3x 6990s in each rig running at the standard 830MHz per GPU, inside HAF X cases with a modified side panel (professionally designed and cut by an engineering company) to fit 4x 120mm Delta fans for airflow. These fans also have dust filters. Each rig draws about 1.2kW and there are four in total. Just to add: the fan speed set on the 6990s themselves is 70%, though they manage no problem at all on 60%.
If you're serious then professional is the way to go. Jobs isn't still running his company out of his garage. Business (i.e. investment) requires industry standards. Otherwise it's a joke.
I see people running custom cooling solutions who are just about keeping their cards under 85°C. What if it's a hot day and they pop up to 95°C? Do you panic? Do you call in sick to work? Do you fly back from your holidays?
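Rather than panicking, you can at least script a watchdog so you know before the cards cook. A minimal sketch, assuming AMD's Catalyst-era aticonfig tool is installed and its output format matches my cards (flags and output vary by driver version, so treat this as an assumption and adapt the parsing to what your setup actually prints):

```python
import re
import subprocess
import time

TEMP_LIMIT_C = 85    # alert threshold - pick your own comfort level
POLL_SECONDS = 60

def gpu_temps():
    """Read temperatures for all adapters via aticonfig (Catalyst-era AMD tool)."""
    out = subprocess.run(
        ["aticonfig", "--odgt", "--adapter=all"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Expected lines look like: "Temperature - 68.50 C" (assumed format)
    return [float(m) for m in re.findall(r"Temperature - ([\d.]+)", out)]

while True:
    for i, temp in enumerate(gpu_temps()):
        if temp > TEMP_LIMIT_C:
            # Hypothetical alert hook - email, SMS, or just kill the miner here.
            print(f"GPU {i} at {temp:.0f}C - over {TEMP_LIMIT_C}C, act now!")
    time.sleep(POLL_SECONDS)
```

It's no substitute for proper cooling, but it beats finding out from a dead card.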
I am also considering downclocking my GPUs rather than having them maxed out at 830MHz all the time - a few less hashes isn't going to kill me. As for overclocking, I wouldn't consider it. Cards are certainly not designed to be maxed out all the time, never mind overclocked.
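If I do go ahead with the downclock, the Catalyst driver exposes Overdrive clock control through aticonfig. A hedged sketch of how I'd script it across a rig - the target clocks are illustrative values I've picked, not tested limits, and you should check which flags your driver version accepts:

```python
import subprocess

# Hypothetical target: drop the core from the stock 830MHz to 750MHz
# and leave the 6990's memory at its stock 1250MHz.
CORE_MHZ = 750
MEM_MHZ = 1250
NUM_ADAPTERS = 6   # 3x 6990s per rig = 6 GPUs, each a separate adapter

for adapter in range(NUM_ADAPTERS):
    # Overdrive must be enabled once per adapter before clocks can change
    subprocess.run(["aticonfig", f"--adapter={adapter}", "--od-enable"],
                   check=True)
    subprocess.run(["aticonfig", f"--adapter={adapter}",
                    f"--od-setclocks={CORE_MHZ},{MEM_MHZ}"],
                   check=True)
```

Run once per rig and the whole farm is dialled back in seconds - easy to reverse if the hashrate hit turns out not to be worth it.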