I want to see how much additional energy it takes to cool per watt the rigs draw.
For example, I wish to keep my rigs at an ambient temperature of 80°F when it is 105°F outside. It almost seems to me that more energy will be spent pumping the heat out of the room than will be spent powering the rigs.
I don't remember physics that well, but I thought at least an equivalent amount of energy would be spent to remove the heat, and even more because of the huge inefficiencies in cooling?
Anyone good at physics want to answer this? Point me to where someone writes it up? The link above was more about data centers and not the direct cost-related economics as they relate to profitability (and environmental impact).
Anyone have an educated answer to this?
This is commonly expressed as PUE - Power Usage Effectiveness - in the data center world. Maybe not the most accurate yardstick, but it measures the overhead of cooling required to support the corresponding compute load - a PUE of 1.5 means that for every watt of IT load you spend roughly another half watt on cooling, i.e. about 50 cents of cooling cost for every dollar of IT electricity. If you look at a data center contract, there is a 'cooling uplift' charge - this covers the PUE and the associated electricity overhead of providing cooling.
Two-phase immersion cooling comes in below 1.1, meaning you spend less than 10% electricity overhead on cooling - several times less than a typical water- or air-cooled deployment.
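To put rough numbers on the original question, here's a minimal sketch using PUE. The rig draw (1,200 W) and electricity price ($0.10/kWh) are made-up assumptions for illustration, not figures from this thread:

```python
# Rough cooling-overhead estimate from PUE.
# PUE = total facility power / IT power, so overhead = (PUE - 1) * IT power.

RIG_WATTS = 1200          # hypothetical rig power draw
PRICE_PER_KWH = 0.10      # hypothetical electricity price, $/kWh
HOURS_PER_MONTH = 24 * 30

def cooling_overhead_watts(rig_watts, pue):
    """Extra watts spent on cooling/facility overhead for a given PUE."""
    return rig_watts * (pue - 1.0)

for label, pue in [("typical air/water cooled", 1.5), ("immersion cooled", 1.1)]:
    overhead_w = cooling_overhead_watts(RIG_WATTS, pue)
    overhead_kwh = overhead_w / 1000 * HOURS_PER_MONTH
    cost = overhead_kwh * PRICE_PER_KWH
    print(f"{label}: PUE {pue} -> {overhead_w:.0f} W overhead, "
          f"~{overhead_kwh:.0f} kWh/month, ~${cost:.2f}/month on cooling")
```

With those assumed numbers, a PUE of 1.5 means about 600 W of cooling overhead (~$43/month), versus about 120 W (~$9/month) at a PUE of 1.1 - so the answer to the original question is that cooling should cost a fraction of the rig power, not more than it.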
There are two practical ways to mitigate costs, if that's the goal: find the most efficient way to manage the concentrated heat coming off a rig, and find the cheapest electricity you can. One watt generates 3.41 BTU per hour of heat, and the issue is dealing with that heat load while ensuring as little power as possible goes to running the cooling.
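For sizing the cooling itself, the watts-to-BTU conversion gives the heat load directly. A quick sketch, again assuming a hypothetical 1,200 W rig:

```python
# Convert rig power draw to a heat load for sizing an air conditioner.
# 1 W of electrical power dissipated as heat = ~3.412 BTU/hr.

BTU_PER_HR_PER_WATT = 3.412
BTU_PER_TON = 12000  # 1 "ton" of cooling capacity = 12,000 BTU/hr

def heat_load_btu_per_hr(rig_watts):
    return rig_watts * BTU_PER_HR_PER_WATT

rig_watts = 1200  # hypothetical rig
btu_hr = heat_load_btu_per_hr(rig_watts)
print(f"{rig_watts} W rig -> ~{btu_hr:.0f} BTU/hr -> ~{btu_hr / BTU_PER_TON:.2f} tons of cooling")
```

That works out to roughly 4,100 BTU/hr, or about a third of a ton of cooling, for every 1,200 W of rig load.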
Does that help?