Board: Project Development
Re: Heating a house with old GPU's, worth it?
Post by redtwitz on 30/10/2013, 02:07:03 UTC
Quote: [...] GPU produce heat only as a (conceptually unwanted) by-product, while the heater produces heat as its main product. So GPU heat production per Watt is not comparable at all to a heater.

This is complete nonsense. Whether heat is conceptually wanted or not doesn't matter at all. A simple example is the traditional light bulb, which converts over 95% of the consumed electricity directly into heat. The visible light it produces gets converted into heat as well when it is absorbed by the bulb's surroundings (there are some exceptions), so while a light bulb is only 5% efficient as a light source, it is 100% efficient as a heater.

Energy can neither be created nor destroyed (law of conservation of energy), so all the electricity your computer consumes has to be stored or converted into some other kind of energy. In the case of a GPU, all of that energy is converted into heat in the transistors and the electric circuits. Even a video card's fans actually heat up the ...

Quote: Let's also say we want to make our heater out of 6990's, which have a TDP of 375 watts and a price of $370.
To get 17,850 watts we need ~48 6990's at $370 ea = $17,760.
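
The arithmetic behind those figures, as a quick Python sketch (the 17,850 W target, the 375 W TDP, and the $370 price are simply the numbers quoted above, not measurements of mine):

import math

# Figures quoted above; treat them as assumptions.
target_heat_w = 17850        # desired heat output in watts
tdp_per_6990_w = 375         # quoted TDP of one HD 6990
price_per_6990_usd = 370     # quoted price per card

cards_needed = math.ceil(target_heat_w / tdp_per_6990_w)    # 48 cards
total_cost_usd = cards_needed * price_per_6990_usd          # $17,760

print(cards_needed, total_cost_usd)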

To calculate how much heat every 6990 in the setup will produce, you have to consider how much power the computers containing them will draw from the wall. PSUs aren't 100% efficient as power converters; the "lost" energy is converted into heat as well.

Assuming 65 W of power consumption outside the video cards, we get a total of 1565 W for a computer holding four 6990s. If the PSU operates at 85% efficiency, the whole setup will draw approximately 1840 W from the wall, i.e. about 460 W per video card.
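
If you want to rerun that estimate with your own numbers, here is the same calculation as a short Python sketch; the 65 W overhead, 85% PSU efficiency, and four cards per machine are the assumptions stated above, not universal values:

# Assumptions from the paragraph above; adjust for your own hardware.
tdp_per_card_w = 375       # HD 6990 TDP
cards_per_box = 4
other_load_w = 65          # CPU, motherboard, drives, fans
psu_efficiency = 0.85

dc_load_w = cards_per_box * tdp_per_card_w + other_load_w   # 1565 W
wall_draw_w = dc_load_w / psu_efficiency                    # ~1841 W
heat_per_card_w = wall_draw_w / cards_per_box               # ~460 W

print(round(wall_draw_w), round(heat_per_card_w))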