Typo. I was saying at worst the PSU was 90% efficient. Still, the efficiency curve on a modern PSU is very flat, almost a horizontal line.
That's misleading.
The difference between 50% load and 80% load is usually 1%.
Between 50% and 80%, maybe, but look at the charts referenced in this thread a few posts back (not posted by me, so I'm not cherry-picking).
http://www.techpowerup.com/reviews/CoolerMaster/V850/5.html
If you go up to 100% load you are down about 2% from the peak at 40% or 50%, not 1%. But this is again somewhat misleading, see below.
It isn't going to run significantly cooler, last significantly longer, or save any meaningful amount of power.
It will save about 2% compared to running at full load, or about 1% compared to running at 80%. Even running at 80% is better than 100%, though. If you are pulling 1000W 24/7 and paying 5c/kWh, then the difference in power alone is about $8/year, ignoring the power supply itself running cooler. If you are paying 15c/kWh, it's roughly $24/year.
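A rough back-of-the-envelope sketch of where those numbers come from (this assumes the 1000W is the wall draw and applies the 2% difference directly to it; the function name is just for illustration):

```python
HOURS_PER_YEAR = 24 * 365  # continuous (24/7) operation

def annual_cost_delta(draw_watts, efficiency_delta, price_per_kwh):
    """Extra electricity cost per year from a small efficiency difference.

    draw_watts       -- continuous wall draw in watts
    efficiency_delta -- efficiency difference as a fraction, e.g. 0.02 for 2%
    price_per_kwh    -- electricity price in dollars per kWh
    """
    extra_kwh = (draw_watts / 1000) * HOURS_PER_YEAR * efficiency_delta
    return extra_kwh * price_per_kwh

# 1000 W around the clock, 2% efficiency difference
print(annual_cost_delta(1000, 0.02, 0.05))  # ~8.8 dollars/year at 5 c/kWh
print(annual_cost_delta(1000, 0.02, 0.15))  # ~26 dollars/year at 15 c/kWh
```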
Now when it comes to heat produced within the power supply, that ALL comes from the losses. If a power supply is 90% efficient, that is 10% losses. At 88% efficiency that is 12% losses, so 20% more heat, not just 2%. That's definitely significant for the temperature and cooling of the power supply.
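Same arithmetic in a quick sketch, again treating 1000W as the wall draw for simplicity (the numbers scale with whatever draw you plug in):

```python
def psu_heat_watts(wall_draw_watts, efficiency):
    # Everything not delivered to the system is dissipated as heat inside the PSU.
    return wall_draw_watts * (1 - efficiency)

heat_90 = psu_heat_watts(1000, 0.90)  # 100 W of heat at 90% efficiency
heat_88 = psu_heat_watts(1000, 0.88)  # 120 W of heat at 88% efficiency
print((heat_88 - heat_90) / heat_90 * 100)  # 20.0 -> 20% more heat inside the PSU
```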
I wouldn't pay a lot more for a bigger power supply, but I would definitely pay somewhat more.