Topic: Re: ASICMINER: Entering the Future of ASIC Mining by Inventing It
Board: Securities
by raskul on 28/04/2014, 18:08:23 UTC
Quote
Everyone else understands that 20nm, if ever they get it working and yields become reasonable, should provide lower production cost per GH as well as better power efficiency

I'm dense? You're repeating the same useless argument over and over: that they "could" increase efficiency but haven't.

If they can't achieve less than 0.5 W/GH while still staying below $0.20/GH in wafer cost, then it is meaningless. And I doubt they can, or they would have done it already.
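For what the $/GH-versus-W/GH trade actually amounts to, here is a back-of-the-envelope sketch; the electricity price, lifetime, and both chip profiles are invented numbers, not ASICMINER or Nvidia figures.

Code:
# Rough lifetime cost per GH/s: capital (wafer) cost plus electricity.
# Every figure below is an illustrative assumption, not real chip data.

KWH_PRICE = 0.10            # USD per kWh (assumed)
LIFETIME_HOURS = 24 * 365   # one year of continuous hashing (assumed)

def lifetime_cost_per_gh(wafer_cost_per_gh, watts_per_gh):
    """Capital cost plus power cost to run 1 GH/s for LIFETIME_HOURS."""
    power_cost = watts_per_gh * LIFETIME_HOURS * KWH_PRICE / 1000.0
    return wafer_cost_per_gh + power_cost

# Hypothetical 28nm-class chip: cheap wafer, worse efficiency.
print(lifetime_cost_per_gh(0.20, 0.8))   # ~0.90 USD per GH/s
# Hypothetical 20nm-class chip: dearer wafer, better efficiency.
print(lifetime_cost_per_gh(0.40, 0.4))   # ~0.75 USD per GH/s

Under these invented numbers the pricier but more efficient chip comes out ahead after a year of hashing, which is the trade both sides are arguing about; the reply's point is that nobody has yet demonstrated the efficiency half of it.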

And does "everyone" include Nvidia? They seem to think 20nm wafer costs are too high.

Quote
simply by lowering clocks and voltages from near the top of the schmoo plot to somewhere lower. The same will work for your GPU, for your CPU (both of which will in fact do this automatically when mostly idle)

You seem to think lowering voltage guarantees higher efficiency. Show me some proof. Has it ever occurred to you that chip manufacturers may have already set the voltage as low as it can go before diminishing returns set in?
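A minimal sketch of the first-order CMOS model behind both claims, assuming dynamic power scales as C·V²·f plus a fixed leakage term; every constant here, including the f_max fit, is made up for illustration.

Code:
# First-order CMOS power model: why undervolting helps, and where it stops.
# All constants are illustrative guesses, not measured ASIC data.

def energy_per_gh(v, f_max_at_v, c_eff=1.0, leak_w=0.05):
    """Joules per GH at supply voltage v, clocked at the max stable speed.

    Dynamic power goes as C * V^2 * f, and hashrate goes as f, so the
    dynamic part of J/GH scales with V^2 and falls as voltage drops. The
    fixed leakage term is divided by a shrinking hashrate, which is the
    "diminishing returns": below some voltage, leakage per hash dominates.
    """
    f = f_max_at_v(v)               # max stable clock at this voltage, GH/s
    dynamic_w = c_eff * v**2 * f    # switching power
    return (dynamic_w + leak_w) / f

# Assume the max stable clock falls roughly linearly with voltage
# (the edge of the schmoo plot); purely a hypothetical fit.
f_max = lambda v: 2.0 * (v - 0.4)

for v in (1.0, 0.9, 0.8, 0.7, 0.6, 0.5):
    print(f"{v:.1f} V -> {energy_per_gh(v, f_max):.3f} J/GH")

In this toy model, J/GH keeps falling as voltage drops (from about 1.04 at 1.0 V down to about 0.49 at 0.6 V) and then turns back up at 0.5 V once leakage per hash dominates, consistent with both the quoted claim and the objection that vendors may already ship near that optimum.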

You repeat over and over that it has to work because "physics", and yet in reality not a single person with an ASIC has achieved less than 0.5 W/GH at the chip level.

Why? Laziness?

dude, you really could cause an argument in an empty room. chill down.