Here are the specs:
https://bitcointalk.org/index.php?topic=495357.0. Again, you need evidence that lowering the voltage to increase efficiency is even possible; just because it works with CPUs/GPUs doesn't necessarily mean it has to work with Bitcoin ASICs. I have yet to see Bitmain/HashFast claim anything below 0.6 W/GH, which they would happily do if it were possible. I assume they have already tested the chips to find the maximum efficiency so they can advertise it. Why would they not?
Maybe if you tried reading a bit more carefully, I wouldn't have to repeat myself over and over. How many times did I explain that to increase power efficiency you have to lower the voltage? Didn't I specifically say "Got a link showing that power efficiency does not increase quadratically with voltage?" Now, where in your first link does it show they changed the vcore?
Nowhere. For whatever reason, that poster only changed the clock. Whether he didn't have access to the vcore settings, didn't bother trying, or a firmware or PCB issue prevented him from changing it, I don't know and I don't care. In no way does it refute my "theory".
As for Bitmain, I never said they could achieve <0.6 W/GH on a 55nm design. Given that they already claim 0.68 W/GH at the chip level, they probably can, but I only gave them as an example of doubling power efficiency without so much as a chip revision, simply by lowering clocks and voltages from near the top of the shmoo plot to somewhere lower. The same will work for your GPU and your CPU (both of which will in fact do this automatically when mostly idle), and for pretty much any ASIC with a programmable clock ever created, because it's a direct result of effects inherent to CMOS technology combined with Ohm's Law. If you don't believe me, see if I care.
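To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch, assuming the textbook CMOS dynamic-power model P ≈ C·V²·f. The capacitance and hashrate-per-clock numbers below are made up purely for illustration, not taken from any datasheet:

```python
# Sketch of the undervolting argument, assuming dynamic switching power
# P = C_eff * V^2 * f. All constants are illustrative, not measured data.

def power_watts(c_eff, vcore, freq_hz):
    """Dynamic switching power: P = C_eff * V^2 * f."""
    return c_eff * vcore ** 2 * freq_hz

def efficiency_j_per_gh(c_eff, vcore, freq_hz, gh_per_hz):
    """Joules per GH = power / hashrate; hashrate scales linearly with clock."""
    hashrate_gh = gh_per_hz * freq_hz
    return power_watts(c_eff, vcore, freq_hz) / hashrate_gh

# Hypothetical chip parameters (made up):
C_EFF = 2.0e-9      # effective switched capacitance [F]
GH_PER_HZ = 8e-9    # hashrate per clock [GH/s per Hz]

# Operating point near the top of the shmoo plot...
stock = efficiency_j_per_gh(C_EFF, vcore=0.90, freq_hz=400e6, gh_per_hz=GH_PER_HZ)
# ...versus the same silicon underclocked and undervolted.
tuned = efficiency_j_per_gh(C_EFF, vcore=0.65, freq_hz=250e6, gh_per_hz=GH_PER_HZ)

print(f"stock: {stock:.3f} J/GH, tuned: {tuned:.3f} J/GH")
# The clock cancels out of J/GH, so energy per hash tracks V^2:
# (0.65 / 0.90)^2 ~= 0.52, i.e. dropping the vcore alone roughly halves J/GH.
```

This ignores leakage and the fact that a lower clock is usually needed to keep the chip stable at a lower voltage, but the quadratic V² term is why underclocking plus undervolting buys efficiency instead of just less hashrate.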
I understand the difference very clearly. What I don't understand is why KnC would RUSH to the newest node size (spending more than necessary simply to be first) when they could simply lower the voltage and save millions. Wouldn't it make more sense to wait until 20nm is cheaper, since production cost is nowhere near a limiting factor as of now? The only reason I can think of for doing this is that they are limited to 0.6 W/GH (at a cost-effective $/GH).
My god, you are dense. You are the only one who ever claimed that the only reason KnC is moving to 20nm is to achieve <0.6 W/GH. Everyone else understands that 20nm, if they ever get it working and yields become reasonable, should provide lower production cost per GH as well as better power efficiency.