You seem to think lowering voltage can guarantee higher efficiency. Show me some proof. Has it ever occurred to you that chip manufacturers may have already set the voltage as low as it can usefully go before diminishing returns set in?
You repeat over and over that it has to work because of "physics", and yet in reality not a single person with an ASIC has achieved less than 0.5 W/GH at the chip level.
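The "physics" being invoked here is the standard CMOS dynamic-power relation, roughly P ≈ C·V²·f; hashrate tracks f, so energy per hash falls roughly with V² as voltage and clock come down together, until timing margins and leakage take over. Here is a toy Python sketch of that relation; every constant is invented for illustration and describes no real chip:

    # Toy model of why undervolting/underclocking can lower W/GH.
    # All constants are invented; this is not data from any real ASIC.
    C_EFF = 3.0e-9           # effective switched capacitance per chip, F (made up)
    I_LEAK = 0.05            # leakage current, A (made up)
    HASHES_PER_CYCLE = 4.0   # hashes completed per clock cycle (made up)

    def watts_per_gh(voltage, freq_hz):
        """Power per GH/s for the toy chip at a given voltage/frequency point."""
        p_dyn = C_EFF * voltage ** 2 * freq_hz   # switching power, W
        p_leak = I_LEAK * voltage                # static power, W
        hashrate_gh = freq_hz * HASHES_PER_CYCLE / 1e9
        return (p_dyn + p_leak) / hashrate_gh

    # Assume the clock must scale down roughly with voltage to keep timing.
    for v in (1.1, 1.0, 0.9, 0.8, 0.7):
        f = 400e6 * (v / 1.1)
        print(f"{v:.2f} V, {f / 1e6:4.0f} MHz -> {watts_per_gh(v, f):.2f} W/GH")

In that toy curve W/GH keeps dropping as you back off voltage and clock together; whether a given real chip still has that headroom at its shipped settings is exactly the open question.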
Here's some proof. Oh, and BTW, our 55 nm chip achieved that "less than 0.5 W/GHash" back in June 2013.

Well, that's about as solid as proof gets. Thanks for proving me wrong with actual data rather than just theory.
0.38 W/GH is incredibly impressive, but what I want to know is whether it's still cost-effective at only 1 GH/s per chip.
And puppet, this in no way validates your claim that any and all ASIC manufacturers can push efficiency beyond what was advertised. Bitfury is a rare exception, and even then the 0.38 W/GH figure was advertised; I was just unaware of it.
Please note that it seems that 96 MHz is close to _BEST_ solution
This leads me to believe that even Bitfury is already toward the deep end of the shmoo plot.
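There is a reason a sweet spot like that exists: clock too low and fixed/leakage power is spread over fewer hashes, clock too high and the minimum stable voltage climbs so the V² term dominates. A toy sweep in the same spirit, with constants invented and deliberately tuned so the minimum lands near the quoted ~96 MHz (this is not Bitfury's real curve):

    # Toy sweep over clock speeds to find the most efficient operating point,
    # i.e. the sweet spot you would read off a real shmoo plot.
    def min_voltage(freq_mhz):
        """Assumed minimum stable voltage (V) for a given clock; made up."""
        return 0.60 + 0.004 * freq_mhz

    def w_per_gh(freq_mhz):
        v = min_voltage(freq_mhz)
        p_dyn = 0.0017 * v ** 2 * freq_mhz   # dynamic power, W (invented constant)
        p_fixed = 0.21 * v                   # leakage + overhead, W (invented)
        hashrate_gh = 0.01 * freq_mhz        # GH/s per chip (invented scaling)
        return (p_dyn + p_fixed) / hashrate_gh

    best = min(range(40, 301), key=w_per_gh)
    print(f"toy-model sweet spot: {best} MHz at {w_per_gh(best):.2f} W/GH")

If a vendor already ships its chips near that minimum, further undervolting buys very little, which is the "deep end of the shmoo plot" worry.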
How do we know other manufacturers are not?
And even if all manufacturers can achieve below 0.38 W/GH, can they do it cost-effectively?
Look at Bitmain: the S1 is ~2 W/GH while the S2 is ~1 W/GH, achieved by running each chip at about 60% of the power. Presumably less than 1 W/GH could be done, but it would require many more chips, which is cost-prohibitive for the time being.
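That trade-off is easy to put in back-of-envelope form: a more efficient (slower) operating point means each chip hashes less, so the same total hashrate needs more chips, and the extra silicon has to be paid back by the power savings. A rough Python sketch with invented prices (not Bitmain's actual costs):

    # Back-of-envelope: total cost of hitting a target hashrate at two
    # operating points.  Chip price, power price, and per-chip figures are
    # all invented; only the structure of the comparison matters.
    TARGET_GH = 1000.0     # build a 1 TH/s miner
    CHIP_PRICE = 5.0       # $ per chip (made up)
    POWER_PRICE = 0.10     # $ per kWh (made up)
    HOURS = 24 * 365       # one year of runtime

    def total_cost(gh_per_chip, w_per_gh):
        chips = TARGET_GH / gh_per_chip
        watts = TARGET_GH * w_per_gh
        return chips * CHIP_PRICE + (watts / 1000) * HOURS * POWER_PRICE

    # "Stock" point vs. a slower, more efficient one (numbers loosely patterned
    # on the S1-vs-S2 difference described above, but still invented).
    print("stock  $", round(total_cost(gh_per_chip=2.0, w_per_gh=2.0)))
    print("slower $", round(total_cost(gh_per_chip=1.2, w_per_gh=1.0)))

With numbers like these the slower build spends more on extra chips than it saves on a year of electricity, which is the "cost-prohibitive for the time being" point; cheaper chips or more expensive power would flip that conclusion.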