I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be the best, except in Ethereum.
Probably, but it's hard to gauge by how much. I think the number is minuscule, and it's nice not to get locked into 3-4GB down the line, since I'm not planning on selling 1xxx series cards for well over a year.
I could only find estimates on memory power consumption which vary wildly:
http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)
Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4GB and 34.8 W for 8GB), but the same source says it's closer to 20 W or slightly more for 8GB.
There are also much higher numbers, like 50W for an R9 290X.
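Rough back-of-the-envelope, just plugging the per-GB figure quoted above into a small Python snippet; the 4.35 W/GB number is only an estimate and real draw depends on clocks, voltage, and memory vendor:

# Sketch of a per-gigabyte memory power model using the estimate quoted above.
# 4.35 W/GB is an assumed figure, not a measured one.
WATTS_PER_GB = 4.35

for card, gb in [("GTX 1060 3GB", 3), ("GTX 1060 6GB", 6), ("GTX 1070 8GB", 8)]:
    print(f"{card}: ~{gb * WATTS_PER_GB:.1f} W for memory")
# Prints roughly 13 W, 26 W and 35 W respectively under this model.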
Isn't it watts per die instead of, e.g., per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.
It would be per die... The amount of power the memory consumes is pretty small compared to the card as a whole. Some people also forget that Ethereum is not the only coin in existence, and building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.
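If it really scales per die rather than per GB, the 1060 vs 1070 gap looks different. A rough sketch, assuming the typical chip counts (6 chips on a 192-bit 1060, 8 chips on a 256-bit 1070) and a made-up per-die wattage, since nobody in this thread has a hard number for it:

# Per-die model instead of per-GB. Chip counts assume the usual bus widths;
# WATTS_PER_DIE is a hypothetical placeholder for illustration only.
WATTS_PER_DIE = 2.5

cards = {"GTX 1060 3GB": 6, "GTX 1060 6GB": 6, "GTX 1070 8GB": 8}
for card, dies in cards.items():
    print(f"{card}: {dies} chips -> ~{dies * WATTS_PER_DIE:.1f} W")
# Under a per-die model the 3GB and 6GB 1060 draw the same memory power,
# and the 1070 only adds two more chips rather than nearly tripling the wattage.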
Nvidia is also not responsible for which memory vendor a card uses. That's up to board partners such as MSI/Gigabyte/Asus.
Ethereum still provides the largest share of mining revenue by far; without it we can all throw our cards away. Also, 30 W/card would not be negligible.
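For scale, here's what an extra 30 W per card works out to over a year; the electricity price is just an assumed example, plug in your own rate:

# Quick arithmetic for the "30 W/card is not negligible" point.
EXTRA_WATTS = 30
PRICE_PER_KWH = 0.10  # assumed electricity price in USD, for illustration

kwh_per_year = EXTRA_WATTS / 1000 * 24 * 365   # ~263 kWh per card per year
cost_per_year = kwh_per_year * PRICE_PER_KWH   # ~$26 per card per year at $0.10/kWh
print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.0f}/card/year")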
Only NVIDIA pulls this memory-switching shit every generation; your argument fails because AMD and its AIB partners don't do this.
Ethereum is going away around January as it shifts to PoS.
Only Nvidia switches memory? lol... When I mined with AMD I had three different memory manufacturers (Hynix, Elpida, and Samsung) across four different card manufacturers, sometimes changing within the same cards and models. They do it because they find a partner that can source memory cheaper or on better terms for them. Mining is a small drop in the bucket when it comes to the GPU market as a whole.
As I mentioned, AMD/Nvidia have nothing to do with who the memory is sourced from, only the type that is used (GDDR5/X, HBM). That's all on the card manufacturers.