If you're worth your salt as a miner, you should be dual mining (which you aren't doing).
Lol. Dual mining is not always the best way to go. You're pushing your personal opinion like it's fact, the same way you're doing with your "Polaris are better cards than the GTX 1070" claim.

It's not as black and white as you're trying to paint it. Ideally we'd all have our own warehouses with virtually unlimited power and cooling, and then sure, dual mining it is. But in the real world (where most miners live, including the ones that are "worth their salt"), there's always a power limit. For me, there are a couple of locations I can put my rigs in "for free", but when I run out of space/power in those, I can only rent apartments and put new rigs there. Renting is the cheapest way to add more rigs in my area, but it's still quite an expense.

Cards run a lot hotter and louder in dual mining and consume significantly more power, and it's just not worth it for me. I can run ~60 cards in dual mining and hit my power limit per location, or I can run ~80-90 cards mining ETH only and hit the same power limit in the same location. The second scenario yields more BTC/day. I'd rather invest in extra GPUs and run them ETH-only than dual mine; it's more profitable in my case, especially at the current crazy-high crypto exchange rates.
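The power-cap trade-off described above can be sketched as a quick calculation. All per-card figures below (watts, $/day) are hypothetical placeholders, not measured numbers; only the ~60 vs ~85 card counts echo the split from the post.

```python
# Back-of-envelope sketch of the power-cap argument: under a fixed per-location
# power limit, more (but more efficient) ETH-only cards can outearn fewer
# dual-mining cards. All per-card numbers are hypothetical, chosen only to
# illustrate the shape of the trade-off.
POWER_CAP_W = 12000  # assumed per-location power limit

DUAL = {"watts": 200, "usd_per_day": 3.0}      # ETH+SIA dual mining, per card
ETH_ONLY = {"watts": 140, "usd_per_day": 2.5}  # ETH-only, per card

def location_totals(card):
    cards = POWER_CAP_W // card["watts"]  # how many cards fit under the cap
    return cards, cards * card["usd_per_day"]

print(location_totals(DUAL))      # (60, 180.0)
print(location_totals(ETH_ONLY))  # (85, 212.5)
```

With these (made-up) per-card numbers, the ETH-only location comes out ahead despite each card earning less, which is exactly the argument being made.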
Personal opinion...? Neither a 580 nor a 1070 makes more mining Ethereum straight than it does dual mining (power costs included).
I'm not painting it as black and white... lol, hence why we're talking about coin volume, market stability, emission, and saturation.
You're trying to argue power costs. A 580 and a 1070 use about the same amount of power while dual mining. Furthermore, you're comparing roughly $0.10/day in extra power (at $0.10/kWh) against an additional ~$1.30/day of revenue (ETH+SIA) while dual mining. Electricity costs stopped mattering this past spring, relatively speaking. They may matter again in the future, but as of right now you're literally taking a ~$1.20/day loss per card because you don't want to spend more on power. I'm sure since you're a miner worth his own weight you've already looked into this.
You'd have to have extremely expensive power to justify not dual mining. Either way, a 1070 uses just about the same amount of power as a 580 while dual mining (undervolting reduces power draw on 580s as well). The only algo a 1070 can mine while reducing clock speed (i.e., TDP) and come out ahead without a proportional hashrate reduction is Equihash, and even then power costs in no way justify it.
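To make the per-card arithmetic above concrete, here's the math as a tiny sketch; the ~$1.30/day and ~$0.10/day figures are the ones quoted in the post, not measurements.

```python
# Per-card, per-day dual-mining math using the figures quoted above.
extra_revenue = 1.30     # extra $/day from the second coin (SIA) while dual mining
extra_power_cost = 0.10  # extra $/day in electricity at ~$0.10/kWh

# What you leave on the table per card, per day, by refusing to dual mine:
net_gain = round(extra_revenue - extra_power_cost, 2)
print(net_gain)  # 1.2
```

At that rate, power prices would have to be an order of magnitude higher before the extra draw eats the extra revenue, which is the "extremely expensive power" point.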
Yeah, you're arguing cooling. If you're using AC, this conversation is done.
Look into small warehouses in cheap parts of town. Your solution of 'filling rental units' with miners is completely asinine and not something most people will encounter. Right along the same lines, here's a better idea which is still stupid: rent a house with a 200 amp service. Most newer houses (80s forward) will have one. Once again, it doesn't really matter, as power usage is very similar and your scenario applies to both AMD and Nvidia.
There is no efficiency benefit to using Nvidia over AMD. MAYBE if you're comparing a 1060 to a 580 while mining, sure... But we aren't, because once again the price classes are completely different for mining compared to gaming. A 1070 will make less than a 580, cost more, and use about the same amount of power. There is literally no other way to break it down: Nvidia is absolutely a terrible choice. The only people disagreeing are those who have no experience with both types of hardware, no experience mining multiple cryptos, or are stuck on the 'AMD has to be terrible at power efficiency' line.
And no, they aren't 'a lot hotter and louder'. That's written by someone with no experience dual mining; you're basing it off what you read in Claymore's thread, from others who are also regurgitating what they heard without actually taking the time to test it themselves. And then it's 'oh, my 22 AWG wire connectors melted, Claymore is at fault, never dual mining again!'. This is why most of you aren't worth talking to.
I'm almost 100% sure this is fixed on GCN 2 and higher cards. I have not seen this with the Claymore miner on my AMD rigs; they're still mining at pretty much exactly the rate they did when they first came out. If you're getting subpar hashrate, either your memory speed is lower, you have a bad memory type, or you haven't tweaked your latency properly. Hawaii is GCN 2, Polaris is GCN 4.
If it isn't fixed, do you have a source besides 'you can see it with claymore now' which isn't observable?
You are either lazy or stupid. This is the biggest news in mining, discussed and admitted everywhere. Just use the --benchmark option in Claymore and you'll see that Polaris cards are losing 0.1-0.2 MH/s with each new DAG. This started 2 epochs ago.
Nice source. That's sarcasm: you literally didn't list one, and cited 'common knowledge' for something I'm not finding sources on. Cards do lose hashrate per epoch; all of them do. That comes from dealing with bigger and bigger DAGs each epoch; it's nothing exclusive to AMD.
But since it's 'discussed and admitted' everywhere I'm sure you can easily find me a source for this, right? XD
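For context on the per-epoch slowdown both sides are arguing about: the Ethash DAG every card reads from grows each epoch, so memory accesses spread over a larger buffer over time. A rough sketch using the standard Ethash constants; note the real spec rounds sizes down to a prime number of entries, so these are approximations.

```python
# Approximate Ethash DAG size by epoch; the exact spec prime-rounds these values.
DATASET_BYTES_INIT = 2**30    # ~1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # ~8 MiB added each epoch
EPOCH_LENGTH = 30000          # blocks per epoch

def approx_dag_gib(block_number):
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + epoch * DATASET_BYTES_GROWTH) / 2**30

print(round(approx_dag_gib(0), 2))          # 1.0 GiB at launch
print(round(approx_dag_gib(4_200_000), 2))  # 2.09 GiB around epoch 140
```

This growth affects every card; whether Polaris degrades *faster* than other architectures past some DAG size is the separate, unsourced claim being disputed above.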