Hello, this is GPGPU, not gaming. You have a common algorithm, not some binary blob that vendors create application profiles for, introducing all kinds of tricks to improve performance. No miracles and no marketing gibberish, just simple maths: 1536 shaders clocked at 1008 MHz versus 2048 shaders clocked at 925 MHz. With GCN, both architectures are now scalar, and comparisons are even easier because bitcoin uses a simple algorithm that is extremely ALU-bound: it is not memory-intensive and does not involve much branching. In the best case, where Nvidia implemented bitwise rotations and bitselect, the performance difference would be ~22% in favor of the 7970, and that would likely require a rewrite of the Nvidia miners as well. As far as mining is concerned, and assuming the TDP and prices announced so far are correct, there is no way the 680 becomes a better alternative to the 7970. Not even close.
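A back-of-envelope sketch of where that ~22% comes from (raw shader count × clock, nothing more), plus the identity behind the bitselect point: SHA-256's Ch(x, y, z) selects bits, so it maps to a single bitselect/BFI-style instruction. The helper names below are illustrative, not from any miner's source.

```python
import random

# Raw single-issue ALU rate = shader count x core clock (MHz).
hd7970 = 2048 * 925    # GCN Tahiti: 2048 shaders @ 925 MHz
gtx680 = 1536 * 1008   # Kepler GK104: 1536 shaders @ 1008 MHz

ratio = hd7970 / gtx680
print(f"7970 / 680 raw ALU ratio: {ratio:.3f}")  # ~1.22, i.e. the ~22% figure above

# SHA-256's choice function Ch(x, y, z) = (x & y) ^ (~x & z)
# picks bits of y where x is 1 and bits of z where x is 0 --
# exactly what a hardware bitselect instruction does in one op.
def ch(x, y, z):
    return ((x & y) ^ (~x & z)) & 0xFFFFFFFF

def bitselect(mask, a, b):
    # bits of a where mask=1, bits of b where mask=0
    return ((a & mask) | (b & ~mask)) & 0xFFFFFFFF

random.seed(0)
for _ in range(100):
    x, y, z = (random.getrandbits(32) for _ in range(3))
    assert ch(x, y, z) == bitselect(x, y, z)
```

The two terms in Ch are disjoint (one is masked by x, the other by ~x), which is why the XOR can be replaced by an OR and the whole thing collapses into one select.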
What about the card that is supposed to have 2304 shaders?
I don't think the GTX 680 is the flagship. If it is, then DAMN, Nvidia screwed me again.
I was really hoping to go AMD-free this time, but it seems like a no-go if the GTX 680 is all they have to show for Kepler.
EDIT:
http://www.legitreviews.com/news/12673/ Looks like there won't be an Nvidia dual-GPU monster with 4608 shaders.

Also worth noting are the single-precision TFLOPS figures stated so far ...
So it seems like a hard choice between the GTX 685 and the 7990, because the Nvidia dual-GPU card will only have 3072 shaders.
Maybe this year AMD = best dual GPU with the 7990 (4096 shaders),
Nvidia = best single GPU with the GTX 685 (3072 shaders).
All this for mining purposes. Am I mad or what?
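Comparing those rumored dual-GPU parts by shader count alone works out as below. This is only a rough sketch: both cards are unannounced, their clocks are unknown, and the shader counts are rumors from the thread above, so the ratio ignores clock speed entirely.

```python
# Rumored dual-GPU shader counts from the discussion above (clocks unknown).
dual_7990 = 2 * 2048   # rumored HD 7990: 2x Tahiti  = 4096 shaders
dual_nv   = 2 * 1536   # rumored Nvidia dual: 2x GK104 = 3072 shaders

print(dual_7990 / dual_nv)  # 4096/3072 = 1.333..., ~33% more raw shaders for AMD
```

By raw shader count the rumored 7990 would lead by about a third, but until clocks and TDP are known this says nothing about actual hashrate per watt or per dollar.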
