In keeping up with this thread, I have seen two or three posts saying that shares/minute is a better indicator of performance than MHash/second. However, I don't understand this argument.
I understand that higher MHash/s doesn't do any good if the GPU is unstable and shares aren't found or are rejected, but assuming the GPU is kept stable, doesn't shares/minute vary with luck and difficulty, while MHash/s reflects what is actually being processed regardless of luck or difficulty?
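To make that concrete, here's the mental model I'm working from: a difficulty-1 share takes about 2^32 hashes on average to find, so the expected share rate follows directly from the hashrate. A rough Python sketch (the difficulty-1 assumption is mine; pools may hand out higher-difficulty shares):

# Expected shares/minute from raw hashrate, assuming each difficulty-1
# share takes ~2**32 hashes on average to find.
def expected_shares_per_minute(mhash_per_sec, share_difficulty=1):
    hashes_per_sec = mhash_per_sec * 1e6
    shares_per_sec = hashes_per_sec / (share_difficulty * 2**32)
    return shares_per_sec * 60

print(expected_shares_per_minute(310))  # ~4.33
print(expected_shares_per_minute(320))  # ~4.47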
IOW, when a person goes from 310 MHash/s to 320 MHash/s and shares/minute drops from 4.15 to 4.10, it seems logical to me to assume that the 320 MHash/s is better and the change in shares/minute is luck/difficulty related (at least when comparing two relatively short runs [less than a day for sure; a couple of days? I'm not sure]).
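Here's my rough back-of-the-envelope for why I lean that way (my own numbers, assuming share arrivals behave roughly like a Poisson process):

import math

# Over a one-hour run at ~4.33 expected shares/min (roughly 310 MHash/s
# with difficulty-1 shares), you expect ~260 shares with a standard
# deviation of sqrt(260) ~ 16, i.e. about +/-0.27 shares/min of pure
# luck. That noise is bigger than the ~0.14 shares/min you'd gain by
# going from 310 to 320 MHash/s.
minutes = 60
rate = 4.33
expected = rate * minutes
sigma = math.sqrt(expected)
print(expected, sigma, sigma / minutes)

So over an hour or even a few hours, the luck term can easily hide a 10 MHash/s change; only over much longer runs would I expect shares/minute to settle toward the hashrate.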
I'm not saying shares/minute isn't a good thing to pay attention to as well, though. Obviously, if the same scenario involved going from 280 MHash/s to 330 MHash/s and suddenly getting 3 shares/minute instead of 4, that would point to a problem with the miner or with stability.
That said, am I misinterpreting something, or is the suggestion that shares/minute is more important than MHash/s oversimplified?
The reason for taking shares/min into account rather than just MHash/s is that MHash/s is only part of the equation. You also have to take accuracy and efficiency into account. You can compute hashes all day, but unless you're computing the right ones, it's not going to do you any good. So cgminer found a way to maintain a decent hashing speed while also making sure that the hashes it computes are actually worth a darn. See the full picture?
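Put another way: what actually pays you is the hashrate implied by your accepted shares, not the number the miner prints. A rough illustration of what I mean (my own sketch, not how cgminer reports its stats; assumes difficulty-1 shares):

# Effective hashrate implied by accepted shares over a run, assuming
# each difficulty-1 share represents ~2**32 hashes of useful work.
def effective_mhash(accepted_shares, runtime_minutes, share_difficulty=1):
    hashes = accepted_shares * share_difficulty * 2**32
    return hashes / (runtime_minutes * 60) / 1e6

# 4.1 accepted shares/min sustained for an hour works out to roughly
# 294 MHash/s of useful work, even if the card claims 320 MHash/s.
print(effective_mhash(4.1 * 60, 60))

If rejects, stales, or instability eat into your accepted shares, that effective number drops no matter what the raw MHash/s says.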