Has anyone calculated the marginal watts per additional GH/s?
Say you clock from chip_freq 393.75 (202 GH/s) to chip_freq 400 (205 GH/s).
The increase is 3 GH/s. If this draws 12 W more, then the marginal cost is 12/3 = 4 W per extra GH/s. Hardly profitable (it depends on your $/kWh).
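To make the arithmetic explicit, here is a small Python sketch. The 202 to 205 GH/s step and the 12 W delta are the example figures above; the electricity price and the revenue per GH/s per day are assumed placeholders, so plug in your own measurements.

# Rough check of whether an overclock step pays for itself.
# Power delta, electricity price, and revenue figures are examples/placeholders.

def marginal_w_per_ghs(delta_power_w, delta_hashrate_ghs):
    """Extra watts drawn per extra GH/s gained by the overclock step."""
    return delta_power_w / delta_hashrate_ghs

def extra_hashrate_pays_off(w_per_ghs, usd_per_kwh, revenue_usd_per_ghs_day):
    """True if one extra GH/s earns more per day than its power costs."""
    power_cost_per_ghs_day = w_per_ghs * 24 / 1000 * usd_per_kwh
    return revenue_usd_per_ghs_day > power_cost_per_ghs_day

m = marginal_w_per_ghs(12, 205 - 202)          # 12 W / 3 GH/s = 4 W per GH/s
print("marginal efficiency: %.1f W/GH/s" % m)
print("pays off:", extra_hashrate_pays_off(m, usd_per_kwh=0.15,
                                           revenue_usd_per_ghs_day=0.02))

With the placeholder numbers, one extra GH/s costs 4 * 24 / 1000 * 0.15 = $0.0144 per day in electricity, so whether the overclock pays off comes down entirely to what a GH/s currently earns per day.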
I think mining has reached the point where overclocking may no longer pay off. However, it needs to be tested.