Re: [12000GH/s] p2pool: Decentralized, DoS-resistant, Hop-Proof pool
by Krak on 28/09/2013, 18:46:20 UTC
180 / 1.4 = 128 (almost exactly)

There's no point in using a lower difficulty with p2pool. You're just wasting your own CPU cycles.

That's about 71% of 180, not 30% of 180.

If you want to get technical, the best difficulty is 32,768 regardless of your local hashrate, because unless roughly a third of the users dropped out of the network, the difficulty per P2Pool share won't drop that low. Every share found below the current P2Pool difficulty is useful only for local statistics. Unless you're running a sub-pool with a different share-tracking method, those shares are wasted.
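
A rough sketch of that distinction, assuming the usual 2^32 expected hashes per unit of difficulty (the classify helper and the example numbers here are purely illustrative, not p2pool's actual code):

```python
# Illustrative only: a hash that beats your local pseudo-share target is
# recorded for your local stats, but only a hash that also beats the
# P2Pool share-chain difficulty earns a place in the share chain.

def classify(hash_difficulty, local_diff, p2pool_diff):
    """Say what a found hash of the given difficulty is worth."""
    if hash_difficulty >= p2pool_diff:
        return "p2pool share (counts toward payout)"
    if hash_difficulty >= local_diff:
        return "pseudo-share (local statistics only)"
    return "below local target (discarded)"

# With a local difficulty of 128 and a share-chain difficulty of 32,768,
# a difficulty-500 hash only improves your dashboard numbers:
print(classify(500, 128, 32768))    # pseudo-share (local statistics only)
print(classify(40000, 128, 32768))  # p2pool share (counts toward payout)
```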
I don't know where you're getting that 30% number, but the formula I posted is the tried-and-proven method for keeping your shares per minute at the sweet spot between accurate stat tracking and bandwidth/CPU savings.
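
For reference, a minimal sketch of the arithmetic behind that formula, assuming the standard 2^32 expected hashes per unit of difficulty (the 1.4 divisor and the 180 GH/s figure are taken from the exchange above):

```python
HASHES_PER_DIFF_1 = 2 ** 32  # expected hashes to find one difficulty-1 share

def pseudoshares_per_minute(hashrate_ghs, difficulty):
    """Expected pseudo-shares per minute at the given local difficulty."""
    hashes_per_second = hashrate_ghs * 1e9
    return hashes_per_second * 60 / (difficulty * HASHES_PER_DIFF_1)

# 180 GH/s with difficulty set to hashrate / 1.4:
difficulty = 180 / 1.4                                  # ~128.6
print(round(pseudoshares_per_minute(180, difficulty)))  # ~20 per minute

# Because the difficulty scales with the hashrate, the divisor of 1.4 pins
# the pseudo-share rate at roughly 20 per minute for any miner.
```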