I don't know precisely what proof of work entails but if it needs to be beefed up or elongated to slow down the rate at which btc is generated then surely the 'filler' could be some computation that is of greater general use? Like folding proteins or working out further digits of some transcendental number or something! Perhaps I can't see the wood for the trees but it seems extraordinary that wastage should be intentionally built into a system. I'm not picking on bitcoin in particular, just in general the idea that it is self-imposed perplexes me.
Basically, a very simplified way of looking at it is that you have to find a particular value of n such that f(n) returns a desired value (you nit-pickers in the back row keep quiet! I said this is simplified). The function f is designed in such a way that changing the value of n even the tiniest bit gives a completely unpredictable result; f(1) might give 33929494, and f(2) might give 4. So the only way to find the value of n that solves the problem is by brute force: try each possible value of n until you find one that works. The value of n that solves the problem is called the proof of work.

The function f and the value n have an important use in Bitcoin, so the solution is not just some pretty value that gets put on the fridge for a few days then quietly gets buried in the recycling bin.
Your idea of using 'useful' calculations to maintain the rate is interesting, but I'm not sure it can be made to work. The problem has to fulfil two criteria. First, its difficulty must be dynamically adjustable to compensate for more and more computers coming online to solve it. Second, the problem must be difficult to solve but its solution easy to verify, because every client will verify the solution at least once, if not several times. And, let's face it - we're dealing with money. A lot of people will be very motivated to claim they found a solution, so we must be able to check their claims quickly and easily.
'Useful' calculations could probably meet the first criterion by varying the number of folds, or the number of digits to solve, and so on.
With the current setup, finding the solution is very hard, but checking whether the solution is correct is, comparatively speaking, trivial. If someone says "Hey, the answer is 542214675," you don't have to try all the values from 1 through 542214674 - all you have to do is plug 542214675 into f and verify that the result meets the requirements. I'm not sure the same thing can be said about folding proteins, extending a transcendental number, and so on - I suspect you would have to duplicate the entire work effort to verify its legitimacy.
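That asymmetry is easy to demonstrate with a hash-based puzzle (again, a simplified stand-in for Bitcoin's double SHA-256, with a made-up payload and an easy target): finding the answer takes many evaluations of f, while checking a claimed answer takes exactly one:

```python
import hashlib

def f(data: bytes, n: int) -> int:
    # Simplified stand-in for Bitcoin's hash; real mining double-hashes
    # an 80-byte block header with SHA-256.
    return int.from_bytes(
        hashlib.sha256(data + n.to_bytes(8, "little")).digest(), "big"
    )

TARGET = 2 ** 248  # hypothetical, very easy target for the demo

# Finding the solution: count how many times f must be evaluated.
search_cost, n = 0, 0
while True:
    search_cost += 1
    if f(b"block", n) < TARGET:
        break
    n += 1

# Verifying the claimed n: a single evaluation of f, no matter how
# many evaluations the search above needed.
assert f(b"block", n) < TARGET
```

With protein folding there is no known equivalent one-shot check - which is exactly the worry about duplicating the whole effort to verify it.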
Surely after a while the security of the network stops being an influencing factor. There comes a threshold where you can clearly say 'the system is secure'. From what I can gather, that point is reached once no single user controls 50% or more of the computational power of the network? I don't know exactly where this point is - I'm sure this figure is not quite right - but equally I'm sure it was passed ages ago, back when bitcoin was in its infancy and security was still a legitimate concern.
By this I mean, the security of the system increases as more and more users mine bitcoin, but the security advantage of having 100 million unique miners is, in truth, no different from having just 1 million.
There is a point beyond which improving security is pointless. After this point the energy expended by the surplus bitcoin miners can be considered wasted, if the function of mining is to check and secure the bitcoin network.
If more and more people become interested in bitcoin mining then the wasted energy will only grow, especially since, as I understand it, the difficulty of mining increases as more computing power joins the network (while the block reward shrinks as the supply tends to 21 million).
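For what it's worth, the difficulty doesn't track the number of coins issued directly - Bitcoin retargets every 2016 blocks so that blocks keep arriving roughly every ten minutes however much hashing power joins. A simplified sketch of that rule (the function name is mine; the constants 2016, 600 seconds, and the 4x clamp are Bitcoin's):

```python
def retarget(old_target: int, actual_seconds: float) -> int:
    """Simplified version of Bitcoin's difficulty retarget: every 2016
    blocks, scale the target so blocks keep arriving about every ten
    minutes regardless of total hash power. A smaller target means
    higher difficulty."""
    expected_seconds = 2016 * 600  # two weeks at one block per 10 min
    ratio = actual_seconds / expected_seconds
    ratio = max(0.25, min(4.0, ratio))  # Bitcoin clamps to a factor of 4
    return int(old_target * ratio)

# If blocks arrived twice as fast as intended, the target halves
# (i.e. difficulty doubles):
new_target = retarget(2 ** 224, 2016 * 300)
assert new_target == 2 ** 223
```

So the energy question gets worse exactly as you say: more miners don't produce blocks any faster, they just raise the difficulty.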
I'm not sure that protein folding can be dynamically varied in difficulty, but I believe it is possible to estimate how long a given task will take with some degree of accuracy. Folding@home works by distributing work units to clients, who then return the completed units to the server along with a digital signature. I'm guessing this doesn't translate to something decentralised like bitcoin.
I don't claim to have a solution for how this surplus processing power could be better utilised, but at the moment it seems a tiny bit crazy that all those processor cycles are doing nothing more than solving puzzles of no consequence. The only purpose these processors serve is to actuate the bitcoin system - something that could be done using far less energy, since security, after a point, is not a concern.
That is what I mean about inefficiency being a self-imposed feature of the way bitcoin is designed.