I thought I had made my point pretty well. You are proposing to replace something that is really simple, always exact, and impossible to mess up with something that has none of those properties. People will get it wrong.
The current block reward calculation isn't any more exact than what I propose: when the block reward is halved, the number of satoshis is rounded down.
It's hard for you to make a case about simplicity when your example is a minute detail of one of the simpler aspects. If multiplying two numbers gives us trouble, I worry.
Don't take this the wrong way, but you *did* get this "minute detail about one of the simpler aspects" wrong. This is an extreme example, one that I often choose because it is so trivial and so simple, and still so easy to get wrong. (See my posts in various other threads with proposals to change up the subsidy calculation.)
"Right shift by 1 bit" is not the same operation as "divide by two and round down". They obtain the same result if done properly, but have different meta costs, as you helped me demonstrate. That extra cost should not be paid (now and for eternity) without good reason. The bitcoin software is chock full of things
far more complicated than this, so clearly we
can get this detail right too. But
should we?
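To make the comparison concrete, here is a rough C++ sketch of the two formulations side by side. This is not the client's actual code; the function names and the little self-test are mine, though the constants (a 50 BTC starting subsidy, a 210,000-block halving interval) are the protocol's.

```cpp
#include <cassert>
#include <cstdint>

static const int64_t COIN = 100000000;             // satoshis per bitcoin
static const int64_t INITIAL_SUBSIDY = 50 * COIN;  // 50 BTC
static const int64_t HALVING_INTERVAL = 210000;    // blocks between halvings

// Formulation 1: one integer right shift per halving.
int64_t SubsidyByShift(int64_t height)
{
    int64_t halvings = height / HALVING_INTERVAL;
    if (halvings >= 63)
        return 0;  // shifting by >= the type width is undefined behavior
    return INITIAL_SUBSIDY >> halvings;
}

// Formulation 2: "divide by two and round down", spelled out.  Same answer
// for non-negative integers, but now there is a loop, an explicit rounding
// rule, and more room for an implementer to slip.
int64_t SubsidyByDivision(int64_t height)
{
    int64_t subsidy = INITIAL_SUBSIDY;
    for (int64_t i = 0; i < height / HALVING_INTERVAL && subsidy > 0; ++i)
        subsidy /= 2;  // integer division truncates, i.e. rounds down
    return subsidy;
}

int main()
{
    // The two agree at every height, including the 10th halving and beyond,
    // where the truncation actually starts to matter.
    for (int64_t h = 0; h < 64 * HALVING_INTERVAL; h += HALVING_INTERVAL / 2)
        assert(SubsidyByShift(h) == SubsidyByDivision(h));
    return 0;
}
```

The shift version is essentially a one-liner plus a guard; everything extra in the second version (the loop, the explicit rounding rule) is the kind of meta cost I'm talking about.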
Do you get what I'm trying to say?
P.S. Read the bold part again. Warning bells should be going off as you do.
Not really. If I were able to predict what the market would do, I'd be rich. I just want it to have the tools to find what's most efficient for it, and that's easier to demonstrate.
It is relatively simple to show that 5 seconds between blocks is too short, and that 3600 seconds between blocks is too long. It is much harder to show that 590 or 610 seconds is better than 600, or that 300 or 900 is better than 600.
As you say, it is hard to predict what this system will look like once it gets rolling. Will we be trading one non-optimal block average for a different non-optimal average? Will we be making a system that cycles or floats through a range of values, each just as non-optimal as the others? How can we tell a "good" state from a "bad" state?
Usually I make these points in threads about moving the decimal point, but they really apply to all of the magic numbers. Why 10 minutes? Why 2 weeks? Why 210,000 blocks? The answer is always the same: because they are "good enough", and it is *really* hard to show that any different numbers would be any better.
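As a back-of-the-envelope aside, here is a quick sketch of how those numbers hang together. The constants are the protocol's; the throwaway program itself is just mine for illustration.

```cpp
#include <cstdio>

int main()
{
    const double target_spacing  = 600.0;              // 10 minutes, in seconds
    const double retarget_window = 14 * 24 * 60 * 60;  // 2 weeks, in seconds
    const double halving_blocks  = 210000.0;           // blocks between halvings

    // 2 weeks at 10 minutes per block gives the 2016-block difficulty window.
    std::printf("blocks per retarget: %.0f\n", retarget_window / target_spacing);

    // 210,000 blocks at 10 minutes each is roughly four years per halving.
    std::printf("years per halving:   %.2f\n",
                halving_blocks * target_spacing / (365.25 * 24 * 3600));
    return 0;
}
```

None of that shows that 600 seconds (or 2 weeks, or 210,000 blocks) is optimal, of course; it only shows how the constants relate to one another.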
I suspect that the same applies here. It seems to me that it is just as hard to show that a dynamic system is any better for this job, and it doesn't just have to be *any* better, it has to be better *enough* to pay for the extra complexity costs.
All that said, unless you want to discuss these things further, I'm willing to bow out here and let you get this thread back on track so that you can finish presenting your idea. I really would like to see the rest of it.