Sponsoring the resources required to handle block size is a TotC problem in itself (the fee for including a tx goes to the miner who included it, while the burden on the network is placed on all nodes), which is why I earlier suggested that some block size limit (or something equivalent) remain. But I'll be generous and assume the marginal resource cost finds a way to fund itself efficiently (e.g., "grassroots" rejection of large blocks).
This leaves the problem of sponsoring hashing. If the marginal cost of a transaction is C, including a transaction with fee C+epsilon is pure profit. There is no reason for a small miner not to include it.
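The inclusion claim above can be made concrete with a toy sketch (the values of C and epsilon are illustrative numbers I've made up, not measured costs):

```python
def should_include(fee, marginal_cost):
    """A rational miner includes any transaction whose fee exceeds
    its marginal cost of inclusion -- the "pure profit" claim."""
    return fee > marginal_cost

# Illustrative values only: an assumed marginal cost C and a tiny epsilon.
C = 0.0001
epsilon = 1e-8
print(should_include(C + epsilon, C))  # prints True: fee = C + epsilon is worth including
```

The whole debate is then over what actually belongs in `marginal_cost` for a given miner.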
But I pointed out this isn't true. Including the transaction increases block size and therefore storage costs, increases time to propagate, and increases the chance of some other relay not liking it.
I'm disagreeing with the fundamental assumption you're making that it's "pure profit".
This is all supposed to be incorporated in C, the marginal cost per transaction. Some of what you mentioned is a bit abstract, so there's room for debate. But it is still the case that the costs you mentioned relate only indirectly to the number of transactions; letting them influence the incentives to hash, which is an independent quantity, is neither robust nor efficient.
Also, whatever the costs are to the network, the fact is they're being borne by the network. If the network can't afford it, the network won't afford it. I haven't seen a convincing (to me) refutation of these points. If something isn't worthwhile people won't do it. If it is, people will.
My gut feeling is that the block size limit should be removed, and let the free market reign. It works elsewhere, why not for transaction processing?
It works when there's no TotC (meaning, the business pays for any externalities). When there is one (as we can clearly identify in this case), some method to resolve it is needed.
It's not clear to me, as I've tried to explain. Just like producing widgets - if someone can't do it profitably, that isn't a reason to seek out a central authority (here the arbitrary block size limit) to "solve" the problem (by specifying a minimum price in the case of the widget, or maximum block size for a block). They just go bust.
Sometimes it works, sometimes not. When incentives are aligned the invisible hand works. But when there's an identifiable systemic reason why some business can't be profitable - and we want these businesses to exist - there is need for some intervention.
This is even the case now - clearly block size is not a constraint at the moment; it might as well be infinite, since most blocks are way below the limit, and not even at the point where the reference client makes it "more expensive" to take up more space. And yet the network isn't suffering, and not every transaction is getting accepted in the very next block either. There are costs, lags, and delays, and this is how they get expressed. Some people moan, but they're the ones who will stop trying to do it (including qt client users who aren't adding much to the network). If the "market" wants those users to keep transacting, the market will pay a higher fee to ensure they find it worthwhile.
The problem of sponsoring hashing hasn't even begun to manifest itself because of the coinbase, which is still more than sufficient for it. And txs are still few enough that the crude anti-spam rules (required fee for low-priority txs) are enough to keep the nodes from overloading.
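For reference, the "crude anti-spam rule" mentioned above worked roughly like this sketch. The threshold constant is an assumption on my part, based on the old "1 BTC, 1 day old, 250 bytes" rule of thumb; the reference client's exact constants varied across versions:

```python
COIN = 100_000_000  # base units (satoshis) per bitcoin

def priority(inputs, tx_size):
    # inputs: list of (value_in_satoshis, confirmations) pairs.
    # Older, higher-value coins rank higher; larger size drags priority down.
    return sum(value * age for value, age in inputs) / tx_size

def requires_fee(inputs, tx_size, threshold=COIN * 144 / 250):
    # Transactions below the priority threshold had to attach a fee.
    return priority(inputs, tx_size) < threshold

# A 1 BTC input, one day (~144 blocks) old, in a 250-byte transaction
# sits exactly at the threshold, so under this sketch it needs no fee.
print(requires_fee([(COIN, 144)], 250))  # prints False
```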
Hashing is an artificially difficult problem and thus has an additional degree of freedom. Most things aren't artificially difficult; e.g., energy generation really is difficult, and if a more efficient method can be found to do it, people can enjoy cheaper energy. But if a more efficient way to hash is found, it favors attackers and the honest network equally.
By hashing, I was talking about the cost of rebuilding and rehashing the merkle tree, not the cost of hashing the block. Sorry for any confusion; I feel that may have led you to misunderstand my argument.
I understood this is what you meant. AFAIK the cost of hashing into the Merkle tree is negligible in comparison to ECDSA verification; and even if not, it's part of what I call marginal cost, as opposed to the amortized cost of hashing block headers.
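To make that cost comparison concrete, here is a rough sketch of Bitcoin-style Merkle root construction (double SHA-256, duplicating the last hash on odd-sized levels; this ignores details like internal byte order). Rebuilding the tree over N transactions takes on the order of N hash operations, each of which is far cheaper than a single ECDSA signature verification:

```python
import hashlib

def dsha256(data):
    # Bitcoin hashes Merkle nodes with double SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(txids):
    # txids: list of 32-byte transaction hashes (the tree's leaves).
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:  # odd count: duplicate the last hash
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txids = [dsha256(bytes([i])) for i in range(7)]
root = merkle_root(txids)  # ~N hash operations total for N transactions
```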
The degree of freedom is exactly as you described: if there's not enough profit, miners will quit. Which is exactly what I'm worried about; with fewer miners the network will be more vulnerable to hashrate attacks. I want to make sure there is enough revenue to fund the amount of hashing required to secure the network; and I argue that, left to itself, the TotC problem will create a race to the bottom with little total revenue and low network hashrate.
There are several suggested solutions. I'm not saying none of them can work, I'm saying the problem shouldn't be swept under the carpet.
I'm not certain an unconstrained block size can work. But I think it's highly likely it can, and I've not read anything to persuade me otherwise.
If the market wants something (here, more hashing power) it will pay for it, like anything else.
So why not give it a go? If it's a disaster, there will be no problem getting the 50% consensus to put one back, right?
In most cases gradual changes are healthier. Switching instantly to no limit and then back to some limit could be disruptive. I think you're too charitable towards the free market: in the situations where it works, it works great, but there are situations where it doesn't - or at least, where the free market works only by deciding to constrain itself to be less free.
The study of game theory brings up some examples, especially wherever bargaining is involved.