Board: Development & Technical Discussion
Re: A different approach to Bitcoin's scalability issues?
by ETFbitcoin on 02/06/2022, 09:18:29 UTC
[quote]
There is no need to validate any precise value that will change anyway. When averaging over 2016 blocks, the result will get close to the average mempool size that other nodes see.
[/quote]
There is no point in having a requirement that does not have to be followed.
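
For concreteness, here is a minimal sketch (in Python) of the quoted mechanism as I read it. The function name, the floor/cap values, and the clamping are my own assumptions, not part of the proposal: each block self-reports a mempool size, and the next limit is the average of the last 2016 claims.

[code]
# Hypothetical sketch of the quoted mechanism, NOT actual Bitcoin code.
# Names and the floor/cap values are assumptions for illustration.

RETARGET_WINDOW = 2016  # same window Bitcoin uses for difficulty retargets

def next_max_block_size(claimed_sizes, floor=1_000_000, cap=32_000_000):
    """claimed_sizes: self-reported mempool bytes from recent blocks.
    Nothing in consensus can check these claims against reality."""
    window = claimed_sizes[-RETARGET_WINDOW:]
    avg = sum(window) // len(window)
    return max(floor, min(avg, cap))  # clamp how far one retarget can swing
[/code]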

Additionally, it opens opportunities to (see the toy numbers after this list):
1. Bloat the blockchain (by claiming to have a big mempool).
2. DoS attack Bitcoin nodes (by claiming to have a big mempool).
3. Cause network congestion (by claiming to have a small mempool).
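
As a toy illustration of points 1 and 3 (all numbers are made up): because the claims are self-reported, a simple majority of hash rate can drag the 2016-block average wherever it wants.

[code]
# Toy numbers only: 51% of the 2016-block window claims a huge mempool.
RETARGET_WINDOW = 2016
honest_claim = 4_000_000        # ~4 MB of genuinely pending transactions
dishonest_claim = 300_000_000   # fabricated 300 MB mempool

window = [dishonest_claim] * 1028 + [honest_claim] * 988  # 51% / 49% split
avg = sum(window) // len(window)
print(f"average claimed mempool: {avg:,} bytes")  # ~155,000,000 -> bloated limit
[/code]

Claiming a tiny mempool works the same way in the other direction, shrinking the limit and congesting the network.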

[quote]
Using average mempool size is no different than using average difficulty. Some blocks are mined in much less than 10 minutes, some in more than 10 minutes, but the average value comes out fine.
[/quote]

You can validate block difficulty, but how would you validate someone's mempool size? And since you said it doesn't need to be validated, what would happen if the majority of miners claim to have a big mempool, or the majority claim to have a small one?
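
To make the asymmetry concrete, here is a simplified sketch (my own illustration, not Bitcoin Core code) of why difficulty is checkable while a mempool claim is not: the proof of work can be recomputed from the block header alone.

[code]
import hashlib

def check_pow(header: bytes, target: int) -> bool:
    """Any node can verify difficulty from the 80-byte header alone:
    double-SHA256 it and compare against the target (from nBits)."""
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "little") <= target

# There is no analogous check for a mempool claim: nothing in the block
# or the chain commits to what a miner's mempool actually contained.
[/code]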

[quote]
In addition there is no financial incentive for miners to attack the blockchain.
[/quote]

Regarding maximum block size, there are possible financial reasons to do it, such as:
1. A lower block size could force people to pay higher transaction fees.
2. A higher block size could be used to attract investors with PR such as "Bitcoin is now scalable and can adjust to network conditions", which would increase Bitcoin's price.