Why not an adjustable limit based on the median over a previous timeframe?
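Something like this rough sketch is what I mean (the window, floor, and growth-cap values are made up for illustration; Monero uses a median-based scheme roughly along these lines):

    # Hypothetical median-based adaptive block size limit.
    # WINDOW, FLOOR and GROWTH_CAP are illustration values,
    # not any real network's parameters.
    import statistics

    WINDOW = 100          # look back over the last 100 blocks
    FLOOR = 1_000_000     # never drop below 1 MB
    GROWTH_CAP = 2.0      # next limit is at most 2x the median

    def next_block_limit(recent_block_sizes):
        """Size limit for the next block, derived from recent history."""
        window = recent_block_sizes[-WINDOW:]
        return max(FLOOR, int(statistics.median(window) * GROWTH_CAP))

    # If recent blocks hovered around 800 kB, the next limit is 1.6 MB:
    print(next_block_limit([800_000] * 100))   # -> 1600000

The point is that the limit floats with observed demand instead of being a hard-coded constant.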
Because, on top of compromising decentralization for usability (the degree of compromise can and has been argued to death, and no, there's no free lunch here), it introduces new attack vectors.
Could you link that explanation for me? I'm not aware of how compromising decentralization could happen in this instance.
I'm usually pretty good at identifying attack vectors, but I don't see the one here.
Ugh, this again. It's rather simple, but let's try a flow chart:
-nodes matter to decentralization -------------NO---> go to: Faketoshi and BSV
|
YES
|
\|/
-larger capacity leads to higher resource use ------NO---> go to: math class
|
YES
|
\|/
-raising resource requirements prices out minimum-spec nodes (a Raspberry Pi can't handle 100 MB blocks; rough numbers after the chart) ----NO---> go to: math class
|
YES
|
\|/
-fewer nodes means less decentralization ----NO---> go to: https://en.wikipedia.org/wiki/Decentralization
|
YES
|
\|/
-Voilà
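To put rough numbers on the resource step above (assumed figures for illustration, not measurements):

    # Back-of-envelope load on a single node at a hypothetical 100 MB block size.
    # PEERS is an assumed relay fan-out, not a measured value.
    BLOCK_SIZE_MB = 100
    BLOCKS_PER_DAY = 144        # one block roughly every 10 minutes
    PEERS = 8                   # peers the node uploads each new block to

    daily_growth_gb = BLOCK_SIZE_MB * BLOCKS_PER_DAY / 1000        # ~14.4 GB/day
    yearly_growth_tb = daily_growth_gb * 365 / 1000                # ~5.3 TB/year
    daily_upload_gb = daily_growth_gb * PEERS                      # ~115 GB/day relayed

    print(f"chain growth: {daily_growth_gb:.1f} GB/day, {yearly_growth_tb:.1f} TB/year")
    print(f"relay upload: ~{daily_upload_gb:.0f} GB/day, before tx gossip overhead")

And that is storage and upload alone; a Pi-class box also has to verify every signature in those blocks fast enough to keep up with the tip.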
Some big blockers went to the extreme and argued for unlimited blocks; enter Faketoshi with his enterprise data-center nodes.
Others, like Richy_T, argued for a cost/benefit compromise; for multiple reasons the market decided against this, and thus here we are.
It will not be very many years before tech gets to the point where we can move the block size up, so I think it is still worth talking about.
-Currently I am mining on a 32-thread, 16-core CPU I bought for $700. In a number of years 16-core cellphone processors will be a thing. (Not for mining for the most part, but for all the hashing done to validate things quickly enough.)
-$400 will get you a 16 TB hard drive today.
-5G offers multi-gigabit speeds at low latency WITHOUT WIRES. (Rough numbers on how far this hardware stretches are below.)
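A rough check (my own arithmetic, not a benchmark) of how far that hardware stretches at a few hypothetical block sizes:

    # Years of chain growth a 16 TB drive can hold at various block sizes.
    # Block sizes are hypothetical scenarios; 144 blocks/day assumes ~10-minute blocks.
    DRIVE_TB = 16
    BLOCKS_PER_YEAR = 144 * 365

    for block_mb in (1, 8, 32, 100):
        yearly_tb = block_mb * BLOCKS_PER_YEAR / 1_000_000
        print(f"{block_mb:>3} MB blocks: ~{yearly_tb:.2f} TB/year -> ~{DRIVE_TB / yearly_tb:.0f} years on the drive")

So today's consumer hardware is comfortable well above 1 MB, but would still be churning through drives at 100 MB blocks.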
Countries are coming online to bitcoin.
So this topic is not going away. That is for SURE.