That's true. Last time I tried that with Bitcoin Signet, I got a compression ratio of 0.63 (505MB / 798MB). For comparison, the compression you chose has a ratio of about 0.81 (397GB / ~486GB).
That alone already took over 2 hours on all CPU cores.
And with LZMA2, I expect it could take a few days.
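For reference, here's the arithmetic behind those ratios as a quick Python sketch (the sizes are just the figures quoted above, nothing newly measured):

```python
# Compression ratio as compressed size / original size (lower = better).
def ratio(compressed, original):
    return compressed / original

# Signet chain, the run that took ~2 hours on all cores
print(f"Signet:  {ratio(505, 798):.3f}")   # -> 0.633
# The mainnet figures quoted above
print(f"Mainnet: {ratio(397, 486):.3f}")   # -> 0.817
```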
I think that once data storage technology has evolved by an order of magnitude, i.e. when 10TB costs what 1TB costs today, it may be possible to increase the block size without harming decentralisation. However, we may be close to the theoretical limit of data storage.
Do you mean the theoretical limit of data storage in general, or just the theoretical limit of the current technology used in HDDs? Even if you're talking specifically about HDDs, we can theoretically expect at least 80TB capacity[1].
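To put those capacities in perspective, a back-of-the-envelope sketch: the ~144 blocks/day figure follows from Bitcoin's 10-minute block target, while the average block size and the 80TB drive are assumptions you can plug in yourself:

```python
# Rough sketch of how drive capacity relates to chain growth.
BLOCKS_PER_YEAR = 144 * 365          # ~52,560, from the 10-minute target

def years_to_fill(drive_tb, avg_block_mb):
    """How many years of chain growth a drive of drive_tb TB could hold."""
    growth_gb_per_year = BLOCKS_PER_YEAR * avg_block_mb / 1024
    return drive_tb * 1024 / growth_gb_per_year

# e.g. a hypothetical 80TB HAMR drive[1] vs an assumed ~1.5MB average block
print(f"{years_to_fill(80, 1.5):.0f} years")   # -> ~1064 years of headroom
```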
--snip--
Well, set aside the pride and all. If you are not a developer, there's no reason for you to download such heavy data. I'm using Electrum right now for my BTC holdings. It's light, and transactions are fast as well.
I'll have to remind you that privacy is also a valid reason to run a full node. For example, you don't want a random Electrum server knowing which addresses belong to the same person.
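To make that concrete, here's a rough sketch of what a light client does over the Electrum protocol. The server host and scripthashes below are placeholders (a scripthash is the SHA-256 of an output script, hex-encoded in reverse byte order); the point is that every query lands on the same server:

```python
# Minimal sketch of the privacy leak: an Electrum client asks one server
# for the history of every script it watches, so that server can link them.
import json, socket, ssl

def request(sock, req_id, method, params):
    """Send one JSON-RPC request over the Electrum protocol, read one reply."""
    sock.sendall((json.dumps({"id": req_id, "method": method,
                              "params": params}) + "\n").encode())
    return json.loads(sock.makefile().readline())

ctx = ssl.create_default_context()
ctx.check_hostname = False           # many Electrum servers use self-signed certs
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection(("electrum.example.org", 50002)) as raw:  # placeholder host
    with ctx.wrap_socket(raw) as sock:
        request(sock, 0, "server.version", ["sketch", "1.4"])
        # Every wallet script goes to the SAME server, which can now
        # correlate them as one user (plus your IP, unless you use Tor).
        for i, sh in enumerate(["<scripthash-1>", "<scripthash-2>"], start=1):
            print(request(sock, i, "blockchain.scripthash.get_history", [sh]))
```

Running your own full node (or your own Electrum server on top of it) keeps those queries local instead.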
[1]
https://www.pcmag.com/news/next-gen-hamr-platters-promise-80tb-hard-drives