Topic: Re: Improved Measurement of Proof-of-Work using entropy
Board: Development & Technical Discussion
by n0nce on 12/03/2023, 00:30:08 UTC
What do you think about the takeaways/explanation by one of the authors? https://twitter.com/mechanikalk/status/1633694566200094723
Quote them here, please.

New paper on novel mechanism used in @QuaiNetwork to scale blockchain. Proof-of-Entropy-Minima (PoEM). TL;DR blockchains are time invariant static sequence of commitments. PoW is massively inefficient bc it doesn’t do the math correctly. #btc #ETH #Crypto

Some more takeaways:
1) entropy reduction is the goal of work
2) deltaS=1/2^n defines blockchain
3) All PoW vulnerabilities arise bc #2 is not used
4) 67% of all orphans are completely avoidable with PoEM.
5) The best hash algos reduce entropy most efficiently while preserving one-way collision resistance w uniform distribution.
6) Using time to describe a blockchain is completely incorrect
7) Withholding and reordering attacks only exist bc difficulty is computed incorrectly
8) correct computation of head entropy makes perfect head choices, even with withholding and stales, without any secondary measurement or consideration
9) Satoshi added a friendly finalization mechanism to #BTC. Coercion of intrinsic bits to threshold difficulty. Without it #BTC would not have any finite guarantee of finalization.
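
To put takeaways 1) and 2) in concrete terms, here is a minimal sketch of the idea as I read it; this is not code from the paper, and every name in it is my own. If a block hash is uniform over 2^256 values, a hash with n leading zero bits occurs with probability 1/2^n, and -log2 of that probability can be treated as the entropy the block removed. Entropies of consecutive blocks add, so head selection becomes "pick the tip whose chain removed the most entropy in total":

Code:
import hashlib
import math

HASH_BITS = 256  # SHA-256 output width

def intrinsic_entropy_bits(block_header: bytes) -> float:
    """Entropy removed by one hash: -log2 of the fraction of the
    output space at or below this hash value. A hash with n leading
    zero bits scores roughly n bits."""
    h = int.from_bytes(hashlib.sha256(block_header).digest(), "big")
    # +1 avoids log2(0) for the (astronomically unlikely) all-zero hash
    return HASH_BITS - math.log2(h + 1)

def chain_entropy(headers: list[bytes]) -> float:
    """Entropy reductions add up, so the 'heaviest' chain is the one
    that removed the most entropy in total."""
    return sum(intrinsic_entropy_bits(h) for h in headers)

# Head choice between two competing tips (hypothetical example data):
tip_a = [b"block-1", b"block-2a"]
tip_b = [b"block-1", b"block-2b"]
best = max((tip_a, tip_b), key=chain_entropy)

Under such a measure, every hash counts with its exact weight instead of being rounded down to the threshold difficulty, which is presumably what takeaway 9) means by "coercion of intrinsic bits to threshold difficulty".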

The point about 'PoW is massively inefficient' is immediately wrong, at least as a general concept: if you make it easier / more efficient to calculate valid block hashes, the difficulty adjusts upwards to keep the ASICs running for 10 minutes on average to find a new block. Therefore, the energy consumption will not be reduced.
This holds true for any system or proposal that aims to make mining more efficient. Even reusing heat is just an efficiency improvement that leads to more money / value extracted from the same electricity, driving up the difficulty to account for it.
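
To illustrate that feedback loop with a toy model (the 10-minute target and 2016-block window are Bitcoin's actual parameters; everything else is made up, and Bitcoin's 4x clamp per adjustment is ignored for simplicity):

Code:
# Toy model of the difficulty-retargeting feedback loop.
TARGET_BLOCK_TIME = 600  # seconds; Bitcoin's 10-minute target
RETARGET_WINDOW = 2016   # blocks per difficulty adjustment

def next_difficulty(difficulty: float, actual_window_time: float) -> float:
    """Bitcoin-style retarget: scale difficulty by how fast the last
    window was mined relative to the expected two weeks."""
    expected = TARGET_BLOCK_TIME * RETARGET_WINDOW
    return difficulty * expected / actual_window_time

# Suppose some trick doubles hashes-per-joule: at constant power draw
# the hashrate doubles, so the last window was mined twice as fast.
difficulty = 1.0
hashrate = 2.0  # arbitrary units; 1.0 before the efficiency gain
window_time = RETARGET_WINDOW * TARGET_BLOCK_TIME * difficulty / hashrate

print(next_difficulty(difficulty, window_time))  # -> 2.0

Difficulty doubles, block times return to 10 minutes, and the network spends the same energy per block as before; the efficiency gain is absorbed, not saved.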