Showing 20 of 992 results by tromp
Post
Topic
Board Development & Technical Discussion
Re: Blockchain Consensus Mechanisms: The Engine of Trust in Web3
by
tromp
on 11/08/2025, 17:52:08 UTC
How to Choose the Right One
Need Fair Distribution? → PoW
Post
Topic
Board Development & Technical Discussion
Merits 4 from 1 user
Re: Bitcoin must upgrade or fall victim to quantum computing in 5 years
by
tromp
on 07/08/2025, 19:26:22 UTC
⭐ Merited by vapourminer (4)
In order for a quantum computer to pose a danger to Bitcoin, it would need between 100,000 and 1 million logical qubits!
https://eprint.iacr.org/2021/967.pdf shows that 2330 logical qubits suffice.
Post
Topic
Board Development & Technical Discussion
Re: Bitcoin Core October Upgrade
by
tromp
on 06/07/2025, 17:39:11 UTC
You can have thousands, millions, or even more users making their transactions on second layers, having all of that batched and finally pushed on-chain as a single transaction with a high fee rate.
I don't see why the fee rate would be high. Payment channel settlement transactions just pay nominal fees, no matter whether they result from a handful of txs or thousands of txs. This is not a case of packaging a bunch of txs into a batch that preserves all their fees, but of one final settlement tx obsoleting all earlier ones.
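
To make that concrete, here is a minimal sketch (the feerate and settlement size below are assumed illustrative numbers, not figures from any real channel): the on-chain fee is set by the size of the single settlement transaction, so it stays the same whether that transaction closes out ten or a million off-chain payments.

Code:
// Sketch with assumed numbers: the on-chain fee of a channel settlement
// depends only on the settlement tx size, not on how many off-chain
// updates preceded it (each later state simply obsoletes the previous one).
#include <cstdio>
#include <initializer_list>

int main() {
    const double feerate_sat_per_vb = 5.0;   // assumed nominal feerate
    const int settlement_vbytes     = 200;   // assumed settlement tx size
    for (long offchain_txs : {10L, 1000L, 1000000L}) {
        double onchain_fee = feerate_sat_per_vb * settlement_vbytes;
        printf("%8ld off-chain txs -> one settlement tx, fee %.0f sat\n",
               offchain_txs, onchain_fee);
    }
    return 0;
}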
Post
Topic
Board Development & Technical Discussion
Merits 2 from 2 users
Re: Tail emission ideas that retain the 21 million limit
by
tromp
on 06/06/2025, 07:11:57 UTC
⭐ Merited by stwenhao (1) ,vapourminer (1)
Having a finite total supply is one of the biggest strengths of Bitcoin compared to the overwhelming thousands of shitcoins on the market.
Finite supply is overrated [1].

Quote
If you need any example, see how the purchasing power of the US dollar and other fiat currencies drops dramatically over time as a consequence of inflation.
Fiat is a bad example, since unlike most cryptocurrencies, it's not even disinflationary.

[1] https://tromp.github.io/blog/2020/12/20/soft-supply
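
As a side note on the term used above, a small sketch (standard Bitcoin emission parameters, roughly 52560 blocks per year assumed) shows what disinflationary means: the issuance rate falls geometrically toward zero, quite apart from whether a hard supply cap is ever reached.

Code:
// Sketch with standard Bitcoin emission parameters: the issuance rate
// falls geometrically toward zero, which is what "disinflationary" means.
#include <cstdio>

int main() {
    const double blocks_per_epoch = 210000.0;  // blocks between halvings
    const double blocks_per_year  = 52560.0;   // ~one block every 10 minutes
    double subsidy = 50.0;                     // initial block subsidy in BTC
    double supply  = 0.0;
    for (int epoch = 0; epoch < 8; ++epoch) {
        supply += subsidy * blocks_per_epoch;  // coins issued during this epoch
        subsidy /= 2.0;                        // halving
        // annualized issuance entering the next epoch, as % of current supply
        printf("after epoch %d: supply %8.0f BTC, issuance %5.2f%% per year\n",
               epoch, supply, 100.0 * subsidy * blocks_per_year / supply);
    }
    return 0;
}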
Post
Topic
Board Development & Technical Discussion
Merits 1 from 1 user
Re: Ring signatures (monero-style) in Bitcoin: is it possible?
by
tromp
on 14/05/2025, 21:15:17 UTC
⭐ Merited by stwenhao (1)
a) what about the efficiency in terms of fees and speed?
Basic ring signatures take up space proportional to ring size which makes them rather inefficient.
Newer designs [1] get by with logarithmic size which support large ring sizes much more efficiently.

But in either case the real efficiency problem is the impact on UTXO size. Since one can never tell which is the real input and which are the decoys, no output can be known to be definitely spent. So the UTXO set balloons to the entire TXO set, with very detrimental impact on node efficiency. It's not so noticeable on Monero yet because daily tx volumes are about 15x smaller than Bitcoin's.
Zcash suffers from the same problem, but with only 10% of Monero's tx volume, it's even less noticeable there.

[1] https://eprint.iacr.org/2024/921
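
A toy model may help picture the UTXO-vs-TXO point (the creation and spend rates below are made-up illustrative numbers, not Monero or Bitcoin statistics): with explicit inputs the unspent set stays bounded by the gap between outputs created and outputs spent, while with decoys it grows like the full output history.

Code:
// Toy model with made-up rates: explicit spends let outputs leave the UTXO
// set, while ring signatures (decoys) never prove any output spent, so the
// set a node must keep indexed grows like the entire TXO history.
#include <cstdio>

int main() {
    const long long days            = 365 * 5;
    const long long outputs_per_day = 1000000;  // assumed output creation rate
    const long long spends_per_day  =  900000;  // assumed spend rate
    long long utxo_explicit = 0, utxo_ring = 0, txo_total = 0;
    for (long long d = 0; d < days; ++d) {
        txo_total     += outputs_per_day;
        utxo_explicit += outputs_per_day - spends_per_day;  // spent ones pruned
        utxo_ring     += outputs_per_day;                   // nothing provably spent
    }
    printf("after %lld days: explicit UTXO %lld vs ring 'UTXO' %lld (= TXO set %lld)\n",
           days, utxo_explicit, utxo_ring, txo_total);
    return 0;
}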
Post
Topic
Board Development & Technical Discussion
Merits 2 from 1 user
Re: Removing OP_return limits seems like a huge mistake
by
tromp
on 13/05/2025, 20:35:21 UTC
⭐ Merited by d5000 (2)
That experiment effectively stored a NEGATIVE amount of data in Mimblewimble transactions. In order to locate each 2.5 bytes of data, the extraction needs to know the ID of every kernel in which data was stored, and their correct order. The data extracting program [1] uses 70 bytes of source code to locate each 2.5 bytes of data.
Yes, I knew about that. That's why I'd be interested if there were attempts which were successful in the sense that non-negative storage was achieved, even if it was expensive.
Not only have there been no successful attempts that I know of, I'm having trouble thinking of a good way to do so. But I am curious to see if other people can come up with one.

As a way of encouragement, let me offer a $100 reward for any scheme to store (10KB + X) of arbitrary data on a pure Mimblewimble chain using at most 100KB of total transaction size + X extraction data size, with extraction running fast and not revealing keys that allow the pruning of stored data. This corresponds to a >= 10% storage utilization rate.

The reason for quantifying extraction data is to make sure the scheme can scale up by chaining, for instance to storing (100KB + X) of arbitrary data by using at most 1000KB of total tx size (the basis for calculating fees) plus the same X extraction data size.

I'll double the reward for an actual demonstration on Grin testnet or mainnet.

Quote
I wonder however: couldn't the same techniques used for "scriptless scripts" also work for data storage?
There are several ingredients to make scripts scriptless. One is adaptor signatures, which relate on-chain signature scalars to off-chain signature scalars. With 3rd parties having no access to the latter, I fail to see how this can provide for any on-chain storage. Another ingredient is kernel locktimes, which provide around 24 bits of storage by choosing arbitrary lock times in the past. That's only about 3% of the kernel size, so still rather limited.
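
For concreteness, here is a rough sketch of that locktime trick (the encode/decode helpers are hypothetical illustrations, not Grin APIs): roughly 24 bits of data can be packed into the choice of a past lock height, since any lock height at or below the current chain height imposes no constraint.

Code:
// Sketch of the kernel-locktime trick mentioned above (assumed encoding,
// not a Grin API): pack ~3 bytes of data into a lock height chosen in the
// past, where the choice of value itself carries the data.
#include <cstdint>
#include <cstdio>

// hypothetical helpers for illustration only
uint32_t encode_lock_height(const uint8_t data[3]) {
    return (uint32_t)data[0] << 16 | (uint32_t)data[1] << 8 | data[2];
}
void decode_lock_height(uint32_t h, uint8_t out[3]) {
    out[0] = h >> 16; out[1] = (h >> 8) & 0xff; out[2] = h & 0xff;
}

int main() {
    const uint8_t payload[3] = {0x12, 0x34, 0x56};  // 3 bytes ~= 24 bits
    uint32_t lock_height = encode_lock_height(payload);
    uint8_t back[3];
    decode_lock_height(lock_height, back);
    printf("lock height %u decodes to %02x%02x%02x (about 3%% of a kernel)\n",
           lock_height, back[0], back[1], back[2]);
    return 0;
}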
Post
Topic
Board Development & Technical Discussion
Merits 2 from 2 users
Re: Removing OP_return limits seems like a huge mistake
by
tromp
on 13/05/2025, 06:35:26 UTC
⭐ Merited by vapourminer (1) ,JayJuanGee (1)
I wonder if even on chains like Grin it would be possible to store (much) more data than in this experiment
That experiment effectively stored a NEGATIVE amount of data in Mimblewimble transactions. In order to locate each 2.5 bytes of data, the program for extracting it needs to know the ID of every kernel in which data was stored, and their correct order. The data extracting program [1] uses 70 bytes of source code to locate each 2.5 bytes of data.

This shows the huge challenge of storing data on a Mimblewimble chain that destroys all ordering information of all outputs and kernels in a block.

[1] https://github.com/NicolasFlamel1/MimbleWimble-Coin-Arbitrary-Data-Storage/blob/master/main.py
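
To spell out the arithmetic behind "NEGATIVE": each kernel carried 2.5 bytes of payload but needed about 70 bytes of locator data to extract it, so the net is roughly 2.5 - 70 = -67.5 bytes per kernel; the extraction data alone outweighs the stored data many times over.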
Post
Topic
Board Development & Technical Discussion
Merits 6 from 4 users
Re: Can Quantum Computers capable for guessing BIP39 Seed Phrases?
by
tromp
on 12/05/2025, 20:51:51 UTC
⭐ Merited by ABCbits (2) ,hosemary (2) ,HeRetiK (1) ,mcdouglasx (1)
A quantum computer only has a quadratic advantage in cracking the hashing-based wallet security.

In any case the seed phrase of your wallet should not be your main worry. A scalable quantum computer will be used to drain all utxos with known public keys, which will collapse the bitcoin price and make your wallet nearly worthless even if its specific keys are not yet cracked.
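
To put rough numbers on "quadratic advantage": Grover search cuts a brute-force space of size 2^n to about 2^(n/2) quantum evaluations, so a 24-word seed (256 bits of entropy) still leaves around 2^128 evaluations and a 12-word seed (128 bits) around 2^64, whereas Shor's algorithm recovers a private key from an exposed public key in polynomial time. Hence the exposed-pubkey utxos, not the seed phrase, are the realistic target.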
Post
Topic
Board Development & Technical Discussion
Merits 2 from 2 users
Re: Removing OP_return limits seems like a huge mistake
by
tromp
on 12/05/2025, 16:58:33 UTC
⭐ Merited by d5000 (1) ,ABCbits (1)
The only resistant chain I know of is Grin, because users cannot control how their data pushes will be represented on-chain, so they will be shuffled in the process, which will make them useless.
The reordering of tx outputs and signatures is one reason for spam resistance (item 3 in [1]), but not the main one, which is the fact that there's very little room for spam to begin with (items 1, 2 and 4).

[1]
https://forum.grin.mw/t/ordinals-on-grin/10336/2
Post
Topic
Board Development & Technical Discussion
Re: Removing OP_return limits seems like a huge mistake
by
tromp
on 12/05/2025, 06:25:47 UTC
Edit: [1] there is actually a way to require that the keys must create a spendable output, but this still allows stuffing data into these fake public keys, albeit a little bit less.

If you mean that some output bits can be set by grinding, then it would be not a little bit less, but MUCH less. What other way is there to store data in fake outputs?
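
For a rough sense of the cost: forcing k chosen bits into an otherwise valid, spendable public key takes about 2^k key generations on average, so even 2 bytes of embedded data costs around 65,536 attempts per output, versus simply writing tens of bytes directly into a fake key or OP_RETURN output.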
Post
Topic
Board Development & Technical Discussion
Re: Removing OP_return limits is a huge mistake
by
tromp
on 01/05/2025, 16:25:25 UTC
The demand for using BTC for its real use case, which is to move and store money, will be an S curve, and when it happens, you don't want the blockchain cluttered with jpeg spam.

The demand for bitcoin is expressed exclusively through the fees paid per byte. If jpeg encoding transactions pay more fees, then they WILL clutter the blockchain, since profit driven miners will include them. You can only slow them down slightly by making them jump through hoops like sending them directly to willing miners.
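
A simplified sketch of that selection logic (not Bitcoin Core's actual ancestor-package code, just a greedy feerate sort over made-up transactions) shows why the content of the bytes never enters the decision:

Code:
// Simplified model (not Bitcoin Core's actual ancestor-package selection):
// a profit-driven miner fills limited block space purely by feerate, so what
// the bytes encode (payments or jpegs) never enters the decision.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Tx { const char* kind; int vbytes; double fee_sat; };

int main() {
    std::vector<Tx> mempool = {
        {"payment",          200,     1000.0},  //  5 sat/vB
        {"jpeg inscription", 50000, 750000.0},  // 15 sat/vB
        {"payment",          150,      300.0},  //  2 sat/vB
    };
    std::sort(mempool.begin(), mempool.end(), [](const Tx& a, const Tx& b) {
        return a.fee_sat / a.vbytes > b.fee_sat / b.vbytes;  // highest feerate first
    });
    int space_left = 1000000;  // assumed vbyte budget for the block
    for (const Tx& tx : mempool) {
        if (tx.vbytes <= space_left) {
            printf("include %-16s %6d vB at %4.1f sat/vB\n",
                   tx.kind, tx.vbytes, tx.fee_sat / tx.vbytes);
            space_left -= tx.vbytes;
        }
    }
    return 0;
}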
Post
Topic
Board Development & Technical Discussion
Re: Exploring Tensor-Based Proof-of-Work: Aligning Mining with AI/ML Workloads
by
tromp
on 23/02/2025, 18:15:40 UTC
verification is polynomial, solution attempt is exponential
Wrong; a solution attempt is exactly one iteration of this loop in GenerateBlock [1]:

Code:
   while (max_tries > 0 && block.nNonce < std::numeric_limits<uint32_t>::max() &&
           !CheckProofOfWork(block.GetPoWHashPrecomputed(ctx), block.nBits, chainman.GetConsensus()) &&
           !chainman.m_interrupt) {
        ++block.nNonce;
        --max_tries;
    }

which computes a single ten_hash, just as verification does. The only thing potentially exponential is doing it max_tries times.

[1] https://github.com/nf-dj/robocoin/blob/main/src/rpc/mining.cpp#L146-L151
Post
Topic
Board Development & Technical Discussion
Merits 1 from 1 user
Re: Exploring Tensor-Based Proof-of-Work: Aligning Mining with AI/ML Workloads
by
tromp
on 23/02/2025, 07:55:09 UTC
⭐ Merited by vapourminer (1)
Validation corresponds to a single inference pass in neural network terms (which is fast).
Mining is much harder and involves millions or more inference passes (depending on difficulty).
In other words, this is not a new Proof-of-Work algorithm.

It's still the Hashcash PoW used in Bitcoin, which computes a deterministic value for any given block header using some hash function and compares it against a target value that's inversely proportional to the difficulty.

The difference is in the hash function used for Hashcash. Instead of Bitcoin's SHA256d, this proposal uses a tensor (i.e. matrix multiplication) based hash function.

It's not an asymmetric PoW like Cuckoo Cycle or Equihash, where verification differs from (and is WAY faster than) a solution attempt.
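
A schematic version of that symmetric structure, using a toy stand-in hash rather than SHA256d or the tensor hash, might look like this: one hash evaluation per solution attempt, and the very same single evaluation for verification.

Code:
// Schematic Hashcash with a toy stand-in hash (not SHA256d or ten_hash):
// a solution attempt and a verification each cost exactly one hash
// evaluation; mining just repeats attempts until the hash clears the target.
#include <cstdint>
#include <cstdio>

// toy mixing function standing in for the PoW hash; NOT cryptographic
uint64_t toy_hash(uint64_t header, uint32_t nonce) {
    uint64_t x = header ^ (nonce * 0x9e3779b97f4a7c15ULL);
    x ^= x >> 33; x *= 0xff51afd7ed558ccdULL; x ^= x >> 33;
    return x;
}

// verification: a single hash evaluation compared against the target
bool check_pow(uint64_t header, uint32_t nonce, uint64_t target) {
    return toy_hash(header, nonce) < target;
}

int main() {
    const uint64_t header = 0x1234abcdULL;
    const uint64_t target = UINT64_MAX >> 20;  // ~1-in-2^20 chance per attempt
    uint32_t nonce = 0;
    while (!check_pow(header, nonce, target))  // mining: many attempts
        ++nonce;
    printf("found nonce %u; verifying it takes one toy_hash call: %s\n",
           nonce, check_pow(header, nonce, target) ? "valid" : "invalid");
    return 0;
}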
Post
Topic
Board Altcoin Discussion
Re: Is GRIN still a thing?
by
tromp
on 18/01/2025, 10:48:54 UTC
GRouPcoin is merged-mined but I am not sure if anyone or manyones do that anymore.
What is the cumulative work achieved by GRP up till today? How long would it take all bitcoin miners to achieve that amount of cumulative work?
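
(For anyone who wants to work it out: by standard chainwork accounting, each block contributes expected work of about difficulty x 2^32 hashes, so summing the difficulty over all GRP blocks and multiplying by 2^32 gives its cumulative work, and dividing that by Bitcoin's current network hashrate gives the catch-up time.)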

Post
Topic
Board Altcoin Discussion
Re: Is GRIN still a thing?
by
tromp
on 18/01/2025, 09:15:19 UTC
So we've gone from
Quote
gosh knows how many other coins that emit the same number of coins per block forever
to there's one other that may still be alive, which for lack of explorers cannot be publicly verified.
Post
Topic
Board Altcoin Discussion
Re: Is GRIN still a thing?
by
tromp
on 17/01/2025, 21:53:00 UTC
I was thinking I was sure of GRP, but that was what we used to try out different approaches toward making DeVCoin, so actually looking at the code is probably the only way to tell for sure after all this time.
Code:
int64 static GetBlockValue(int nHeight, int64 nFees)
{
    int64 nSubsidy = 50 * COIN;

    // Subsidy is cut in half every 4 years
//    nSubsidy >>= (nHeight / 210000);

    return nSubsidy + nFees;
}
That is certainly the code of a fixed block reward blockchain.

Quote
EDIT2: Alive today: https://stellar.expert/explorer/public/asset/GRP-GBHAQ252S4Z4AQOM4BWIRC3UHAOJIKCZQBUJGD336YH2O7W2NKRXMHA5
However, I don't see what this token on a centralized PoS chain has to do with a supposed fork of bitcoin:

> Total supply: 1,000,000 GRP
> First transaction: 2018-01-14 05:03:37 UTC
> Trustlines: 18 total / 15 funded
> Total payments count: 6
> Overall payments volume: 217,922 GRP
> Total trades count: 135
> Overall traded volume: 283 USD

If GRP is still running, then you should be able to point to its blockchain explorer?!
Post
Topic
Board Altcoin Discussion
Re: Is GRIN still a thing?
by
tromp
on 17/01/2025, 18:48:03 UTC
all the gosh knows how many other coins that emit the same number of coins per block forever
I don't know of any besides Grin.
Off the top of my head I can think of CoiLedCoin (CLC)
According to https://bitcointalk.org/index.php?topic=56675.0

> 10 CLC reward per block initially, halving every 525000 blocks (nominally 2 years) until it reaches 1 CLC where it will remain forever.

So the top of your head is not very trustworthy:-(
Please tell me one coin that you know for certain (as in, verified) had a fixed reward from launch on forever.
Post
Topic
Board Altcoin Discussion
Re: Is GRIN still a thing?
by
tromp
on 17/01/2025, 07:43:23 UTC
all the gosh knows how many other coins that emit the same number of coins per block forever

I don't know of any besides Grin.
Post
Topic
Board Development & Technical Discussion
Re: Efficient Blockchain Data Management
by
tromp
on 10/01/2025, 07:46:23 UTC
When you say "from scratch", do you mean that you changed assumeValid to be 0 (instead of defaultAssumeValid) ?
With "from scratch" I meant no blocks have been stored, folders blocks and chainstate were empty. I didn't touch or use assumeValid in my bitcoin.conf.
With the default assumeValid, it's not fully verifying from scratch, since it only verifies historical scripts and signatures of the last few years. You're trusting the developers' claim that all earlier scripts are valid too.
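
For anyone who wants the fully verifying behaviour, that optimization can be switched off in bitcoin.conf (or with the equivalent -assumevalid=0 command-line option), at the cost of a much longer IBD:

Code:
# bitcoin.conf: verify all historical scripts and signatures during IBD
# (much slower; the default skips script checks below the assumed-valid block)
assumevalid=0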
Post
Topic
Board Development & Technical Discussion
Re: Efficient Blockchain Data Management
by
tromp
on 09/01/2025, 07:33:48 UTC
finished a full IBD from scratch up to block 796033 in about 95h

When you say "from scratch", do you mean that you changed assumeValid to be 0 (instead of defaultAssumeValid) ?