Showing 20 of 151 results by sha420hashcollision
Post
Topic
Board Development & Technical Discussion
Re: Why are all the new proposals for scripting on Bitcoin application specific?
by
sha420hashcollision
on 20/07/2024, 08:42:32 UTC

Quote
OP_CAT does not enable things like inspectoutputvalue

What about: <transactionHead> <amount> <transactionTail> OP_CAT OP_CAT


transactionHead and transactionTail are not existing opcodes, and even if they were, it is not obvious how concatenations at the end of a script could rationally imply that the input or output value of the transaction is implicitly stored.



Quote
Weak curves are utterly irrelevant
Why? Even a simple feature, like "data push" was abused. Getting elliptic curves right is not a trivial task. There are many corner cases. Also, imagine what could happen, if you could combine weak curve with SIGHASH_SINGLE bug: it can potentially allow you to move coins, without knowing the private key, for example: https://mempool.space/testnet/tx/3952b35bde53eb3f4871824f0b6b8c5ad25ca84ce83f04eb1c1d69b83ad6e448#vin=1

"Getting elliptic curves right is not a trivial task." First off, most people could and probably should stick to secp256k1 anyway. The point is that there would be a generic stack for building the cryptographic protocol; once you choose a well-known curve, defining it correctly is as easy as importing the correct library, which is basic programming. Furthermore, the generality would allow for MORE surface area, so as not to lock developers into this one curve, for various potential reasons. Restrictions could be placed on the size of the curve and so on, but enforcing that Script can only produce one specific format for covenant key-ownership logic is unproductive in my opinion, which is why I started this thread.
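To make the "importing the correct library" point concrete, here is a minimal sketch of what defining a well-known curve correctly amounts to: loading its published domain parameters and sanity-checking them. The constants are the standard SEC 2 parameters for secp256k1; the `on_curve` helper is my own illustration, not any proposed opcode.

```python
# Published SEC 2 domain parameters for secp256k1.
P = 2**256 - 2**32 - 977   # field prime
A, B = 0, 7                # curve equation: y^2 = x^3 + 7
GX = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
GY = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # group order

def on_curve(x, y):
    """Check that (x, y) satisfies y^2 = x^3 + A*x + B over GF(P)."""
    return (y * y - (x * x * x + A * x + B)) % P == 0

# The standard generator must lie on the curve; a corrupted parameter
# set fails this check immediately.
assert on_curve(GX, GY)
```

A wallet or script library would ship exactly these constants, and getting them wrong is caught by checks this simple.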


Quote
Bitcoin does not need more weakly defined hacks establishing themselves as legitimate protocols to unfortunately and inevitably leave end users holding the experimental bags.
Doing a soft-fork is a difficult task. The only reason why those hacks are created, is because if you want to deploy something in a standard way, then it can take years, and there is a huge chance for getting your proposal rejected. Which means, that those "hacks" are actually the simpler way of doing things, because it is very difficult to convince everyone, that some feature is needed. So, tricks like that are created first, and the proper implementation usually comes later, when people see, that some use case is inevitable.

AND YOU'RE ADVOCATING FOR OP_CAT, SOMETHING THAT REQUIRES A (DRUMROLL PLEASE) SOFT FORK!!! So why is it suddenly so easy when it's OP_CAT, sir, riddle me that? In reality, because of how utterly necessary such a change would be to make Bitcoin something other than a beta project for cryptographic money, getting it into the chain should not be super difficult, similarly to how other actual improvements went very smoothly on Bitcoin. Remember that OP_CAT was forked out of the chain in the first place because it was constantly misused due to its confusing implementation. If it were reconfigured, old scripts with OP_CAT would become consensus-invalid; there would not be many, but it's an extremely non-Bitcoin-like update that draws on nostalgia instead of practicality. If in fact OP_CAT is re-added with different functionality, it would technically be the first backwards-compatibility-breaking change to the network as far as I can tell. And if the functionality is not changed, it's just as problematic as it was before, and there's no sense in soft-forking it back in just because a couple of hacks were found for it.

Quote
We need well defined standards that legitimately increase the surface area for innovation.
Of course. But the true answer to the question "how to do it properly", usually comes later. That was the case with public keys: people thought for years, that new address types will be based on P2SH. But then, they noticed, that public keys are needed anyway, so Taproot is more similar to P2PK, than to P2SH. Which also means, that it is more likely, to get new OP_CHECKSIG-based features, than introduce a new opcode, like OP_CHECKLAMPORT.

My proposal is that the proper approach is generalization and modularity, two qualities held in extremely high regard in all areas of software development. If you are a software developer, you should recognize immediately why they are important. In practice, an opcode might free you up to provide your own generator point for the curve; it might allow you to perform scalar multiplication and modulo operations wherever they are needed in your curve-point formula, not just wherever they happened to work in the first design built for one specific kind of covenant. It could combine the features of off-chain and on-chain script commitments instead of locking you into one or the other. The possibilities go on. When you have CHECKLAMPORT you get one thing, Lamport signatures, and whoever doesn't want to use them can eff themselves as far as you are concerned. I see no reason Bitcoin should be that elitist about cryptographic protocols; I only see reasons to incentivize it to flourish into a research ground for novel protocols. Failing to scale the L1 due to cryptographic elitism seems very silly. As I said before, several alt-chains have exactly this modularity running in practice on their mainnets today.
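As a sketch of the kind of generic primitive this paragraph argues for (scalar multiplication with a caller-supplied curve and generator), here is textbook double-and-add over a short-Weierstrass curve in Python. The parameters happen to be secp256k1, but nothing in the two functions assumes that curve; any opcode naming here is hypothetical.

```python
# Generic affine point arithmetic over y^2 = x^3 + A*x + B (mod P).
# Parameters below are secp256k1, but the functions work for any curve
# configured through P, A and a generator point.
P = 2**256 - 2**32 - 977
A = 0
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def ec_add(p1, p2):
    """Add two points; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                       # Q + (-Q) = infinity
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P    # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P           # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication: k * pt."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

assert ec_mul(1, G) == G
assert ec_mul(N, G) is None   # the group order annihilates the generator
```

A hypothetical "EC_MUL"-style opcode would essentially expose `ec_mul` with stack-supplied curve parameters and generator, which is exactly the freedom being described.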

Quote
the alternative I'm offering is a basic standard that would support all the current proposals and more
Where is the BIP? Is it this one? https://groups.google.com/g/bitcoindev/c/Ts0FEkGvDkM
Because BIP for OP_CAT already exist: https://github.com/bitcoin/bips/blob/master/bip-0347.mediawiki

If you seriously think about your proposal, then you need to write a proper BIP. Or find someone, who will do that for you.

Also, OP_CAT is already active in some signet-like networks, and supported by some nodes. If you want to activate things differently, then you need a similar thing, to show people, that "hey, with OP_CAT, you have this, with my approach, you have that. Choose wisely".

More than that: there are more competing standards for scripting, for example this one: https://delvingbitcoin.org/t/btc-lisp-as-an-alternative-to-script/682
It contains secp256k1_muladd, but I cannot see ecdsa_muladd anywhere.

Yes, I get that I need to do that, but my thread was directed at trying to identify why it's me coming up with this idea. I might try, but I'm not professing to have the best implementation right now. For now I'm still convinced it's the best approach to scaling Bitcoin Script. I'm not fully against others like Lamport, CAT, and CTV, but I am suspicious of them being adopted before something I find more general and practical; mainly, I suspect they won't match their hype and will fail to scale Bitcoin to the extent it really needs to be scaled, even if they extend Bitcoin's throughput a little. Furthermore, I will be attempting a BIP for this, and anyone who wants to help can message me. It will probably need to be a community effort if it is to succeed at all.
Post
Topic
Board Development & Technical Discussion
Merits 1 from 1 user
Re: Why are all the new proposals for scripting on Bitcoin application specific?
by
sha420hashcollision
on 13/07/2024, 21:33:30 UTC
⭐ Merited by vjudeu (1)

Quote

I forgot to mention scripts like InspectInputValue and InspectOutputValue are probably required too


That's why people are supporting OP_CAT: to not think about all corner cases, and to not activate a new soft-fork, and find out, that "hey, we forgot that one little thing, and we are stuck with what we already have".


OP_CAT does not enable things like inspectoutputvalue; this is part of my point. Your philosophy is that "OP_CAT will solve all of this", but you fail to present a formal proof of that. Meanwhile, formal proofs for various forms of discrete-log curve-point cryptography are being built even with no input from Bitcoiners.


Quote

it should be up to the protocol developer to determine what curve (and protocol math) they are using


Why do you want to support every possible curve, instead of simply using DLEQ, and having secp256k1 on Bitcoin, and any other curve you want, in some other place, like LN? Do you also want to support weak curves like "p=79, n=67, base=(1,18)"? Because the consequence of having EC_MUL, is to also have weak and unsafe curves, picked by the users. And we don't want to go back to those times again: https://words.filippo.io/dispatches/parameters/


Weak curves are utterly irrelevant; people can ship malware using Bitcoin TODAY. You are simply concern trolling about the extent of potential failures in cryptographic algorithms. This kind of closed-minded thinking is exactly what I'm talking about: instead of imagining what kinds of scaling protocols could be built by leveraging well-researched cryptosystems, we theorize that someone might build a wallet using a weak curve... as if there were no other way to ship Bitcoin wallet malware already.


Quote

That may also be used for migrating to different curves without hardforks.

You can do even more than that in the current system. You can actually deploy Lamport Signatures, without any changes at all, not only without hard-forks, but also without soft-forks as well.

I think it's incredibly misleading to suppose that because someone found a hack, one with known vulnerabilities and not ready for production, to do Lamport signatures on current Bitcoin, that suddenly this implementation is ideal and all other cryptosystems are possible. You seem to be arguing for halting innovation altogether with this mentality. Bitcoin does not need more weakly defined hacks establishing themselves as legitimate protocols, only to inevitably leave end users holding the experimental bags. We need well-defined standards that legitimately increase the surface area for innovation.


Quote

Current scaling proposals AFAIK have not considered how to tackle costing for every script that can possibly be written for their templates

The main idea is that you should spend coins by key. Then, a single signature is all you need. All features like OP_CAT, the whole TapScript support, all of those MAST-based branching, is only for disagreements (and for hidden commitments, which will be never pushed on-chain). Which means, that you should use a single signature on a daily basis, and a TapScript only, when another party stops cooperating, so you can say: "You don't want to cooperate? Well, then I am going to use OP_CAT, pay more fees, and enforce this contract on-chain!".

Yeah, you are describing a scenario in which you would lose to an attacker capable of fee pinning; there's no innovation there, again. Sure, it's extremely easy to create a contract tied to an off-chain script; it's also incredibly easy to embed vulnerabilities in the entire process of funding and claiming from such scripts. This is asking for a lack of standardization to proliferate into vulnerabilities. The alternative I'm offering is a basic standard that would support all the current proposals and more. The lack of basic curve-point functionality rules out many extremely straightforward approaches to securing these contracts and preventing pinning attacks entirely.

Post
Topic
Board Development & Technical Discussion
Merits 1 from 1 user
Re: Why are all the new proposals for scripting on Bitcoin application specific?
by
sha420hashcollision
on 07/07/2024, 21:10:06 UTC
⭐ Merited by vjudeu (1)
Quote
Doesn't that require both coins using the same elliptic curve?
1. A lot of altcoins just copy-pasted secp256k1, so it is not a big deal.
2. Read about DLEQ proofs. It is possible to prove, that the same private key was used on two completely different curves, and then execute a contract in that way.

Quote
We could have that decentralized DNS with OP_CSFS, but I cannot think of any other ways.
There are a lot of other ways. The simplest example, which is used even today, is related to vanity addresses: you have a regular Bitcoin address, but you can mine N characters in the name, and have some unique identifier. Even better: imagine what would happen, if you would mine some Silent Payment address, to avoid address reuse.

And that way is not only limited to Bitcoin: people also mine *.onion domains, in exactly the same way.

Quote
Do you understand what the OP is talking about?
As far as I know, a "basic generic Elliptic Curve Point Contract functionality" is what you will have, if you introduce OP_CHECKSIGFROMSTACK. But yes, you can have "<pubkey> <pubkey> OP_PUBKEY_ADD" or "<number> <pubkey> OP_PUBKEY_MUL" instead (or even "<numberAdd> <numberMul> <pubkey> OP_PUBKEY_MUL_WITH_ADD"). But in general, a signature is "multiplication and addition". Which means, that if you can sign any given message, then you unlock "mul and add" functionality, just by packing those two 256-bit numbers as a single signature.

Edit: By the way, you already have some "basic generic Elliptic Curve Point Contract functionality". Try this Script: "<signature> OP_SWAP OP_CHECKSIG". Then, you should push the proper public key as a solution.

Even better: try this: "OP_TOALTSTACK <signatureAlice> <signatureBob> 2 OP_FROMALTSTACK OP_DUP 2 OP_CHECKMULTISIG". Then, you just create "OP_0 <pubkey>" as a spending Script, and you have to find a key, which will match both signatures at the same time.

Edit:
Quote
There are a lot of other ways.
I guess there are more ways than people can imagine. For example, vanity addresses can be reused. However, there are things, which cannot be reused: transaction IDs. Which means, that you can just mine your transaction ID, and then share it. If anyone will change anything, then the name will be gone, and the ID will also change. It will be confirmed once or never. It will be always unique.

Also, no additional fields are needed. If you disable locktime, you can just use it as a nonce, and mine your transaction quite quickly. Other options, related to tweaking public keys or signatures, are much slower.

I'm pretty sure you are mainly discussing off-chain and side-chain contracts, and the scripts you provided might already serve that use case. However, more complex and flexible ECC scripting conditions would enable native contracts, reducing reliance on a side chain or an off-chain client-side-validated contract.

L2s could use this to build more useful UX. For example, with Point Time Lock Contracts the Lightning Network would be able to build Lightning-native privacy protocols and multisig custody protocols, where previously hash-based contracts have limited the extensibility of Lightning altogether. That's a relatively simple example, but ultimately more flexibility would open the floodgates in terms of scaling solutions.
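A hedged sketch of the PTLC advantage mentioned above: unlike a hash lock, a point lock can be re-randomized at every hop, which is what enables payment decorrelation. To keep the example short, a multiplicative group modulo a prime stands in for the elliptic-curve group, and the sizes are toy values, not secure parameters.

```python
import secrets

# Toy stand-in group: integers mod a Mersenne prime (NOT secure sizes).
p = 2**127 - 1
g = 3

t = secrets.randbelow(p - 1)     # payment secret (the "preimage" scalar)
T = pow(g, t, p)                 # public point lock, analogous to T = t*G

# Each hop can blind the lock with its own tweak r, so adjacent hops
# see unlinkable locks. Hash locks H(s) admit no such tweaking.
r = secrets.randbelow(p - 1)
T_hop = (T * pow(g, r, p)) % p   # next hop's lock: g^(t + r)

# Settling the payment reveals t + r; the hop subtracts its own r
# to recover t and claim the incoming payment.
assert pow(g, (t + r) % (p - 1), p) == T_hop
```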

This is in contrast to each scaling protocol needing to settle on an extremely strict, or at least specific, standard, which puts more limitations on the flexibility of the end product. I forgot to mention that scripts like InspectInputValue and InspectOutputValue are probably required too (they exist on Liquid already). I think that's fine, because all of this is just very basic, straightforward scripting that doesn't require a programmer to study a long document to figure out how it works in detail. Yes, one has to study ECC to build a scaling protocol, but that is an already well-documented subject and won't need redefinition to be included in Bitcoin.

You could have PUBKEY_MUL, but you could also just have basic EC_MUL; I really don't see why not. It should be up to the protocol developer to determine which curve (and protocol math) they are using. That could also be used for migrating to different curves without hard forks: all you have to do is provide the correct generator point, which can easily be supplied by a library you use to build your scripts. Maybe we will never have to change the curve, but if we want to interact flexibly with cryptographic protocols that use other curves, this feels necessary. And in the case where the curve is irrelevant and you simply want to use the points for any kind of cross-protocol computation, the scripts need to be fully functional so as not to limit the expressiveness of the protocol.

That also means alt-chains like XMR that use a different curve might gain more compatibility in protocols like atomic swaps, which might help reduce interactivity. Atomic swaps with XMR are already possible, except that they are unidirectional until XMR hard-forks to allow pre-signed transactions. The really positive thing is that you could potentially use generic curve-point scripts to build privacy protocols on top of Bitcoin, maybe a bulletproof or Pedersen-commitment script or something like that.
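As a hedged sketch of the Pedersen-commitment idea just mentioned, here is the scheme's core homomorphic property, again using a multiplicative group mod a toy prime as a stand-in for the curve group. In a real deployment `h` must be chosen so that its discrete log with respect to `g` is unknown; the parameters here are for illustration only.

```python
import secrets

q = 2**89 - 1     # toy Mersenne prime modulus (NOT a secure size)
g, h = 3, 7       # illustration only; see caveat about h above

def commit(value, blinder):
    """Pedersen commitment: C = g^value * h^blinder (mod q)."""
    return (pow(g, value, q) * pow(h, blinder, q)) % q

m1, r1 = 25, secrets.randbelow(q - 1)
m2, r2 = 17, secrets.randbelow(q - 1)

# Homomorphism: the product of two commitments is a commitment to the
# sum of the committed values, the property that confidential-amount
# schemes rely on to verify balance without revealing amounts.
lhs = (commit(m1, r1) * commit(m2, r2)) % q
rhs = commit(m1 + m2, (r1 + r2) % (q - 1))
assert lhs == rhs
```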

I think another positive result would be that the cost of executing these scripts would be transparent: you shouldn't be able to exhaust a node's resources by creating some kind of hidden verification loop that might go undetected in a script arriving in the form of a large expected template. Current scaling proposals, AFAIK, have not considered how to price every script that can possibly be written for their templates, likely because that is a huge undertaking. Without granularly applying weight to each operation, it can't be done in a fair way, IMO.
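The transparent-costing idea can be sketched very simply: give every opcode an explicit static cost and reject any script whose total exceeds a budget before executing it. The opcode names and cost numbers below are invented for illustration and come from no actual proposal.

```python
# Hypothetical per-opcode costs (illustrative numbers only).
OP_COST = {
    "OP_DUP": 1,
    "OP_SHA256": 20,
    "OP_EC_ADD": 30,
    "OP_EC_MUL": 400,
    "OP_CHECKSIG": 500,
}

def script_cost(script, budget):
    """Sum static opcode costs; refuse scripts that exceed the budget."""
    total = sum(OP_COST[op] for op in script)
    if total > budget:
        raise ValueError(f"script cost {total} exceeds budget {budget}")
    return total

# A small signature-check fragment stays well under budget.
assert script_cost(["OP_DUP", "OP_SHA256", "OP_CHECKSIG"], 1000) == 521
```

Because the cost is a static sum over opcodes, a node can reject a resource-exhausting script without running it, which addresses the hidden-verification-loop concern.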

Post
Topic
Board Development & Technical Discussion
Merits 2 from 2 users
Topic OP
Why are all the new proposals for scripting on Bitcoin application specific?
by
sha420hashcollision
on 03/07/2024, 19:20:55 UTC
⭐ Merited by vjudeu (1) ,ABCbits (1)
Pretty much every proposal I've seen, e.g. CTV, Vault, etc., creates a standardized structure for the contract, which creates a literal cultural divide among programmers who prefer different scaling implementations.
Other than that you have CAT, which as far as I'm concerned is pitifully un-standardized and bound to create unimaginable side effects.

Currently there is no basic generic elliptic-curve-point contract functionality. That is a BASIC and NEEDED upgrade that should not be controversial whatsoever. It could easily enable all of the functionality that the above-mentioned proposals seek to implement, without restricting anyone to the standards created by an individual proposal, other than a basic standard for generic curve-point math in Bitcoin Script.

...?
Post
Topic
Board Development & Technical Discussion
Merits 5 from 1 user
Re: ZK-proof on Bitcoin
by
sha420hashcollision
on 09/10/2023, 17:06:44 UTC
⭐ Merited by OmegaStarScream (5)
Hello! I want to say I'm very impressed with your work, and I would like to lift the suspicion that I cast on this project a while back.
I look forward to the immense potential of these tools. Also the BitVM paper is very cool!
Post
Topic
Board Development & Technical Discussion
Re: EXTREMELY Rough Concept: Expandable UTXO space
by
sha420hashcollision
on 14/05/2023, 22:10:01 UTC
It's possible what follows contains logical mistakes.

I was toying with a similar idea where each output would be its own Utreexo. Since that's a forest of trees, an output would need to keep the roots and each root would have the amount sum of the elements in the tree. This way, we'd know the amount the Utreexo UTXO holds and can do the inflation check.
Much like Utreexo, a transaction comes with is a list of inclusion proofs [proof1, proof2,...] which gives us the inputs. A transaction also defines the outputs that are created. We check the signature and that the transaction is well balanced and then delete the inputs from the Utreexo tree and add outputs as new elements to the Utreexo.
I'm not sure I remember correctly, but I believe anyone can delete an element if they have the forest roots and the inclusion proof and anyone can add an element if they have the element and the roots. Since we have both as part of a transaction validation, anyone can update the Utreexo accumulator.

This obviously isn't compatible with Bitcoin today, but may be an interesting direction to think in. Those interested in a specific Utreexo may have the tree saved locally and could share it with others in the tree if someone lost their inclusion proofs.
It may even be permissionless to put your UTXO in any Utreexo. Simply spend a regular UTXO and add it as an element to Utreexo which should be possible because we have the forest roots for all of them.

I admit I have not studied Utreexo as much as I would like; I only heard about it a month or so ago. My idea was slightly inspired by the basic idea of Utreexo, which is turning outputs into commitments to outputs for scaling purposes. But as far as I can tell, the Utreexo concept is more of a lite-client scaling technique and, in my opinion, probably needs to be reinforced with EC cryptography in order to become something that can be presented to a regular full node.

I may be incorrect, but I think the issue is that the Merkle tree being committed to is not some kind of tagged branch-commitment scheme like Taproot, but more of a direct hash tree. My theory is that something with tagged commitments could be reproduced without needing to store inclusion proofs, similar to how Taproot doesn't require you to hold all the involved public keys while bare multisig does.
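For readers following the Utreexo discussion, the inclusion proofs in question are ordinary Merkle paths: a leaf plus its sibling hashes re-hash up to a known root. A minimal sketch, assuming a fixed power-of-two leaf count and unsalted SHA-256, purely for illustration:

```python
import hashlib

def H(data):
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a perfect binary hash tree over the leaves."""
    level = [H(x) for x in leaves]
    while len(level) > 1:
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify(leaf, index, path, root):
    """Re-hash the leaf up its sibling path and compare to the root."""
    node = H(leaf)
    for sibling in path:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node == root

leaves = [b"utxo0", b"utxo1", b"utxo2", b"utxo3"]
root = merkle_root(leaves)
# Inclusion proof for leaf 2: its sibling, then the opposite subtree hash.
path = [H(b"utxo3"), H(H(b"utxo0") + H(b"utxo1"))]
assert verify(b"utxo2", 2, path, root)
```

A validator holding only the roots can check any such proof, which is why Utreexo nodes can drop the full UTXO set.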
Post
Topic
Board Development & Technical Discussion
Re: EXTREMELY Rough Concept: Expandable UTXO space
by
sha420hashcollision
on 13/05/2023, 18:42:32 UTC
So an idea like CoinPool?

https://coinpool.dev/v0.1.pdf
I never heard of it before, according to them the goal is:
Quote
CoinPool allows many users to share a UTXO and make
instant off-chain transfers inside the UTXO while allowing
withdrawals at any time without permission from other users.

My idea is basically the first part of this, where many users share a UTXO. The off-chain transfer idea is good but was not part of my plan. I am curious to see whether this is DOA or still being developed.

So aggregated UTXOs that do not require lots of interactivity.

Post
Topic
Board Politics & Society
Topic OP
Take care of yourselves!
by
sha420hashcollision
on 10/05/2023, 00:09:53 UTC
Post
Topic
Board Mining
Re: Is anyone else glad Bitcoin is calming down?
by
sha420hashcollision
on 08/05/2023, 19:34:44 UTC
As for me, last 4 days ago I also made a Bitcoin transaction like you, the charge for what I did was also 5$, I also thought of reducing the fees, but I immediately thought that the transaction I would make might not go well so I let it go the fees are only 5$ instead of bothering me because that's exactly what happens at those times because of the nft ordinals in blockhain.

      I also found out that most of the people who tried to reduce the fees did not succeed, but instead waited for a long time before they were able to make a transaction again with high fees. That means there is no use even if the fees are reduced or adjusted.

It also took me 4 days to be able to make another transaction with higher fees. I don't use a wallet, which supports RBF service. So I had to wait till the transaction was dropped out from the mempool. Then I was in need of some money, so I had to use the high priority at that moment. Cost me more than $6.
It is getting crazy day by day. Does anyone know when this will cool down? Today's high priority transactions will cost more than 618 sat/vB.



Well it is driving price of btc downwards.  we are under 28k

I have to think they want to see how far they can drive price downwards.  So this will last at least a week or 2.

As a small fish I will be apeing into any paper-handed selling during the attack; however, I want to again implore people to disrespect the inscription spam with Ordisrespector: https://minibolt.info/guide/bonus/bitcoin/ordisrespector.html#build-it-from-the-source-code
Post
Topic
Board Mining
Re: Is anyone else glad Bitcoin is calming down?
by
sha420hashcollision
on 08/05/2023, 19:33:01 UTC
It's so crowded today as well. It's been like this for maybe 3-4 days now. The transaction fee is very high at this moment. I have made a transaction with 5 sat/B, and it's been over 2 days. It's still stuck there. I don't want to spend more fees on them, so I'll just let that be there until things cools down. I don't know when that will happen, but these spike movements are making it hard.
Well nothing to do rather than waiting for it to cool down.

Make sure to run ordisrespector if you are running a node: https://minibolt.info/guide/bonus/bitcoin/ordisrespector.html#build-it-from-the-source-code
Post
Topic
Board Altcoin Discussion
Re: Altcoins dead?
by
sha420hashcollision
on 05/05/2023, 03:21:49 UTC
i think all of that coin is good, but slowly replace by new coin, and its normal any new project will take over old project who have no development, every project in industry should make a new inovation not only rely on old progress.

I think that you have little knowledge of English as well as what is required for innovation to occur on Bitcoin.
Post
Topic
Board Development & Technical Discussion
Re: What's the advantage of BRC20 compared to other Bitcoin token mechanisms?
by
sha420hashcollision
on 05/05/2023, 03:06:29 UTC
Ordinals are dead and were flawed from the start: https://twitter.com/super_testnet/status/1654212346171064328 tell your friends.
Post
Topic
Board Development & Technical Discussion
Re: What's the advantage of BRC20 compared to other Bitcoin token mechanisms?
by
sha420hashcollision
on 04/05/2023, 04:03:24 UTC
I'm a bit surprised by the success of the BRC-20 token protocol which is clogging the blockchain right now. BRC-20 seems to be a way to create and transfer fungible tokens storing a small JSON file in an Ordinal inscription.

That's how a BRC-20 inscription looks like:

Code:
{
  "p": "brc-20",
  "op": "mint",
  "tick": "soon",
  "amt": "1000"
}

Most readers of this subforum will know that there are already dozens of protocols allowing tokens on Bitcoin, the oldest probably being EPOBC (2012). Most are now based on data storage via OP_RETURN. The most popular, until BRC-20 emerged, was Omni (2013), which seems to be continuously updated and improved. And there are advanced protocols with very efficient mechanisms like RGB and Taro (which can be used for much more than tokens).

What I'm interested in is to know whether BRC-20 has any advantage with respect to the OP_RETURN-based systems. Yes, a JSON file seems quite "elegant" or easy to create. But it's also terribly inefficient to store a dictionary of the needed data (i.e. token type, name, quantity of the transaction etc.) as JSON text instead of storing only the values in an OP_RETURN string, for example using Protocol Buffers. It may not be a big advantage (a dozen or two bytes, perhaps), but technically you'll still pay higher fees and occupy more space with BRC-20 than even with the simplest OP_RETURN-based approaches.

Someone knows if there's any advantage to use this method? My own interpretation is that it's simply popular "because Ordinals is popular". But is there more?



PS: I've made a small correction and it may be already answering my question: It's possible that while you occupy more space on the blockchain with BRC20, the witness discount is enough to counter any real space efficiency advantage and thus leading to equal or even less fees.

The problem with these things is that they do not need to exist: you need an alternate network to verify them. By that logic, said alternate network could simply inscribe its logic into a commitment to a hash of a Bitcoin transaction and achieve cheaper, less spammy, and more practical utility from the base layer.
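The byte-overhead point raised in the quoted post is easy to make concrete: the same mint data as BRC-20-style JSON versus a packed binary record of the sort an OP_RETURN-based protocol might use. The binary field layout here is invented purely for comparison, not taken from any real protocol.

```python
import json
import struct

# The BRC-20-style JSON payload from the quoted example.
as_json = json.dumps(
    {"p": "brc-20", "op": "mint", "tick": "soon", "amt": "1000"}
).encode()

# Hypothetical packed layout: 1-byte opcode, 4-byte ticker, 8-byte amount.
as_binary = struct.pack(">B4sQ", 1, b"soon", 1000)

assert len(as_binary) == 13
assert len(as_json) > 4 * len(as_binary)   # JSON is several times larger
```

As the quoted PS observes, though, the witness discount can still erase this raw-size disadvantage in fee terms.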
Post
Topic
Board Mining
Re: Is anyone else glad Bitcoin is calming down?
by
sha420hashcollision
on 04/05/2023, 03:55:47 UTC


Lmao yes he did and it is funny you would still defend him after all of the despicable and idiotic things including this that he does and promotes.

he might have paid $623,000 for some nft in 2021 but guaranteed it's worth more than $10 even RIGHT NOW. i would pay $10 for that thing all day long and so would you!

people that bought one of his 99 originals they're probably under water but then again, i'll buy theirs for $10 too if they want.  Shocked

can you help broker that deal or not?

You would have to pay me at least 1 whole bitcoin to do anything related to Logan Paul, lol; the offer is on the table.
Post
Topic
Board Altcoin Discussion
Re: Altcoins dead?
by
sha420hashcollision
on 04/05/2023, 00:31:35 UTC
No, Altcoins are not dead. Today's profit with SUI is an extra confirmation of this. Yes, undoubtedly, after the fall of LUNA, I had to think about possible investments in Altcoins. But over the past 6 months, there has been a belief in the rebirth of Altcoins power and future bull run.

To suggest you profited from this could only mean you are an insider who took money from others.
Post
Topic
Board Altcoin Discussion
Re: Altcoins dead?
by
sha420hashcollision
on 04/05/2023, 00:29:10 UTC

Your favorite altcoins are capable of losing 75% of their value within the next market cycle

Yes, Investing in cryptocurrencies carries a higher level of risk than traditional investments and other cryptocurrencies, is subject to market fluctuations and subject to significant price drops. While it is true that some altcoins may experience a significant drop in value during a market cycle as you mentioned “Your favorite altcoin may lose 75% of its value in the next market cycle”, that does not mean that they are “dead”. Altcoins can have use cases and propositions. their own unique value, and some may still be able to provide a good return on investment in the long term.

I think you are missing the goalpost here: the point of cryptocurrency is not to juggle a bunch of centralized, useless, hype-based altcoins; it's to build value on the only well-established decentralized network, Bitcoin.
Post
Topic
Board Mining
Re: Is anyone else glad Bitcoin is calming down?
by
sha420hashcollision
on 04/05/2023, 00:26:36 UTC
.

For the record, yes, Logan Paul is generally irrelevant even when he is mentioned in something, and he certainly is not a blockchain wizard or educated in anything useful. However, it goes to show that even a guy with a billion blind, young, rich followers cannot sell this absolute crap to anyone. It will die slowly or quickly, depending on how much money the buyers have and how many drug addictions the sellers have.

there's no way that logan paul bought an nft for $623,000 in 2021 and then today it's only worth $10, as that site claims.  i'm sure people would buy it for 3 figures just based on how much it sold for in the past. but if you're just going to sit there and complain about how it's not worth anything but not actually put it up for sale to the highest bidder then you can't say it's only worth $10. i'll buy his dumb nft for more than $10. so i'ts worth more than $10  

Quote from: bettercrypto
I also noticed that the other day, the transaction fee is quite high now if I compare it to last week. I even read that the transaction reached 38$ something out of his 180$ withdrawal.

time to sign up for western union if that's the case...i'm sure their fees for sending $180 are much reasonable than that. need a website link?  Wink

Lmao yes he did and it is funny you would still defend him after all of the despicable and idiotic things including this that he does and promotes.
Post
Topic
Board Mining
Merits 1 from 1 user
Re: Is anyone else glad Bitcoin is calming down?
by
sha420hashcollision
on 03/05/2023, 21:11:13 UTC
⭐ Merited by NotFuzzyWarm (1)
Recently, if you've been on Mempool, Bitcoin has had many transactions going through the blocks, causing high fees, I was paying about $3-5 per transaction a week ago and now I'm down to $0.2.-0.5 per transaction. Is anyone else effected by this? Miners are great and have made the 108 block backup into a 7 block backup in a week, keep going miners!

I also noticed that the other day, the transaction fee is quite high now if I compare it to last week. I even read that the transaction reached 38$ something out of his 180$ withdrawal.

    I don't really understand why the fee is so high now in bitcoin when you make a transaction that once we check in the mempool,
the fee is so expensive. I hope it goes well somehow.

There was a fee spike caused deliberately by Udi, Taproot Wizards, and their sheep followers, who send them real bitcoin in exchange for some fake ponzi token, paid for with a REALLY HIGH SAT-PER-BYTE fee, for one of two reasons:

1. They are using some custodial wallet participating in the coordinated spam attack on the Bitcoin base layer.

2. They are people like Udi who enjoy wasting money on fees to brag about making others pay more for casual payments.

They have also been enlisting bots of the Shiba and Dogecoin pump-and-dump type that spam Twitter every day, causing tons of newbies to get scammed as well.

The only way to avoid this is to be VOCALLY AGAINST SCAMMERS LIKE UDI AND THE TAPROOT WIZARDS.
Post
Topic
Board Development & Technical Discussion
Re: EXTREMELY Rough Concept: Expandable UTXO space
by
sha420hashcollision
on 03/05/2023, 07:19:15 UTC
Note: merge all of your consecutive replies into one post.

By creating a UTXO that aggregates other UTXOs, you would be able to greatly reduce the size of the UTXO set on the blockchain. This could potentially lead to faster and more efficient transactions.
However, this approach may lead to increased complexity in the verification process. In order to verify that a UTXO is valid and can be spent, the blockchain would need to verify not only the inclusion proof of the aggregate UTXO, but also the inclusion proofs of all of the individual UTXOs that make up the aggregate UTXO.
Can you make up a solution to this problem?
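The aggregate-UTXO idea above is usually imagined as a merkle commitment: the chain stores only a root hash, and a spender supplies the individual UTXO plus an inclusion proof, so verification cost grows logarithmically rather than linearly with the aggregate's size. A minimal sketch of that scheme (hypothetical, not an existing Bitcoin consensus rule; function names invented for illustration):

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    """Double SHA-256, as used for Bitcoin txids and merkle nodes."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(leaves):
    """Commit to a list of UTXO serializations with a single merkle root."""
    level = [sha256d(l) for l in leaves]
    if not level:
        return sha256d(b"")
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate last node on odd-width levels
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes proving that leaf `index` is under the root."""
    level = [sha256d(l) for l in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))  # (hash, sibling-is-left?)
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_proof(leaf, proof, root):
    """Recompute the root from one leaf and its proof; O(log n) hashes."""
    h = sha256d(leaf)
    for sibling, is_left in proof:
        h = sha256d(sibling + h) if is_left else sha256d(h + sibling)
    return h == root
```

The key point for the verification-complexity objection is that a spender proves membership of only the UTXO being spent; the validator never touches the other leaves of the aggregate.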

All a UTXO is, is a pair of a hex string (in this case, a 32-byte transaction hash) and an integer which denotes the output number (which can usually be represented in just 1 byte with an unsigned uint8_t, though even if that overflows, a 2-byte uint16_t will definitely be enough). Either way, it's a numbers problem of other areas.
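For reference, Bitcoin's actual wire format is slightly heavier than the quote suggests: the output index is a full 4-byte uint32, so every outpoint is a fixed 36 bytes before any per-UTXO metadata (value, script, height) is counted. A quick sketch of that serialization (helper name invented for illustration):

```python
import struct

def serialize_outpoint(txid_hex: str, vout: int) -> bytes:
    """Serialize an outpoint as Bitcoin's wire format does:
    32-byte txid (byte-reversed from display order) + 4-byte LE uint32 vout.
    Note: consensus uses a full uint32 for vout, not uint8/uint16."""
    txid = bytes.fromhex(txid_hex)[::-1]  # display order -> wire order
    return txid + struct.pack("<I", vout)

# Every outpoint costs 36 bytes before value/script/height metadata.
op = serialize_outpoint("ab" * 32, 1)
assert len(op) == 36
```

So at, say, a billion entries, outpoint keys alone are already tens of gigabytes, which is the scale the "several GB of UTXO data" concern below is pointing at.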

Suppose you have billions of UTXOs - this will equate to several GB of UTXO data. You can't simply just compress the UTXO set as that will only delay the inevitable.

So perhaps, a shredding algorithm could be implemented, where the oldest set of UTXOs past a particular threshold are ruthlessly shredded from the UTXO set (even if it represents thousands of bitcoins), and spending such a UTXO would require any node to scan for it from the beginning of the blockchain - in other words, we don't cache extremely old UTXOs.
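The "shredding" idea above amounts to treating the UTXO set as a bounded cache with oldest-first eviction: spending an evicted outpoint falls back to a full chain rescan, so it is a node cache policy rather than a consensus change. A hypothetical sketch (class and method names invented for illustration):

```python
from collections import OrderedDict

class ShreddingUtxoSet:
    """Bounded UTXO cache: once the set exceeds max_entries, the oldest
    entries are "shredded". A spend of a shredded outpoint raises, which
    in a real node would trigger a rescan from the start of the chain."""

    def __init__(self, max_entries: int):
        self.max_entries = max_entries
        self.utxos = OrderedDict()  # insertion order ~ confirmation order

    def add(self, outpoint, value):
        self.utxos[outpoint] = value
        while len(self.utxos) > self.max_entries:
            self.utxos.popitem(last=False)  # evict the oldest entry

    def spend(self, outpoint):
        if outpoint in self.utxos:
            return self.utxos.pop(outpoint)
        raise LookupError("outpoint shredded: full chain rescan required")
```

The trade-off is visible in the `spend` path: bounded memory is bought at the cost of an unbounded worst-case lookup for very old coins.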

And in case blockchain culling is also implemented - where instead of the Genesis block and the first X thousand blocks, you have a "coinbase transaction" of output UTXOs from those X blocks - there still isn't any fear of UTXOs being wiped out of the blockchain because they would still be there.

I think what you are suggesting is commitment-based pruning of very old UTXOs, which might be efficient for clients that are not full nodes, but which I think would be controversial to push into a full node. My idea may be controversial too, but not because it would require old UTXO owners to maintain their chainstate; rather, because it implies a new convolution in the way an output can be spent. To be clear, though, my idea does not involve altering the way old chainstate is processed; it would only affect blocks that adopt it.

Edit: Although, on second thought, I think there could be a way this model could be used to scale old chainstate at the discretion of the node operator.

Pushing it to Bitcoin Core? Oh yeah, most features proposed for it are definitely controversial. That's why a new full node client with these features could be created instead, bypassing the Core dev status quo. And, if the full node is any good, new people who want to get into the network will run this new pruning full node, while the old guard continues to run Bitcoin Core to avoid permanently losing old blocks.

Even a 10:1 ratio of these new nodes against Bitcoin Core nodes would be good for the network as it would have the adoption equivalent to L2 layers, and even these L2 features can be incorporated directly inside the node.

Generally speaking, though, a one-way compression of the UTXO set would likely make a node a light client rather than a full node, regardless of whether the change shipped in Core. My feature would ideally remain full-node compatible.
Post
Topic
Board Development & Technical Discussion
Re: EXTREMELY Rough Concept: Expandable UTXO space
by
sha420hashcollision
on 03/05/2023, 07:15:19 UTC
Note: merge all of your consecutive replies into one post.

By creating a UTXO that aggregates other UTXOs, you would be able to greatly reduce the size of the UTXO set on the blockchain. This could potentially lead to faster and more efficient transactions.
However, this approach may lead to increased complexity in the verification process. In order to verify that a UTXO is valid and can be spent, the blockchain would need to verify not only the inclusion proof of the aggregate UTXO, but also the inclusion proofs of all of the individual UTXOs that make up the aggregate UTXO.
Can you make up a solution to this problem?

All a UTXO is, is a pair of a hex string (in this case, a 32-byte transaction hash) and an integer which denotes the output number (which can usually be represented in just 1 byte with an unsigned uint8_t, though even if that overflows, a 2-byte uint16_t will definitely be enough). Either way, it's a numbers problem of other areas.

Suppose you have billions of UTXOs - this will equate to several GB of UTXO data. You can't simply just compress the UTXO set as that will only delay the inevitable.

So perhaps, a shredding algorithm could be implemented, where the oldest set of UTXOs past a particular threshold are ruthlessly shredded from the UTXO set (even if it represents thousands of bitcoins), and spending such a UTXO would require any node to scan for it from the beginning of the blockchain - in other words, we don't cache extremely old UTXOs.

And in case blockchain culling is also implemented - where instead of the Genesis block and the first X thousand blocks, you have a "coinbase transaction" of output UTXOs from those X blocks - there still isn't any fear of UTXOs being wiped out of the blockchain because they would still be there.

I think what you are suggesting is commitment-based pruning of very old UTXOs, which might be efficient for clients that are not full nodes, but which I think would be controversial to push into a full node. My idea may be controversial too, but not because it would require old UTXO owners to maintain their chainstate; rather, because it implies a new convolution in the way an output can be spent. To be clear, though, my idea does not involve altering the way old chainstate is processed; it would only affect blocks that adopt it.

Edit: Although, on second thought, I think there could be a way this model could be used to scale old chainstate at the discretion of the node operator.

Pushing it to Bitcoin Core? Oh yeah, most features proposed for it are definitely controversial. That's why a new full node client with these features could be created instead, bypassing the Core dev status quo. And, if the full node is any good, new people who want to get into the network will run this new pruning full node, while the old guard continues to run Bitcoin Core to avoid permanently losing old blocks.

Even a 10:1 ratio of these new nodes against Bitcoin Core nodes would be good for the network as it would have the adoption equivalent to L2 layers, and even these L2 features can be incorporated directly inside the node.

I would argue that if it is sound and a genuine improvement, it would be merged into Core eventually.