Showing 20 of 87 results by IShishkin
Post
Topic
Board Altcoin Discussion
Topic OP
Is ZK checkmate coming soon?
by
IShishkin
on 12/02/2025, 15:17:04 UTC
For many years, Zero-Knowledge Proof systems have been held up as a hope for solving the Blockchain Scalability Problem. Numerous ZK projects backed by top-tier VC funds promised to compress gigabytes of blockchain data into succinct snapshots and provide full-node security to lightweight clients.

One large piece of smoke and mirrors was the Random Oracle Model. It has served as the backbone for many ZKP schemes, in particular those that promised to compress blockchain history into tiny succinct snapshots, build recursive SNARKs, deliver verifiable computation, and so on.

Recent research exposes a fundamental flaw in the argument.

1) How to Prove False Statements: Practical Attacks on Fiat-Shamir
https://eprint.iacr.org/2025/118

2) How to prove false statements? (Part 1)
https://blog.cryptographyengineering.com/2025/02/04/how-to-prove-false-statements-part-1/

Are we observing the start of a foundational crisis in the ZK scaling paradigm? What are your thoughts?
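For context, the construction the linked paper attacks can be sketched in a few lines. The toy Schnorr-style proof below (an illustrative sketch, not any production scheme) shows exactly where the Random Oracle Model enters: the verifier's interactive random challenge is replaced by a hash of the transcript, and soundness then rests on modeling SHA-256 as a random oracle.

```python
import hashlib
import secrets

# Toy Schnorr-style proof of knowledge of x such that y = G^x mod P,
# made non-interactive via the Fiat-Shamir transform: the verifier's
# random challenge c is replaced by a hash of the transcript.
# Toy parameters, not production-grade.
P = 2**127 - 1   # a Mersenne prime
G = 3

def prove(x: int):
    y = pow(G, x, P)                  # public key
    r = secrets.randbelow(P - 1)      # prover's secret nonce
    t = pow(G, r, P)                  # commitment
    # Fiat-Shamir: challenge = hash(commitment, public key)
    c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big")
    s = (r + c * x) % (P - 1)         # response
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    # Recompute the challenge from the transcript and check G^s = t * y^c.
    c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big")
    return pow(G, s, P) == (t * pow(y, c, P)) % P
```

The eprint above shows that for some proof systems, instantiating the oracle with a concrete hash lets an adversary craft statements whose "proofs" verify even though the statements are false, which is the crux of the worry raised in this thread.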
Post
Topic
Board Development & Technical Discussion
Merits 11 from 5 users
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 13:00:47 UTC
⭐ Merited by NotATether (5), d5000 (2), vapourminer (2), stwenhao (1), ABCbits (1)
As you can see, the final outcome is exactly the same. Fees are identical, amounts are identical, but the batched transaction takes less space than the two non-batched transactions, because in the first case you have to store Bob's data on-chain, while in the second case Bob can just keep a proof locally, and nobody else needs that proof.

Then, instead of having a weight of 560 bytes + 561 bytes = 1121 bytes, you have a single weight of 685 bytes.

You don't have to remove all data right away. You can use a similar strategy to pruned nodes and, for example, keep the full data from the last 288 blocks. But in the long-term scenario, you can remove in-the-middle proofs, because they are not needed for Initial Blockchain Download.

SPV proofs are not needed to make sure that the system is honest. They are needed only to show who was inside. All signatures are valid in both versions, batched and non-batched. You need a valid ECDSA signature to move any coins anywhere, no matter what.
Very well. Good job.

However, it means that the first transaction is not recorded on the blockchain before Bob submits the second transaction. The first transaction must therefore stay unconfirmed in the mempool for some time, and miners must be restricted from including it in the blockchain. While this transaction is unconfirmed, Bob is vulnerable to double-spend attacks. In this setting, Bob has to trust Alice.

Bitcoin is already criticised for its long transaction confirmation time. In this scheme, the first tx gets confirmed long after Alice signed it. I have seen businesses that try to join multiple transactions into a single one; however, it is inconvenient for their clients.
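The space arithmetic in the quoted example works out like this (the byte figures are the post's own illustrative numbers, not measured Bitcoin transaction weights):

```python
# Byte counts from the quoted example: two chained transactions
# versus one cut-through ("batched") transaction.
separate = 560 + 561   # "Alice -> Bob" + "Bob -> Charlie", non-batched
batched = 685          # joined "Alice -> Charlie" transaction
saved = separate - batched
print(f"saved {saved} bytes ({saved / separate:.1%} of the original weight)")
```

So the batching in this example saves roughly 39% of block space, though, as noted above, the double-spend window for Bob remains regardless of the savings.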
Post
Topic
Board Development & Technical Discussion
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 11:24:24 UTC
Quote
The block size, for example, needs to be changed because Bitcoin cannot fulfill its objectives and remain scalable otherwise.
Why? You don't need bigger blocks to confirm more transactions. You need more non-interactive cut-through instead, where an "Alice -> Bob" transaction and a "Bob -> Charlie" transaction are combined by miners and confirmed as a joined "Alice -> Charlie" transaction.
Don't you think this combined transaction will occupy a comparable amount of block space? It might carry the same computational and network data cost. You still need to propagate and verify the transaction signed by Bob. It's a hidden cost.

Cut-through actually merges multiple TXs into one big TX, which leads to a smaller overall TX size. The computation and network usage is a bit higher, but negligible with today's computers and servers.

If network usage is negligible, then why do we have a block size problem?
Post
Topic
Board Development & Technical Discussion
Merits 2 from 2 users
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 11:13:45 UTC
⭐ Merited by stwenhao (1), vapourminer (1)
Quote
Don't you think this combined transaction will occupy comparable amount of block space?
No. Two separate transactions take more space than a single combined transaction when you cut some data in the middle and prove in the signatures that it was done correctly.

Also, amounts can be combined as well, so miners have an incentive to join transactions: they can collect the same fees while using less block space than before joining.

More than that: if you can join any two matching transactions, then you can join N transactions as well, and then the only limits are things like the max standard transaction size.

Quote
It might have the same amount of computational and network data cost.
There are more computations needed, but they are temporary. When joined transactions are deeply confirmed, then new nodes won't need that data during Initial Blockchain Download.

Quote
You still need to propagate and verify the transaction signed by Bob.
Only as long as this transaction is unconfirmed. Once it is batched into other transactions, Bob can just keep some SPV-like proof, and every other node can forget about it.

Quote
It's a hidden cost.
What do you think is better? Having a higher cost while mining recent blocks and then a lower cost during Initial Blockchain Download, or the opposite situation, where each and every full node has to verify non-batched transactions over and over again?

Very often, you have non-batched transactions within the scope of the same block, which means that the final outcome of the block is "consumed N inputs and made M outputs", but it is not expressed in the simplest way.

1) Are you talking about the UTXO model?
2) It would be great if you supported your claims with proper mathematical calculations.
3) Don't forget about network data propagation delays.
4) Remember that blockchain security relies on full nodes that do all verifications in full against all protocol rules. Full nodes never rely on any SPV-like proofs. There is no compromise here on what is better. There are no "options" here to think about.
5) Finally: could you estimate what percentage of transactions go in chains "Alice -> Bob -> Charlie" within short time intervals, compared to stand-alone "Alice -> Bob" transactions? If transactions that could be batched are very rare, how can you get any non-marginal efficiency improvement here?
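The "SPV-like proof" mentioned in the quoted reply is essentially a Merkle inclusion proof. A minimal sketch below shows the idea; it is simplified and does not reproduce Bitcoin's exact transaction hashing or serialization:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # odd count: duplicate the last hash
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves: list[bytes], index: int):
    """Sibling hashes needed to recompute the root from leaves[index]."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))  # (hash, sibling-is-left)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_proof(leaf: bytes, proof, root: bytes) -> bool:
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root
```

Bob keeps only his leaf and the logarithmic-size proof; anyone holding the committed root can check inclusion without the rest of the data, which is the storage trade-off the quoted post relies on.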
Post
Topic
Board Development & Technical Discussion
Merits 1 from 1 user
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 09:42:50 UTC
⭐ Merited by stwenhao (1)
Quote
The block size, for example, needs to be changed because Bitcoin cannot fulfill its objectives and remain scalable otherwise.
Why? You don't need bigger blocks to confirm more transactions. You need more non-interactive cut-through instead, where an "Alice -> Bob" transaction and a "Bob -> Charlie" transaction are combined by miners and confirmed as a joined "Alice -> Charlie" transaction.
Don't you think this combined transaction will occupy a comparable amount of block space? It might carry the same computational and network data cost. You still need to propagate and verify the transaction signed by Bob. It's a hidden cost.
Post
Topic
Board Development & Technical Discussion
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 06:07:13 UTC
I agree with this concept. We are talking about an altcoin; it might be merge-mined with Bitcoin. From your perspective, what fundamental characteristics of Bitcoin shouldn't be changed? What could be sacrificed?

Here is a short list:
1. 21M supply cap
2. Max block size
3. Average block rate
4. (your version)
Immutability, censorship resistance.

Anything that we have in Bitcoin shouldn't be changed without a good reason or a necessity. The block size, for example, needs to be changed because Bitcoin cannot fulfill its objectives and remain scalable otherwise. Stealing someone else's coins would never be necessary at all, and it goes absolutely against the core characteristics of Bitcoin.

I agree. Can we get a higher TPS without corrupting immutability and censorship resistance in their pure Bitcoin meaning?

Also, people might have different perspectives on what "immutability" and "censorship resistance" mean. Sad but true.
Post
Topic
Board Development & Technical Discussion
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 06:00:43 UTC
This goes against the philosophy of Bitcoin. Forfeiture of assets shouldn't be an option even if it has never been touched or stayed dormant for the past century. There is also a problem with an arbitrary threshold where there isn't a good way to determine how long the coin should stay dormant before being recirculated. This would pose far too much of an issue in terms of both implementation and motivation. Even if you start to recycle OP_Return outputs, the purpose of sending them by the original owner would be defeated.

Between this and implementing a tail emission of sorts, I'd support tail emission. There shouldn't be a scenario where an asset gets confiscated because it lays dormant.
I agree, this goes against the philosophy of Bitcoin but the philosophy of Bitcoin can't remain the same because over time, everything changes. The mentality of people, the working environments, relationships, the attitude towards something and so on. As time goes, everything evolves, even the constitution changes, so Bitcoin can't stay exactly the same.

Tail Emission will make Bitcoin inflationary. I think that the ideal scenario for Bitcoin will be that it will become a popular payment method, millions of transactions will be made on bitcoin blockchain and we will significantly increase the block size (it can be dynamic too). If there will be millions of transactions in one block, Bitcoin miners will be able to get a very good reward while the transaction fees will not be ridiculously high. Problem solved, everyone will be happy.

What is your perspective on the Block Propagation Delay problem?
Post
Topic
Board Development & Technical Discussion
Re: Nonsense about increasing the 21M supply cap
by
IShishkin
on 24/01/2025, 05:49:13 UTC
I agree, this goes against the philosophy of Bitcoin but the philosophy of Bitcoin can't remain the same because over time, everything changes. The mentality of people, the working environments, relationships, the attitude towards something and so on. As time goes, everything evolves, even the constitution changes, so Bitcoin can't stay exactly the same.
Then it becomes an altcoin. There are fundamental characteristics that shouldn't be changed in Bitcoin.

Tail Emission will make Bitcoin inflationary. I think that the ideal scenario for Bitcoin will be that it will become a popular payment method, millions of transactions will be made on bitcoin blockchain and we will significantly increase the block size (it can be dynamic too). If there will be millions of transactions in one block, Bitcoin miners will be able to get a very good reward while the transaction fees will not be ridiculously high. Problem solved, everyone will be happy.
Depends on how constrained the system is; if you were to allow millions of transactions in a block now, it wouldn't always yield a higher overall block reward. The entire reward system still depends on both supply and demand. Tail emission will in theory not cause inflation, if the amount is relatively small compared to the entire coin cap.

I agree with this concept. We are talking about an altcoin; it might be merge-mined with Bitcoin. From your perspective, what fundamental characteristics of Bitcoin shouldn't be changed? What could be sacrificed?

Here is a short list:
1. 21M supply cap
2. Max block size
3. Average block rate
4. (your version)
Post
Topic
Board Altcoin Discussion
Re: Zero-Knowledge Proof technology - Is it being adopted?
by
IShishkin
on 20/10/2024, 14:51:25 UTC
It's been years now since I started hearing about people building out zero-knowledge proof technology with crypto. Basic apps and scripts with this tech can be created really easily, and I was hoping to see zero-knowledge proofs being utilized to fight spam on the internet, remove the need for captchas, verify online voting in elections, and much more cool stuff. What is holding it back?

The idea of zero-knowledge proofs, for anyone not familiar with it, can be easily explained with an example like this. Someone on the internet whom you don't know and don't trust wants to verify that your first name is indeed "Frolo", as you tell him. You don't want to take a photo of your passport and give it to him; instead, you upload the passport photo to an open-source app coded so that it can look at a passport and extract anyone's first name. After that, it deletes the photo without storing any other information. It then gives the verified proof that your first name is "Frolo" to that guy on the internet, and he trusts it.

Zero-knowledge proof technology in the crypto space is used in privacy coins and in transactions with more difficult traceability.

Unfortunately, information about ZKP broadcast by influencers on social media is often bullshit. This information often takes the form of an "explanation of ZKP for dummies". Your "example" is a typical example of this misinformation. No ZKP can prove that one's name is Frolo.

The best example of a ZKP is common fraud. A fraudster (prover) convinces (proves to) a victim (verifier) that his statement is true by providing irrelevant data and exploiting the victim's confusion.

In the ZKP model, a "true statement" is not the one that is actually true but the one that is accepted as true by the victim of the deception.
Post
Topic
Board Altcoin Discussion
Re: Proof of Stake (PoS) Discussion
by
IShishkin
on 06/10/2024, 01:25:34 UTC
How does Proof of Stake (PoS) improve scalability compared to Proof of Work (PoW)?

In no way; it's a popular myth. It's also the wrong question. The correct question is: what is "scalability"?

What are the environmental advantages of PoS over PoW, particularly in terms of energy consumption?

The same environmental advantages as fiat has over Bitcoin. Fiat currency can be printed out of thin air with little to no energy consumption. But when you print fiat and buy goods with it, you indirectly stimulate environmental pollution in various ways.

How does PoS contribute to network security?

It's the wrong question.

Can a malicious attack be carried out under this consensus?

Yes. Any system could be attacked. Check the story of DAO Hack in Ethereum network in 2016 as an example.

What role do validators play in PoS, and how does staking align their incentives with the network’s stability?

They play the same role as a parliament in an autocracy. Validators benefit if they toe the line set by the ruling party's leader, and they can be slashed if they fall out of line.
Post
Topic
Board Altcoin Discussion
Merits 1 from 1 user
Re: PoW and PoS
by
IShishkin
on 24/04/2024, 05:41:52 UTC
⭐ Merited by Kruw (1)
PoW vs. PoS should be an important topic, when reviewing a coin.

Pros and cons: PoS vs. PoW

Quite contrary to PoS, PoW is very hard or almost impossible to compromise. Not so for PoS, where more attack vectors can be found.

For example, PoS can be abused by rich stakers. Many PoS projects have a so-called pre-mining of coins allocated to the team and the team can abuse such pre-mined coins to gain influence about project decisions.
A large pre-mine is making a project centralized in many ways and devs can gain much influence by staking a large amount of PoS coins.
PoW is not having such issues.

A similar problem is called "nothing at stake", where attackers benefit from no cost to stake ETH or similar PoS coins. Should a fork occur, no matter if the fork is a friendly, accidental or a malicious attempt to rewrite history and reverse transactions, it’s a given strategy for any staker to stake on every chain. By doing so, he will get a reward no matter which fork will succeed and he will have no additional cost to do so.
In PoW, such a problem is not happening because PoW always means work (spending electricity) needs to be done. A cost occurs for every miner and he can’t mine on multiple chains because mining of a different chain means an extra cost.

Do you have more reason against / pro PoW / PoS?

The PoW consensus algorithm is the backbone of a decentralised blockchain network. It's robust and unstoppable.

The PoS reward distribution mechanism is the backbone of an unregistered securities market. It's often used to conceal a central authority and to distribute profits to shareholders, as in a public company.
Post
Topic
Board Altcoin Discussion
Re: What do you think will be the next hype in crypto?
by
IShishkin
on 26/12/2023, 12:02:13 UTC
I wish the next hype would be around decentralisation.
Post
Topic
Board Altcoin Discussion
Re: ETH's Endgame
by
IShishkin
on 12/09/2023, 01:09:14 UTC
Ever since ETH adopted PoS, things have been going into the wrong direction. The network is as centralized as ever with a few big players controlling a large portion of ETH's supply. On top of that, the vast majority of nodes are running on top of Amazon Web Services (AWS). If developers don't do anything about this, ETH will become as bad as XRP in the future. My biggest concern is not only this, but also Ethereum creator Vitalik Buterin. The project is too tied around him. So if he dies (God forbid) or something bad happens to him, you expect ETH's market prices to go all the way down the drain in an instant. It's likely ETH will become worthless after this.

What do you think? Is the endgame for ETH approaching? Will it be possible for ETH to become decentralized again? Or is it already too late? Your input will be greatly appreciated. Thank you.

I think "the problem with AWS" is infinitesimally small compared to other centralization issues of ETH. If the network is truly decentralized, it should tolerate events when 50% of nodes go offline. Why do we even discuss this "problem"? Maybe we see a symptom of bigger problems with ETH.
Post
Topic
Board Altcoin Discussion
Re: describe your ideal altcoin
by
IShishkin
on 11/09/2023, 21:55:44 UTC
Well then, since you said we should stay away from mentioning project names or coins, that's good anyway, so that people wanting to promote their shitcoin won't flood the thread with off-topic replies. But then, how do you intend to control the conversation here if a comment is on topic but violates your set rules?

Since the thread is not self-moderated, I bet you're going to give moderators a lot of work reporting posts, especially the high-quality ones.

I believe people will respect those rules. Those who are tired of dull shitcoin advertising will appreciate that.
Post
Topic
Board Altcoin Discussion
Topic OP
describe your ideal altcoin
by
IShishkin
on 11/09/2023, 02:29:02 UTC
I invite you to describe your ideal altcoin. Please focus on the product and technology. Follow these rules:

1) Don't mention cryptocurrencies, project names, brands, etc.

2) Don't mention potential revenue, high returns, limited supply, deflationary economics, staking revenues, benefits to early adopters, airdrops or any other form of benefits associated with tokenomics.

3) Don't mention people behind the project and their qualities.

4) Don't mention wide adoption, exposure in media, good perception of the product, a faithful community, listings on exchanges, etc.

5) Wishful thinking is not prohibited, but try to be realistic.
Post
Topic
Board Development & Technical Discussion
Re: Increasing speed of transaction with the Data Sharding Architecture?
by
IShishkin
on 25/08/2023, 05:27:58 UTC
Please do read up more on how sharding could be applicable. Good resources to look into would be articles by Vitalik Buterin, danksharding, and similar material. These are complex topics and I wouldn't be able to give you as thorough an explanation. Feel free to cite and raise any issues afterwards.

Could you please define what "blockchain sharding" is, based on those good resources? Could you answer the topic starter's question:

If Data Sharding is already on the blockchain then how do we know that it is already implemented? Is there any example of such implementation to relieve the Blockchain Burden?

Post
Topic
Board Development & Technical Discussion
Re: Increasing speed of transaction with the Data Sharding Architecture?
by
IShishkin
on 25/08/2023, 04:59:21 UTC
If Data Sharding is already on the blockchain then how do we know that it is already implemented? Is there any example of such implementation to relieve the Blockchain Burden?

What you call "data sharding" has already been implemented in Bitcoin from day one: transactions within one block can often be processed in parallel by a node.

Yes but there's a limit to how effective the concurrency is because of Amdahl's Law.

It basically says that the overall speedup of a program is limited by the fraction of time spent in the part that cannot be parallelized.

So in this case the methods that verify the transactions inside the block can work in parallel, but there is a host of other things inside that process that can't be sped up:

- Time spent obtaining the block from a peer (even if multiple peers are queried at the same time, the code running the block retrieval runs at least once)
- Writing the block and chainstate to LevelDB
- For each UTXO in the block, rewinding the transaction history to check that it's indeed not spent
- All of the checks involved in CheckBlock and CheckBlockHeader functions must ultimately be performed in one way or another, so there will always be a minimum resource consumption associated with it.

Sharding is not going to make any of this go faster, assuming that theoretically the system had enough threads to verify a single transaction in a block per thread, if all this stuff is already running in parallel.

Yes. Notice that your argument highlights the fact that a blockchain network != a database.
All these points are relevant to blockchain networks. Databases don't necessarily have those steps and bottlenecks. Although a blockchain network has a database under the hood, that is not everything. Some people oversimplify a blockchain network to a database that could be optimised by sharding. This viewpoint is inaccurate.

If some step of the process gets optimised, it doesn't mean the whole process gets optimised, especially if the optimisation of one part comes at the cost of other components.
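The speedup limit invoked in the quoted reply can be made concrete with Amdahl's formula (the 90% figure below is an illustrative assumption, not a measurement of Bitcoin's validation pipeline):

```python
# Amdahl's Law: if a fraction p of the work is parallelizable across
# n workers, the best possible speedup is S(n) = 1 / ((1 - p) + p / n).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even if 90% of block validation parallelized perfectly (an assumed
# figure), unlimited threads cap the speedup at 1 / (1 - 0.9) = 10x.
for n in (2, 8, 64, 10**6):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

The serial steps listed above (block retrieval, LevelDB writes, header checks) form the `(1 - p)` term, which is why adding shards or threads eventually stops helping.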
Post
Topic
Board Development & Technical Discussion
Merits 11 from 3 users
Re: Increasing speed of transaction with the Data Sharding Architecture?
by
IShishkin
on 24/08/2023, 03:32:01 UTC
⭐ Merited by NotATether (5), ETFbitcoin (5), Who is John Galt? (1)
I was reading through a very interesting topic on Database Sharding that seems to be designed for handling huge datasets and distributing them to various channels so that single system doesn't have to overload itself.

What I understand from the concept is that you can either distribute the data to various nodes horizontally, increasing processing capacity with more devices and with virtually no limit on how far you go, or you can shard vertically, but then you have to make the machine more powerful.

Such papers often confuse people. Before diving into this topic, you should understand that a database is not a blockchain network. "Database sharding" has nothing to do with "blockchain sharding". Think about this and don't get confused by the plausible reasoning of prominent blockchain gurus.

Will it help in blockchain scaling?
I think the concept is pretty straight. If we are already using...

No, it doesn't work that way.

So if we do apply the horizontal sharding to the blockchain generated nodes then each node can further be divided into more data sets / nodes and thus processing could be accelerated based on more "machines" will be working on same problem.

No, it doesn't work that way. Let's put aside decentralized blockchain networks and consider the original "database sharding". Database sharding is not always an improvement. It's a trade-off that is helpful only in certain cases and circumstances. When you split data into "shards", split processes into threads, and distribute everything between separate hardware devices, you might get some benefit from parallelism. However, you often get overhead created by managing concurrent threads and the associated delays. Sometimes this overhead is so large that the gains from parallelisation can't outweigh it.

Therefore, database sharding is often a trade-off. It might be acceptable, and it might be unacceptable. Database sharding is not a magic wand that solves all problems of databases. Nor is it a magic wand in the context of decentralised blockchain networks. On top of that, as I said, the concept of "database sharding in a blockchain context" is not clear and straightforward.

If Data Sharding is already on the blockchain then how do we know that it is already implemented? Is there any example of such implementation to relieve the Blockchain Burden?

What you call "data sharding" has already been implemented in Bitcoin from day one: transactions within one block can often be processed in parallel by a node.
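The in-block parallelism mentioned above can be sketched as follows: transactions that don't spend each other's outputs within a block are independent and can be checked concurrently. Note that `check_tx` is a hypothetical placeholder rule, not Bitcoin's actual script and signature validation:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for full transaction validation; the real
# rule set (scripts, signatures, UTXO lookups) is far richer.
def check_tx(tx: dict) -> bool:
    return tx["in_sum"] >= tx["out_sum"]   # placeholder: no value created

def validate_block(txs: list[dict]) -> bool:
    # Independent transactions can be checked concurrently;
    # the block is valid only if every check passes.
    with ThreadPoolExecutor() as pool:
        return all(pool.map(check_tx, txs))
```

This illustrates why no protocol change is needed to get this form of parallelism: it is an implementation detail of the node, not a sharding scheme.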
Post
Topic
Board Altcoin Discussion
Re: ASIC or Anti-ASIC?
by
IShishkin
on 21/08/2023, 10:57:35 UTC
For POW coins, some people think that only anti-ASIC can be mined fairly, in order to prevent the concentration of hash rate and ensure decentralization. But other people hold the opposite view. They believe that it is ASIC that strengthens the security of coins. For CPU mining algorithms, large data centers or cloud computing vendors can easily launch 51% attacks.

Do you think anti-ASIC is necessary?

1) Full ASIC resistance is a myth.

2) The belief that ASIC resistance might improve decentralisation is another myth.

3) Alternative utility for mining equipment reduces the cost of a 51% attack, so it simplifies 51% attacks.

Is ASIC resistance necessary? It depends on what your goal is.
Post
Topic
Board Altcoin Discussion
Re: Exploring an Emerging Omnichain Interoperability Protocol - LayerZero
by
IShishkin
on 13/08/2023, 15:15:12 UTC
Let me try.

1) Is LayerZero compatible with decentralized blockchain networks such as Bitcoin? What are prerequisites for compatibility of LayerZero with existing or future blockchain networks?

2) > LayerZero is a communication protocol

Who is supposed to use this protocol? Could you list all groups of participants based on their roles? What data should they possess in order to play their role in the network?

3) > Its core function is to ensure "valid delivery." This means the receiving chain correctly matches and verifies transactions from the sender chain, eliminating the need for middlemen.

What does it mean to "verify the transaction"? Which verification steps can LayerZero support, and which steps can't it support?

4) > The protocol implements generic messaging that provides trustless valid delivery of arbitrary data.

What does "trustless valid delivery of arbitrary data" mean?