Board: Altcoin Discussion
Re: Decrits: The 99%+ attack-proof coin
by Etlase2 on 06/05/2013, 02:08:00 UTC
Quote
You are building quite a complicated scheme to avoid a hardware-upgrade race, and I'm not convinced it really works the way you intend (ASICs compete with ASICs, GPUs with GPUs). Being an engineer, I tend to believe in simple solutions. Complicated solutions usually do not behave well in unforeseen situations. Let them build ASICs. Let them adopt whatever new technology. Just make sure that the energy use stays at a reasonable level.

It really isn't that complicated when you're talking about devising an efficient system to create currency. The intent isn't that ASICs compete with ASICs and GPUs with GPUs; the intent is to make ASICs totally, absolutely unprofitable compared to sunk hardware costs. Why? Because, as your thread eloquently points out, energy consumption does not depend on efficiency. Why encourage the development of ASICs, which will waste hundreds of thousands, if not millions, of dollars in resources to build machines with a shelf life of maybe 5 years before they enter the trash as completely useless pieces of silicon? It isn't just electricity that is consumed in the process of protecting bitcoin or creating money, it is resources as well, and the only fair way to account for both is by how much money is spent.

Let me propose a real-world scenario. I don't have an exact idea of how the numbers will work yet or how much ASICs will cost to produce, so bear with me.

Let's say the total number of decrits is 10 million, and the network has seen a big increase in usage that requires 10 million more decrits to bring the price back to its oscillating point (its current trading value is double its cost to create). Let's say the MB award is 50k decrits. That means 550k decrits are produced in the first block, 650k in the second, then 750k, and so on up to 1450k in the tenth (based on the increasing multipliers described here). All in all, minters create 500k decrits while 9.5 million are given away.
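
As a sanity check on those figures, here is a minimal sketch of that emission schedule. The multiplier progression (a network share starting at 10x the MB award and growing by a further 2x of the award each block) is my own reconstruction from the 550k/650k/.../1450k numbers above, not the actual Decrits algorithm.

Code:
# Reconstruction of the emission schedule described above (the multiplier
# progression is inferred from the figures in this post, not from the spec).
MB_AWARD = 50_000          # decrits minted by the minters each block
BLOCKS = 10                # blocks needed to issue the extra 10 million

minted = given_away = 0
for i in range(BLOCKS):
    network_share = MB_AWARD * (10 + 2 * i)   # 500k, 600k, ..., 1400k
    block_total = MB_AWARD + network_share    # 550k, 650k, ..., 1450k
    minted += MB_AWARD
    given_away += network_share
    print(f"block {i + 1}: {block_total:,} decrits created")

print(f"minters: {minted:,}  network: {given_away:,}  total: {minted + given_away:,}")
# -> minters: 500,000  network: 9,500,000  total: 10,000,000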

If sunk hardware costs of GPUs are creating these decrits, we need only worry about the electricity cost. If a coin sells for $2 and costs $1 in electricity to create, the GPU minters will profit $50k on the first run while the network profits $1000k (free money). Now decrits only sell for about $1.89 (solving 2 * 10m = x * 10.55m for the new price x). In the next block, minters profit $44.5k while the network profits $1,229k. As you can see, profits for minters start dropping while profits for the network increase. Over the 10 blocks, minters should make around $250k in profit in total (actually less, because of the burn) while the network profited $9.5m. A total of $10 million of value was transferred to the network at the cost of $250k in electricity. Now it's over and the price is stable again near its cost to produce. (Energy usage = 0)
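
Here is a rough sketch of that dynamic under my own simplifying assumptions: the total market value stays fixed at $20m, the price is $20m divided by the current supply, minting costs $1 of electricity per coin, and each block's coins are valued at the price going into that block. The per-block figures won't match the numbers above to the dollar (they depend on exactly which price you value the new coins at), but the trend is the same: minter profit falls while the network's gain rises.

Code:
# Illustrative price/profit dynamic (assumptions as described above).
TOTAL_VALUE = 20_000_000.0    # assumed fixed market value: 10m coins at $2
MB_AWARD, ELEC_COST = 50_000, 1.0
supply = 10_000_000.0

total_minter_profit = 0.0
for i in range(10):
    price = TOTAL_VALUE / supply
    network_share = MB_AWARD * (10 + 2 * i)   # same assumed schedule as before
    minter_profit = MB_AWARD * (price - ELEC_COST)
    network_gain = network_share * price
    total_minter_profit += minter_profit
    print(f"block {i + 1}: price ${price:.2f}  minter profit ${minter_profit:,.0f}"
          f"  network gain ${network_gain:,.0f}")
    supply += MB_AWARD + network_share

print(f"total minter profit: ${total_minter_profit:,.0f}")  # in the ballpark of the ~$250k above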

We'll say that 25,000 GPU minters were involved in creating these coins, because the award for each person in the MBQ is around 2 decrits (a block is expected to take around 7-10 days and use a dollar or two of electricity per minter). If 100x-faster ASICs cost $1000 and use no electricity, 250 ASICs cost $250k, for the same net profit of $250k. If 10x-faster ASICs cost $500, it's a massive net loss. But of course ASICs do use electricity, and plenty of it, and competition will simply raise the difficulty to reduce their profitability.
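
To make the hardware side concrete, here's a quick back-of-envelope using those hypothetical figures. It's my arithmetic: I'm only comparing the upfront ASIC outlay against the ~$250k of minting profit in play, and ignoring ASIC electricity as above.

Code:
# Back-of-envelope check on the ASIC scenario above, using the hypothetical
# figures from this post (ASIC electricity ignored, as in the post).
GPU_MINTERS = 25_000
PROFIT_IN_PLAY = 250_000       # the ~$250k of minting profit from the example above

for speedup, unit_cost in [(100, 1_000), (10, 500)]:
    asics_needed = GPU_MINTERS // speedup      # ASICs doing the work of the GPU fleet
    hardware_cost = asics_needed * unit_cost
    print(f"{speedup}x ASIC at ${unit_cost}: {asics_needed:,} units, "
          f"${hardware_cost:,} of hardware vs ${PROFIT_IN_PLAY:,} of profit in play")
# -> 100x: 250 units, $250,000 of hardware vs $250,000 of profit in play
# ->  10x: 2,500 units, $1,250,000 of hardware vs $250,000 of profit in play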

Now, because of the whole "ASICs vs ASICs and GPUs vs GPUs" idea that I talked about in another post you referenced, GPUs will still be able to make a profit while the ASICs are doing this, lowering the ASICs' profitability. ASICs can take the brunt of the more expensive, early burn while GPUs hop on for cheap. If enough of them do so, the profitability of ASICs is reduced further. This is a "GPU defense" mechanism against attempts to upset the balance required to produce new money. The point is to make specialized hardware unprofitable so that resources are not wasted making it, because while a GPU's cost can be amortized over its use as a vital computer component, an ASIC has one job before hitting the trash can. A total, absolute waste.

Quote
Giving 10x the award to somebody other than the miner is just a leverage ratio on energy use, but it scales all the same. But I see you propose to adjust this ratio according to the number of transactions. Can you provide more info on this adjustment algorithm? How can you make sure that energy use stays at a reasonable level?

It doesn't scale all the same, because the cost to produce decrits remains unchanged. Bitcoin's cost to produce will forever rise to meet a profitability margin (assuming the network expands), and bitcoins must always be produced to secure the network ("produced" meaning new currency or tx fees, as the result is identical). The only way to lower bitcoin's energy cost is to reduce transaction fees as the network expands, which requires the people who profit most to agree to do so, and it fosters centralization--less profit, fewer people interested in securing the network. And there is no steady state. When decrits cost about as much as they sell for, no one creates decrits. With bitcoin, coins must constantly be produced or the security of the network is at risk.

The energy usage of decrits is quickly bounded by profitability. The longer minters mint, the less profit they can possibly make. And the vast majority of the overall profit of the system goes to the network, transferring value from fiat or other forms of wealth into decrits. Once coins are created, they are never "re-energized" again, as is the case with bitcoin. Once there are enough decrits to suit the population that wants to use them, the energy use of the system borders on zero--only tx validation and the costs of running a node. Bitcoin borders on using enough energy to re-energize all new currency and all tx fees each block at a profit. One system tends toward a zero state while the other tends toward an extremely expensive, wasteful state.