Board: Bitcoin Discussion
Re: Is the Lightning Network centralized?
by franky1 on 18/07/2018, 00:20:28 UTC
Magic numbers are hard coded numbers with no proper reason - like the 1 MB hard cap - this should rather be a time-adjusted parameter, like you have in the halving process.

Except that no fork, to my knowledge, has implemented it in that way.  If it was algorithmically adjusted in the code, like the halving process is, you wouldn't need to hardfork every time you change the size of the blocks.  Every single fork out there in the market right now has a "magic number" (some of them just happen to be larger integers) and will need to fork again in future to change the cap on the size of blocks that are permitted.  It's effectively an endless necessity for hardforks.  

The difference being:
If Core had done things by consensus and hadn't delayed for YEARS (even Satoshi mentioned raising it back in 2010), it could have been done where the hard consensus cap is 32MB, and the non-mining nodes carry a lower policy limit they treat as the working block size. That policy limit then moves up algorithmically (no new download required) in small 0.25MB increments once XXX blocks are at 90% of the current policy limit. The 32MB hard limit itself only gets raised in the background once blocks are at 50% of that limit, which gives plenty of time for people to naturally upgrade their nodes before blocks actually reach 32MB.

That way it never even hits a wall, and these debates never happen.
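To pin the idea down, here is a minimal sketch of that demand-driven policy bump. This is not actual Bitcoin Core code: the names, the 2016-block window standing in for the "XXX blocks" above, and the exact trigger thresholds are placeholder assumptions.

```cpp
// Hypothetical sketch only, not Bitcoin Core code: a demand-driven policy bump
// for non-mining nodes. Names, the 2016-block window (the "XXX blocks" above)
// and the trigger thresholds are placeholder assumptions.
#include <cstddef>
#include <cstdint>
#include <vector>

static const uint64_t HARD_CAP         = 32 * 1000000ULL; // 32MB consensus cap
static const uint64_t POLICY_STEP      = 250000ULL;       // 0.25MB increment
static const double   FULLNESS_TRIGGER = 0.90;            // blocks at 90% of policy
static const std::size_t WINDOW        = 2016;            // placeholder for "XXX blocks"

// Look at the last WINDOW block sizes and decide whether the local policy
// limit should move up by one 0.25MB step. It never exceeds the hard cap and
// never requires a new download - it is pure local policy, not consensus.
uint64_t NextPolicyLimit(const std::vector<uint64_t>& blockSizes,
                         uint64_t currentPolicy)
{
    if (blockSizes.size() < WINDOW)
        return currentPolicy;                      // not enough history yet

    std::size_t nearFull = 0;
    for (std::size_t i = blockSizes.size() - WINDOW; i < blockSizes.size(); ++i)
        if (blockSizes[i] >= FULLNESS_TRIGGER * currentPolicy)
            ++nearFull;

    // Only bump once the whole window is pressing against the policy limit.
    if (nearFull == WINDOW) {
        uint64_t bumped = currentPolicy + POLICY_STEP;
        return bumped < HARD_CAP ? bumped : HARD_CAP;
    }
    return currentPolicy;
}
```

The point is that the bump is purely local policy: no hardfork and no redownload, it just tracks demand until it eventually butts up against the consensus cap.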
EG:
Imagine a 32MB consensus cap with a 2MB policy limit.
2018 Q4: policy moves to 2.25MB without users needing to physically upgrade their nodes, as it moves algorithmically.
2019 Q2: policy moves to 2.5MB without users needing to physically upgrade their nodes, as it moves algorithmically.

54 policy algorithm changes later (many years):
Policy moves to 16MB without users needing to physically upgrade their nodes, as it moves algorithmically.
The devs then release a node version where consensus.h is now 64MB, but that is still another 64 policy algorithm changes (0.25MB each) away from users even needing to physically upgrade a node, because there are still years between 16MB and 32MB.
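Those step counts fall straight out of the 0.25MB increment; a quick arithmetic check (illustration only):

```cpp
// Quick check of the step counts quoted above (illustration only).
#include <cstdio>

int main()
{
    const double step = 0.25;                                        // MB per policy bump
    const int bumpsTo16 = static_cast<int>((16.0 - 2.5) / step);     // bumps remaining after the 2.5MB point
    const int bumpsTo32 = static_cast<int>((32.0 - 16.0) / step);    // bumps from 16MB to the 32MB cap
    std::printf("2.5MB -> 16MB: %d more policy bumps\n", bumpsTo16); // prints 54
    std::printf("16MB -> 32MB:  %d more policy bumps\n", bumpsTo32); // prints 64
    return 0;
}
```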

Imagine it like this: a 64MB consensus.h becomes hard coded in ALL future versions of Core from 2024, but blocks don't reach 32MB until 2036. Anyone running a version downloaded from 2025-2036+ will already be set and won't even realise it. Yes, anyone in 2036 still using an old 2024 version will get told their node is no longer supported, but being 12 years out of date, that's to be expected. I'd say users of Bitcoin from 2025-2036 would have upgraded before 2036 anyway. Thus they would not hit a wall/debate like what happened in 2015, as it would all be pre-planned and pre-implemented years before it's needed.

Analogy for PC gamers. It's as simple as this:
If a PC game says it needs 1GB of RAM, do you buy a PC whose motherboard maxes out at 2GB? Or do you buy a PC with 32GB of upgrade capacity and more than 2GB installed, knowing you can add more RAM as and when needed without needing a whole new system? By the time a game says it needs 16GB, you still have the freedom of choice to keep incrementing the RAM, or to start saving up for a mobo that can handle 64GB+ of RAM, knowing it's years before you hit the 32GB limit.