Wow, I actually understood some of that. But only because I've
read some popular books dealing with limits of computability,
otherwise it would have been all Greek to me.
Hopefully not every post will be as deep as that one, especially at the end of this journey once I have the clarity that comes with mastery of the subject matter. I did try to write in common-language terms as much as possible, though not to the n00b extreme of Randall Munroe's Thing Explainer, lol.
I hope to eventually have some posts where I summarize without so much technobabble terminology. I am first making sure I understand the subject matter. Forcing myself to write it down forces me to catch my mental errors. I typically like to do my learning all conceptually in my head, but I gain clarity and consistency of thought when serializing it out to a written format (which is a love-and-hate task... hate because it is so slow to articulate details and I'm not a highly gifted writer... love because it advances my knowledge and is a form of social sharing, given I am a gregarious extrovert with the capability to go introvert when programming).
So to sum it up, we have seen a very in-depth exposition of the theoretical underpinnings behind one-way functions. Interesting to see what all this is leading towards.
I assume you are aware that the post is not done yet, and I still need to cover the remaining computational complexity classes. You are speaking to what I wrote thus far, mostly the Determinism section and the part in the NP section about hash functions.
Getting a little bit philosophical,
More abstractly, undecidability results from any rule of containment that can point to itself (cf. the last two paragraphs of the linked answer), because this unbounded recursion opens the generality to the universe of possibilities where the entropy is unbounded
would you agree that the "rule of containment that can point to itself" is equivalent to saying
"a containment so rich that it can model all rules"?
Bertrand Russell's Paradox essentially points out that a set whose containment rule requires it to contain any set which contains it can't exist unless it is globally total/complete, meaning all possible sets with such rules must contain each other. That would require a static world that can't ever change, and a static world can't exist (for numerous reasons). Russell's Paradox is similar to the paradox of perfectly dependently typed programming languages, which can model all the dynamic runtime dependencies of the program in the static type checking done at compile time. The problem is that only the expected inputs and outputs are allowed. But under composition of modules, new permutations of inputs and outputs are created from the sets of the modules, and these permutations weren't modeled by the modules when they were separately compiled. This is what I meant in the section where I was writing about Philip Wadler, S-modularity, the Expression Problem, and dependently typed languages. You can click the links in that section to dig deeper. For laymen, I am not going to try to explain that from first principles, such as what a type in a programming language is. That is just too much verbiage and I doubt they would benefit from it (they can always go digging with Google if they want to deepen their understanding of the technobabble terms). They need summaries.
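Since that section is dense, here is a tiny sketch of the Expression Problem tension (my own illustration, not code from the post), using plain TypeScript union types rather than dependent types; the point is that the type checker can only verify the permutations a module knew about when it was separately compiled.

[code]
// "Module A", type checked and compiled on its own.
type Expr =
  | { kind: "lit"; value: number }
  | { kind: "add"; left: Expr; right: Expr };

// The compiler statically verifies this switch handles every variant
// of Expr that Module A knew about at compile time.
function evaluate(e: Expr): number {
  switch (e.kind) {
    case "lit": return e.value;
    case "add": return evaluate(e.left) + evaluate(e.right);
  }
}

// "Module B", composed later, wants a new variant such as
//   { kind: "neg"; operand: Expr }
// but the already-compiled evaluate() never modeled that permutation.
// Composition forces either editing and recompiling Module A, or
// giving up the static exhaustiveness guarantee -- the leak under
// composition described above.

console.log(evaluate({
  kind: "add",
  left: { kind: "lit", value: 2 },
  right: { kind: "lit", value: 3 },
})); // prints 5
[/code]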
Gödel's incompleteness theorem says that any formal system of arithmetic which is complete is thus inconsistent. This is conceptually the same as Russell's Paradox, in that a rule which attempts to contain everything that can contain itself is impossible unless it can contain everything, which can't exist.
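For reference, the textbook formulation of Russell's Paradox captures this self-reference in one line: the set of all sets that do not contain themselves cannot consistently either contain or not contain itself,

R = \{\, x \mid x \notin x \,\} \;\Longrightarrow\; \big( R \in R \iff R \notin R \big)

which I read as the same flavor of self-pointing containment rule, just stated via non-membership.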
All of this is stating that we live in a relativistic world (i.e. where each thing can only make rules about its own perspective, not from total knowledge) by necessity, otherwise change wouldn't exist and thus nothing would exist. Socialism (a total rule not allowing for differing perspectives) is a denial of the existence of relativity and thus doomed to catastrophic failure. I have pointed out in my writings that the omniscience required for a centralized knowledge set would require the speed-of-light to be infinite (so that all real-time actions could be centrally decided without causing the local actors to pause and wait), but if the speed-of-light were infinite, then the past and the present would have no separation in spacetime and the universe would collapse into an infinitesimal point. Thus decentralization, decentralized version control, decentralized decision making, and open source are not philosophical decisions for me; rather they are harmonizing with the reality of the physics of existence, which requires decentralized change and relative perspective. Note that CoinCube (who has a degree in Applied Math as well as a medical degree) and I had some discussions about this in the Economic Devastation thread, and he pointed out some math about how a totally undamped (i.e. underdamped in the differential-equations sense of an oscillation model) decentralization spawns mutations too fast to anneal toward any optimum. This is pointing out that we still need leadership. Decentralization is positive, but only up to the limit of the information capacity for change. I hope to PM CoinCube soon to see if he can come over to this thread. I doubt he is aware I created it.
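For the differential-equations reference point (my paraphrase, not CoinCube's exact math), the standard damped oscillator is

\ddot{x} + 2\,\zeta\,\omega_0\,\dot{x} + \omega_0^2\,x = 0

where \zeta is the damping ratio: \zeta = 0 is undamped (it oscillates forever), \zeta < 1 is underdamped (it keeps overshooting before it settles), and only with enough damping does the system settle to its equilibrium. As I read CoinCube's point, decentralization with no damping (no leadership) keeps generating mutations without ever annealing to an optimum.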
But both Russell's Paradox and Gödel's incompleteness theorems, afaics, model containment per how it is defined. So if the test for containment is enumeration, then the limitations of total containment apply to enumeration. If the definition of containment is paradigm-shifted to some attribute other than enumeration, perhaps we can get more power into what can be put in an enumerated container. By more power I mean that if an application gains from enumeration, and containment can be defined orthogonally to enumeration, then the enumeration aspects can perhaps gain additional degrees-of-freedom. New concepts I am learning about are
Super-Turing computation and Hypercomputation.
In discrete computation even the reals or the natural numbers are unbounded. In an analog computation described with functional limits, perhaps not so. I am not mathematical enough to speak about that until I've studied it.
The shift from discrete to analog can shift the assumptions about computation. For example (and I mentioned this in the Overview post), the quantum computing machine model breaks number-theoretic intractability for problems relying on factoring by employing Shor's algorithm, which (in a butchered summarization) uses parallelized (due to quantum mechanics) Fourier analysis to shift the potential factors from the time domain to the frequency domain, where the solution space is reduced to polynomial size, e.g. RSA and ECC security is reduced from exponential to polynomial time.
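To make that butchered summary slightly more concrete, here is a toy classical sketch (my own illustration with made-up toy values, not code from the post) of the number-theoretic reduction Shor's algorithm exploits: factoring N reduces to finding the period r of a^x mod N, and the quantum Fourier transform is what finds that period in polynomial time; the remaining steps are classical.

[code]
// Toy sketch of the classical reduction behind Shor's algorithm.
// The quantum speedup lies ONLY in finding the period r quickly;
// here we brute-force it, which is fine for a toy modulus.
function gcd(a: number, b: number): number {
  return b === 0 ? a : gcd(b, a % b);
}

// Smallest r > 0 with a^r mod N = 1; this is the step the quantum
// Fourier transform performs in polynomial time for huge N.
function findPeriod(a: number, N: number): number {
  let x = a % N;
  let r = 1;
  while (x !== 1) {
    x = (x * a) % N;
    r++;
  }
  return r;
}

const N = 15; // toy semiprime (a real N would need BigInt and modular exponentiation)
const a = 7;  // a base coprime to N
const r = findPeriod(a, N);            // r = 4
if (r % 2 === 0) {
  const half = Math.pow(a, r / 2) % N; // 7^2 mod 15 = 4
  console.log(gcd(half - 1, N), gcd(half + 1, N)); // 3 5 -> the factors of 15
}
[/code]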
It appears that Super-Turing computation shifts the model of inclusion from discrete to analog or distributed fuzzy logic. I need to delve into it more before I fully understand the implications and viability. Btw, I noted that the leading researcher, Siegelmann, is apparently female.
What I expect to remain the generative essence is that any rule of containment which contains everything that can contain itself (note I need to correct what I wrote in the post to this improved wording), and which does not leak in the type of computation model employed, would need to be so rich as to model everything that can be modeled, and thus can't exist, because the world is open to unbounded composition.
Tangentially, I believe
my improvement to Satoshi's design essentially hinges on a similar kind of paradigm shift, where the containment test for double-spends is shifted to an orthogonality via a separation-of-concerns.
That is why I warn people never to underestimate me. Who knows, before this thread is done I may have solved the open $1 million bounty problem of whether P equals NP. (I already have some ideas rolling in my head of how to paradigm-shift the question. Even one of the prominent figures in the field has pointed out that any meaningful advance would provide paradigm-shifting insight into complexity theory; otherwise it wouldn't be interesting and useful.) I encourage everyone to be creative. I don't claim any special ownership of creativity. I enjoy the creativity of others as well. Creativity comes from being in the right place at the right time (such as happening to read something that gives you a Eureka), and staying busy learning and discovering new things improves the chances/opportunities for creativity to occur.