What I am saying is that the entropy of your problem space is large but limited, precisely because the confusion and diffusion injected into the memory allocated to the PoW computation is not randomized over that entire space. Duh. Which is exactly what Andersen discovered when he broke your Cuckoo Cycle, as I warned would be the case. Quoting from the above paper:
You're living in the past, quoting two papers that both focus on an early 2014 version of Cuckoo Cycle.
What David Andersen did in March 2014 was show how to reduce memory consumption by an order of magnitude; that improvement has been part of the reference miner since April 2014, well before my Cuckoo Cycle paper was published at the January 2015 BITCOIN workshop. Please focus your criticism on that version.
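To make concrete what that improvement does, here is a toy sketch of edge trimming, the idea behind Andersen's memory reduction: a node of degree less than two cannot lie on a cycle, so every edge incident to such a node can be discarded, and repeated rounds shrink the set of alive edges. This is my own illustration, not the reference miner; toynode, NEDGES and trim_round are made-up names, the sizes are toy values, and the hash is a stand-in for siphash.

#include <cstdint>
#include <vector>

static const uint32_t NEDGES = 1u << 20;        // toy graph size, not a real parameter
static const uint32_t NODEMASK = NEDGES - 1;

// toy stand-in for the siphash-based mapping from edge index to node
static uint32_t toynode(uint32_t edge, uint32_t uorv) {
  uint32_t x = 2 * edge + uorv;
  x ^= x >> 16; x *= 0x7feb352dU; x ^= x >> 15; x *= 0x846ca68bU; x ^= x >> 16;
  return x & NODEMASK;
}

// one trimming round on one side (uorv = 0 or 1) of the bipartite graph
static void trim_round(std::vector<bool>& alive, uint32_t uorv) {
  std::vector<uint8_t> deg(NEDGES, 0);          // capped degree count per node
  for (uint32_t e = 0; e < NEDGES; e++)         // pass 1: count node degrees
    if (alive[e]) {
      uint32_t n = toynode(e, uorv);
      if (deg[n] < 2) deg[n]++;
    }
  for (uint32_t e = 0; e < NEDGES; e++)         // pass 2: kill edges on degree-1 nodes
    if (alive[e] && deg[toynode(e, uorv)] < 2)
      alive[e] = false;                         // such an edge cannot be on a cycle
}

int main() {
  std::vector<bool> alive(NEDGES, true);
  for (int round = 0; round < 10; round++) {    // a few rounds, alternating sides
    trim_round(alive, 0);
    trim_round(alive, 1);
  }
}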
But you don't need to read that paper to learn of the linear time-memory trade-off, which is stated right on the project page:
"I claim that trading off memory for running time, as implemented in tomato_miner.h, incurs at least one order of magnitude extra slowdown".
Btw, there is maximum entropy in the bitmap of alive edges once 50% of them have been eliminated.
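To see why: if a fraction p of the edges is still alive, each bitmap position carries H(p) = -p*log2(p) - (1-p)*log2(1-p) bits, which peaks at p = 0.5, i.e. exactly when half the edges have been eliminated the bitmap is maximally incompressible. A quick standalone check (nothing Cuckoo-specific):

#include <cmath>
#include <cstdio>

// per-bit Shannon entropy of a bitmap whose bits are set with probability p
static double H(double p) {
  if (p <= 0.0 || p >= 1.0) return 0.0;
  return -p * std::log2(p) - (1.0 - p) * std::log2(1.0 - p);
}

int main() {
  const double alive[] = {0.9, 0.75, 0.5, 0.25, 0.1};  // fraction of edges still alive
  for (double p : alive)
    std::printf("alive fraction %.2f -> %.3f bits per bitmap position\n", p, H(p));
}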