Interesting and all, but if this is decentralized, who controls the web crawlers and the storage? The hardware needed to run something like Google, not just hard drives but SSDs and servers with hundreds of gigabytes of RAM, is astronomical. We're talking about indexing the internet.
And will it honor the disallow rules in robots.txt?
If it doesn't, that would kind of be an innovation.
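For reference, honoring robots.txt per crawler node is cheap to do; here's a minimal sketch using Python's standard urllib.robotparser (the user agent name and URL are just placeholders, not anything from this project):

```python
# Minimal sketch: check a site's robots.txt before fetching a page.
from urllib import robotparser
from urllib.parse import urlparse

USER_AGENT = "decentral-bot"  # hypothetical crawler identity

def can_fetch(url: str) -> bool:
    """Return True if robots.txt allows USER_AGENT to fetch this URL."""
    parsed = urlparse(url)
    rp = robotparser.RobotFileParser()
    rp.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt
    return rp.can_fetch(USER_AGENT, url)

if __name__ == "__main__":
    print(can_fetch("https://example.com/some/page"))
```

The harder part in a decentralized setup is enforcement: nothing stops a rogue node from skipping that check, which is exactly why the "who controls the crawlers" question matters.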