It is easy to assume that in the next few decades we can easily achieve 10^30 / 10^40 (we've already gone past the point of cracking 2^128, i.e. 128 bits, in a few seconds), and that it will eventually reach 10^70+.
Are you sure? Don't we start to hit the limits of the speed of light, and of how few atoms thick we can make features on a chip, etc.?
We can't just keep adding zeros like that... we hit physical limitations.
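For scale, here's a rough back-of-the-envelope sketch in Python. The 1e18 ops/s rate, the room-temperature assumption, and the solar-lifetime figure are ballpark numbers picked for illustration, not measurements, but only the orders of magnitude matter here:

```python
import math

# Rough sketch only: how big are these numbers, and what does thermodynamics
# say about them? The rates and temperatures below are assumptions for scale.

keyspace = 2**128                      # ~3.4e38 possible 128-bit keys
exascale = 1e18                        # roughly today's fastest supercomputers, ops/s (assumption)
print(f"2^128 ≈ {keyspace:.1e}; at {exascale:.0e} ops/s that's "
      f"~{keyspace / exascale / 3.15e7:.1e} years of brute force")

# Landauer bound: erasing one bit with conventional (irreversible) logic
# costs at least k_B * T * ln(2) joules.
k_B = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                              # assume room temperature, K
per_op = k_B * T * math.log(2)         # ~2.9e-21 J per bit operation

ops = 1e70                             # the "10^70+" figure above
energy_needed = ops * per_op           # joules, at the theoretical minimum
sun_lifetime = 3.8e26 * 3.2e17         # Sun's luminosity (W) * ~10 billion years (s)
print(f"1e70 ops need at least {energy_needed:.1e} J; "
      f"the Sun's entire lifetime output is ~{sun_lifetime:.1e} J")
# -> ~3e49 J required vs ~1e44 J available: on the order of 10^5 Suns' worth,
#    before even worrying about light-speed delays across the chip.
```

Even a perfectly efficient, Landauer-limited computer hits an energy wall long before 10^70 operations, and that's before counting the speed-of-light and feature-size limits on the hardware itself.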