The critical error made in this post is the assumption that storing, processing, or transmitting a given amount of data will take a fixed amount of resources forever. The history of technology over the last 200 years has already proven this assumption to be completely wrong.
Please elaborate.
100 bytes of data will always be 100 bytes of data, regardless of what technological advancements are made in storing, transmitting, or processing it. All you can hope for is that these methods of managing it approach the theoretical optimum over time.
The ultimate "fixed amount of resources" required to process those 100 bytes in any manner is governed by the laws of thermodynamics. So ultimately there is a fixed minimum resource cost to perform an action on those 100 bytes, and it stays in place forever; we just aren't anywhere near it.
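To put a rough number on that thermodynamic floor: Landauer's principle says erasing one bit must dissipate at least kT·ln(2) of energy. A minimal sketch of that calculation for 100 bytes, assuming room temperature (~300 K):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2) joules.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, kelvin

bits = 100 * 8       # 100 bytes
e_min = bits * k_B * T * math.log(2)
print(f"{e_min:.2e} J")  # on the order of 1e-18 joules
```

That comes out to a few attojoules, many orders of magnitude below what today's hardware spends on the same operation, which is the gap the comment above is pointing at.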