I can do a 500 GB download pretty fast.
The problem is: downloading blockchain data is not all you have to do to run a full node. You also have to verify that data, and verification time is a much bigger bottleneck than raw blockchain size.
Some time ago, I saw that in practice with some altcoin. It was CPU-mined and had a chain of only around 2 GB. Yet downloading 500 GB of Bitcoin was faster than syncing 2 GB of that altcoin, simply because they had switched from SHA-256 to some ASIC-resistant, CPU-mineable hash function. Even worse, they did it not only in block headers, but everywhere, in every single place: they also replaced it in the Merkle tree, in Script opcodes like OP_SHA256, and everywhere else. That is when I learned why using a lightweight hash function like SHA-256 matters. Looking at their chainwork, mining a single block header, and collecting 50 coins for it, required computing only around 1,000 hashes. Which meant that validating a Merkle tree just ten levels deep was about as hard as mining the next block.
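To put rough numbers on this, here is a small Python sketch (my own illustration, not that altcoin's code; I use scrypt from hashlib as a stand-in for a generic ASIC-resistant, memory-hard function, and the parameters are arbitrary). It times double SHA-256 against the slow hash and counts how many hashes a depth-ten Merkle tree costs:

```python
import hashlib
import time

def sha256d(data):
    # Double SHA-256, as Bitcoin uses for headers and Merkle nodes.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def slow_hash(data):
    # scrypt as a stand-in for an ASIC-resistant, memory-hard function
    # (parameters are arbitrary, just to make the point).
    return hashlib.scrypt(data, salt=b"", n=1024, r=8, p=1, dklen=32)

def time_per_call(fn, iterations):
    data = b"\x00" * 80  # roughly block-header-sized input
    start = time.perf_counter()
    for _ in range(iterations):
        fn(data)
    return (time.perf_counter() - start) / iterations

print(f"double SHA-256: {time_per_call(sha256d, 100_000) * 1e6:.1f} us per call")
print(f"scrypt:         {time_per_call(slow_hash, 100) * 1e6:.1f} us per call")

def merkle_root(leaves):
    # Bitcoin-style Merkle tree: duplicate the last node on odd levels.
    level, hashes = list(leaves), 0
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        hashes += len(level)
    return level[0], hashes

_, count = merkle_root([bytes(32)] * 1024)
print(f"Merkle tree with 1024 leaves (depth 10): {count} hashes")
```

The memory-hard function is typically orders of magnitude slower per call, and the depth-ten tree alone already costs 1023 hashes, roughly the ~1,000 hashes their chainwork implied for mining a block; with the slow hash used everywhere, honestly verifying a block costs about as much as mining it.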
And then, coming back to Bitcoin, you can read about possible attacks that slow down verification: packing lots of OP_CHECKSIG opcodes into transactions, executing hash functions many times, sending strange P2P messages with complex data, and so on. If the whole problem were only about download speed, then we could have gone further than from 1 MB to 4 MB. But if bootstrapping a new node can take a week even on a server running 24/7, with a static IP and good connections to other nodes, then the problem is clearly not just bandwidth. I can see that in practice when I read my server's logs and compare network usage with CPU and disk usage.
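The classic example is the quadratic hashing in legacy (pre-SegWit) OP_CHECKSIG: every signature check rehashes a serialized copy of nearly the whole transaction, so the total bytes hashed grow with the square of the input count. A rough back-of-the-envelope model (my own approximate sizes, not Bitcoin Core's code):

```python
# Rough model of legacy sighash cost: each OP_CHECKSIG rehashes a
# serialized copy of almost the entire transaction.
INPUT_SIZE = 148    # approximate size of one legacy input, bytes
OUTPUT_SIZE = 34    # approximate size of one output, bytes
OVERHEAD = 10       # version, locktime, counts (approximate)

def bytes_hashed(n_inputs, n_outputs=1):
    tx_size = OVERHEAD + n_inputs * INPUT_SIZE + n_outputs * OUTPUT_SIZE
    return n_inputs * tx_size  # one full-transaction hash per signature check

for n in (100, 1_000, 5_000):
    print(f"{n:>5} inputs -> ~{bytes_hashed(n) / 1e6:,.0f} MB hashed")
```

This quadratic blow-up is why a single crafted transaction filling an old 1 MB block could reportedly take tens of seconds to verify, and why SegWit's BIP143 sighash was designed so that each byte of the transaction only has to be hashed a bounded number of times.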