Ok, so a 10GiB block is unacceptably large. What about a 5GiB block? Or a 1GiB block? Or a 500MiB block? At some point the block will be confirmed by a large fraction of the hashing power, but not all of it. The hashing power that couldn't process that gigantic block in time has effectively dropped off the network, and is no longer contributing to the network's security.
So repeat the process. It's now easier to push an even bigger block through, because the remaining hashing power is smaller. Maybe that hashing power has simply given up on Bitcoin mining, or maybe it has been redirected to one of the remaining pools that can process such huge blocks; either way, bit by bit the process inevitably leads to centralization.
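The feedback loop described above can be sketched as a toy simulation. All the numbers here are hypothetical, chosen only to illustrate the mechanism: each round, the block size is pushed just beyond what the slowest surviving miner can process in time, and that miner's hash power leaves the network.

```python
# Toy model: (processing capacity in MiB per propagation window, share of
# total hash power). Both columns are made-up illustrative figures.
miners = [(100, 0.05), (300, 0.10), (500, 0.15), (1000, 0.30), (5000, 0.40)]

remaining = sorted(miners)  # slowest miner first
while len(remaining) > 1:
    slowest_capacity, lost_share = remaining[0]
    block_size = slowest_capacity + 1  # just big enough to knock it out
    remaining = remaining[1:]
    surviving = sum(share for _, share in remaining)
    print(f"{block_size} MiB block -> {lost_share:.0%} of hash power drops off, "
          f"{surviving:.0%} remains")
```

Each iteration leaves a smaller, better-provisioned set of miners, which is exactly what makes the next, even larger block easier to push through.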
This requires a conspiracy of the biggest pool owners holding a majority of the hash power. But if such a conspiracy were going to happen, it would result in abuse of the current rules (and any updated rules) anyway, so I consider this argument a strawman. With any competition, issuing a 5GiB block out of the blue carries a serious risk of it being orphaned, and Gavin's 5-second proposal supports exactly that.
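The orphan risk can be made concrete with a back-of-the-envelope calculation. Assuming block discovery is a Poisson process with Bitcoin's 600-second mean interval, the chance a competing block appears while a huge block is still propagating is 1 - e^(-t/600). The link speed below is a hypothetical figure, and the model ignores validation time, so this is a lower bound on the risk, not a precise estimate:

```python
import math

BLOCK_INTERVAL = 600.0  # mean seconds between Bitcoin blocks

def propagation_delay(block_bytes: float, bandwidth_bps: float) -> float:
    """Naive transfer time for a block over a single link (ignores validation)."""
    return block_bytes * 8 / bandwidth_bps

def orphan_risk(delay_seconds: float) -> float:
    """Probability a competing block is found during the propagation delay,
    under a Poisson model of block discovery."""
    return 1.0 - math.exp(-delay_seconds / BLOCK_INTERVAL)

# Hypothetical example: a 5 GiB block over a 100 Mbit/s link.
delay = propagation_delay(5 * 2**30, 100e6)  # ~429 seconds just to transfer
print(f"delay ~ {delay:.0f} s, orphan risk ~ {orphan_risk(delay):.0%}")
```

Even under these generous assumptions the block spends most of a block interval in transit, so a miner springing a 5GiB block on unprepared peers is gambling roughly a coin flip that it gets orphaned.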