That is not the correct solution, because of course it gives the spammer an asymmetrical advantage. And the problem with sharding is not just that messages between shards are a multi-threading problem (that can actually be solved by requiring messages to be queued for the next block), but rather that both shards then have to verify the entire history chain behind those cross-shard "transactions", which defeats the performance improvement of sharding. Vitalik probably proposes to have shard validators trust each other with forfeitable deposits, but that, like PoS, destroys the Nash Equilibrium. Also, as I explained in my video, external business logic can conflate shards even if cross-shard messages are restricted, leading to chaos, to discontent when a shard's validator set has lied (for profit, obviously), and to a drop in the value of the token.
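A toy model (all names here are my own, purely illustrative) of the cross-shard cost I'm describing: even when messages are queued for the next block, the receiving shard still has to re-verify the sending shard's entire history chain before trusting the message, so the total verification work approaches that of the unsharded chain.

```python
# Toy illustration only: two shards, each shard's blocks are just lists
# of transactions, and "verification cost" is counted as transactions
# that must be checked.

class Shard:
    def __init__(self):
        self.blocks = []  # each block is a list of transactions

    def add_block(self, txs):
        self.blocks.append(list(txs))

    def history_cost(self):
        # Work to verify this shard's full history chain.
        return sum(len(b) for b in self.blocks)

def receive_cross_shard_msg(receiver, sender):
    # The receiver must re-verify the sender's whole history before
    # accepting the queued message, on top of its own history.
    return receiver.history_cost() + sender.history_cost()

a, b = Shard(), Shard()
for _ in range(3):
    a.add_block(range(10))  # 10 txs per block
    b.add_block(range(10))

work = receive_cross_shard_msg(b, a)
unsharded = a.history_cost() + b.history_cost()
assert work == unsharded  # the savings from sharding vanish
```

In this toy accounting, one cross-shard message forces shard B to do exactly as much verification work as a single unsharded chain would have required.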
Bruce Wanker will be laughing again.
I finally realized the solution to the last sentence of the prior paragraph, which I alluded to in my prior comment. It suddenly popped into my mind while I was listening to myself.
Steem, on the other hand, easily survived the flood attacks thrown at it without disrupting service and all without any transaction fees!
Were those bandwidth DDoS attacks filtered by perimeter nodes, or validation attacks absorbed by validating nodes?
The price of GAS would go up until it stunted the growth of all three applications.
Incorrect. The price of GAS would increase due to higher demand, but the lesser amount of GAS needed would still reflect the unchanged cost of validating a script at that higher price.
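With some hypothetical numbers (mine, not from the thread), the point is simple arithmetic: the fee paid is gas units times GAS price, so a price rise offset by fewer units leaves the cost of validating the script unchanged.

```python
# Hypothetical numbers: price doubles with demand, but the optimized
# implementation needs half the GAS units, so the fee is unchanged.
evm_units, old_price = 100_000, 10        # price in smallest currency units
fee_before = evm_units * old_price

new_price = old_price * 2                 # higher demand drives GAS price up
native_units = evm_units // 2             # cheaper validation needs less GAS
fee_after = native_units * new_price

assert fee_before == fee_after            # cost of validation is unchanged
```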
The native implementation would cause all the same outputs given the same inputs, except it wouldn't know how to calculate the GAS costs because it wasn't run on the EVM.
It could simply compute its own cost from some counters. If it knows its optimized implementation is less costly than the EVM, then there is no harm (i.e. it remains compliant) in keeping the GAS if it is depleted before the script completes. Others verifying the depletion case would run the EVM, since that wouldn't cost them more than running the native version. For non-depleted scripts, validators run their most efficient native version.
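A minimal sketch of that counter-based metering (the interfaces are assumptions of mine, not anything from the thread): the native validator charges GAS per step from its own counters, and depletion aborts execution while the GAS already consumed is kept; peers verifying the depletion case would re-run the reference EVM.

```python
class OutOfGas(Exception):
    """Raised when the metered script runs out of GAS mid-execution."""

def run_metered(steps, gas_limit, cost_per_step):
    """Run step callables, charging cost_per_step of GAS for each."""
    gas = gas_limit
    outputs = []
    for step in steps:
        if gas < cost_per_step:
            # Depletion: the validator keeps the GAS already consumed;
            # peers verifying this case re-run the reference EVM.
            raise OutOfGas
        gas -= cost_per_step
        outputs.append(step())
    return outputs, gas

# If native per-step cost is below the EVM's, a native depletion implies
# an EVM depletion too -- the compliance condition described above.
steps = [lambda i=i: i * i for i in range(4)]
outputs, gas_left = run_metered(steps, gas_limit=12, cost_per_step=3)
```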
Require a proof-of-work on each script
Unless this proof-of-work is more expensive in resources than the cost of validating the script, the attacker retains an asymmetric DoS advantage. So all you've done is shift the cost from paying the fee to generating an equivalent proof-of-work.
And unless each script consumer has access to a premium ASIC, the attacker still has an asymmetric advantage. And if you say the script consumer can farm out this work to an ASIC farm, then you've merely shifted the DoS attack to said farm.
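To make the asymmetry concrete, here is a hashcash-style sketch (my own illustration, not any specific chain's scheme): the submitter searches roughly 2**bits hashes for a valid nonce while the validator checks it with a single hash, and an attacker with premium ASICs pays far less per hash than an ordinary script consumer does.

```python
import hashlib
import itertools

def solve_pow(script: bytes, bits: int) -> int:
    """Submitter's cost: ~2**bits hash attempts to find a valid nonce."""
    target = 1 << (256 - bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(script + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def check_pow(script: bytes, nonce: int, bits: int) -> bool:
    """Validator's cost: a single hash."""
    target = 1 << (256 - bits)
    digest = hashlib.sha256(script + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < target
```

Note the difficulty knob cuts both ways: set `bits` high enough to deter an ASIC attacker and ordinary consumers can no longer afford to submit scripts at all.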
Local blacklist / whitelist of scripts, accounts, and/or peers
That is effective against bandwidth DDoS, but the Nash Equilibrium can still be gamed in an open system w.r.t. submitting data for validation.
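For completeness, a node-local filter like the one quoted is just cheap set lookups (the names below are my own): it drops known-bad peers, accounts, or script hashes before any bandwidth or validation is spent on them. But as I said, in an open system the attacker simply shows up with fresh identities, so the validation game is untouched.

```python
class LocalFilter:
    """Node-local allow/deny lists for peers, accounts, and scripts."""

    def __init__(self):
        self.blacklist = {"peer": set(), "account": set(), "script": set()}
        self.whitelist = {"peer": set(), "account": set(), "script": set()}

    def allow(self, kind: str, ident: str) -> bool:
        if ident in self.whitelist[kind]:
            return True  # explicit whitelist entry wins
        return ident not in self.blacklist[kind]

f = LocalFilter()
f.blacklist["peer"].add("10.0.0.66")
assert f.allow("peer", "10.0.0.66") is False   # known attacker dropped
assert f.allow("peer", "10.0.0.7") is True     # fresh identity sails through
```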