If N squared performance assumes a single giant transaction, why not a rule to limit the size of a transaction?
Seems like a no-brainer. I've been looking through the commits, but couldn't find it. I think classic has a defense along those lines.
Even so, a 1 MB transaction can still take a while to validate. A hypothetical scenario described here:
https://bitcointalk.org/?topic=140078 suggests that a 1 MB transaction could take up to 3 minutes to verify. And in practice, a roughly 1 MB transaction that took about 25 seconds to verify is described here:
http://rusty.ozlabs.org/?p=522. Anything over a few seconds is a long time by computing standards. Both of those scenarios are less likely now that libsecp256k1 has introduced significantly faster signature validation, but nodes are still vulnerable to such attacks: a maliciously crafted 1 MB transaction could, in theory, still take 25 seconds or longer to verify.
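To make the quadratic blowup concrete, here's a rough back-of-the-envelope sketch. The constants are illustrative, not Bitcoin's exact serialization; the point is that under the legacy (pre-segwit) sighash scheme, each input's signature check hashes roughly the whole serialized transaction, so the total bytes hashed grow with the square of the input count:

```python
# Illustrative sketch, not actual Bitcoin serialization: legacy sighash
# rehashes (roughly) the entire transaction once per input signature.

INPUT_SIZE = 148  # assumed rough size in bytes of one spending input

def hashed_bytes(num_inputs):
    # Transaction size grows linearly with inputs (outputs/overhead ignored
    # for simplicity)...
    tx_size = num_inputs * INPUT_SIZE
    # ...but each of the num_inputs signature checks hashes ~tx_size bytes,
    # so total hashing work is quadratic in the number of inputs.
    return num_inputs * tx_size

for n in (100, 1_000, 10_000):
    print(f"{n:>6} inputs -> ~{hashed_bytes(n):,} bytes hashed")
```

Doubling the number of inputs quadruples the hashing work, which is why a single near-1 MB transaction can cost so much more to validate than the same bytes spread across many small transactions.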