No. You don't get to define what we allow in the system and what we don't, certainly not when it has been possible all this time. What Gavin proposed is a hacky workaround, nothing more.
Setting a block size limit of 1MB was, and continues to be a hacky workaround.
Theory drives development, but in practice sometimes hacky workarounds are needed.
I write code; I'd prefer it were all perfect. I also run a business, which means sometimes I have to consider the bottom line. If a risk is identified and a quick fix is available, it makes economic sense to apply the quick fix whilst working on a more robust long-term solution.
That this has not been done inevitably leads people to question why. It's the answers given to those questions that are causing the most difficulty: the story changes whenever those answers are challenged, and the answers are inconsistent with what seems logical to any reasonably minded, impartial observer.
The most important thing is that until about a year ago there was near-unanimous agreement on what the purpose of the block size limit was, and how it would be dealt with. Yet here we are today, with that action not having been taken, and a group of people actively trying to convince everyone that centralised enforcement of a block size limit is somehow the natural behaviour of the system, despite it never having been so in its entire history.
The block size limit was a hacky workaround to the expensive-to-validate issue, an issue that is now mitigated by other, much better solutions, not least a well-incentivised distributed mining economy that is smart enough to route around such an attack, making it prohibitively expensive to maintain.
Individual economic self interest is how Bitcoin is supposed to work.
It's time to remove the band-aid.
When the curtain is pulled back you will see how powerful the wizard really isn't.