Subject matter expert consensus and basic computer science can adjudicate between the commitment-data utility and the data-availability utility. This is exactly how we came up with the 40 and then 80 byte limit, and we should do it again.
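For context, that limit is a relay-standardness rule, not consensus: nodes simply decline to relay transactions whose OP_RETURN output exceeds the configured data-carrier size. A minimal sketch of that kind of check follows (in Python; the 83-byte constant mirrors Bitcoin Core's default of 80 data bytes plus script overhead, but this is an illustration, not Core's actual code):

    # Sketch of a data-carrier standardness check, assuming the script is
    # given as raw serialized bytes. Bitcoin Core's real IsStandard() also
    # verifies that everything after OP_RETURN is well-formed pushes.
    OP_RETURN = 0x6a
    MAX_DATA_CARRIER_BYTES = 83  # OP_RETURN + push opcodes + 80 data bytes

    def is_standard_null_data(script_pubkey: bytes) -> bool:
        """True if this looks like an OP_RETURN output within the size limit."""
        return (len(script_pubkey) > 0
                and script_pubkey[0] == OP_RETURN
                and len(script_pubkey) <= MAX_DATA_CARRIER_BYTES)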
That was in a climate where miners were not accepting direct submission of transactions to bypass these restrictions.
The broad understanding among subject matter experts was always that this sort of limit was an unstable equilibrium that wouldn't be sustained as mining transitioned from being subsidy driven towards being fee driven.
In 2014 public understanding of Bitcoin was often far less sophisticated than it is today. People often wanted to stuff data in even when, by their goals exactly as they expressed them, they would have been much better off for their _own goals_ including a hash instead (or using OTS, had it existed) or doing something else entirely. And there was far less understanding of the downsides and potential harms of (ab)using the Bitcoin system as 'data storage'. Back then, when I was defending the limitation online, people were so mad about it that they made threats of violence against me, and in many discussions I stood alone in defending it. The world is very different now.
In general, all protocols should want to limit the byte size of the data they carry.
And all have a significant incentive to do so that can't be escaped, so long as the available resources are limited and there is enough traffic to create meaningful fees. That was far less the case at the time this non-standardness policy was initially set; in fact, the minimum feerate has increased by a factor of 171 in real terms since then. Blocks at the time were on average only 16% of their limit, so there was no market-produced level of fees-- absent non-consensus rules it would have cost literally nothing to stuff in lots of trash. Today blocks are consistently at full capacity.
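To make the economics concrete, some back-of-the-envelope arithmetic (the feerates and the ~70 vbyte transaction overhead below are hypothetical round numbers, not the actual inputs behind the 171x figure): the fee to embed data scales linearly with the prevailing feerate, so it is the fee market, not relay policy alone, that makes data stuffing costly.

    # Illustrative only: feerates and overhead are assumed values,
    # not measurements.
    def embed_cost_sats(data_bytes: int, feerate_sat_per_vb: float,
                        overhead_vb: int = 70) -> float:
        """Fee to carry data_bytes in an output, plus rough tx overhead."""
        return (data_bytes + overhead_vb) * feerate_sat_per_vb

    # Near-zero feerates (the era of ~16%-full blocks) vs. a fee market:
    for rate in (0.1, 1.0, 30.0):  # sat/vB, hypothetical values
        print(f"{rate:>5} sat/vB -> {embed_cost_sats(80, rate):,.0f} sats")

At a near-zero feerate, 80 bytes costs a handful of satoshis; with full blocks and a real fee market the same bytes compete against paying traffic, which is exactly the unescapable incentive described above.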