Subject matter expert consensus and basic computer science can adjudicate between the commitment-data utility and the data-availability utility. That is exactly how we came up with the 40- and then 80-byte limit, and we should do this again.
That was in a climate where miners were not allowing direct submission to bypass these restrictions. In my view that fact is what shifts the limit from "no longer necessary but of no significant harm" to "somewhat harmful".
Subject matter experts broadly understood all along that this sort of limit was an unstable equilibrium that wouldn't be sustained as mining transitioned from being subsidy-driven towards being fee-driven.
In 2014, public understanding of Bitcoin was often far less sophisticated than it is today. People often wanted to stuff data in where they would have been better off, by their _own goals_, including a hash instead (or using OTS, had it existed), or doing something else entirely. And there was far less understanding of the downsides and potential harms of (ab)using the Bitcoin system as 'data storage'. Back then, when I was defending the limitation online, people were so mad about it that they made threats of violence against me, and in many discussions I stood alone in defending limiting it. The world is very different now. There are also plenty of less expensive alternatives for many of the things people wanted to stuff data in for: IPFS, nostr, other blockchains. What remains today are uses where people rationally judge that they benefit from putting the data in Bitcoin, so much so that they are willing to do so at significant expense, and that is much harder to dissuade.
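
To make that concrete, here is a minimal sketch of what "include a hash" means in practice (plain Python, no Bitcoin library; the helper name is my own invention): only a 32-byte digest goes on-chain, and the document itself can live anywhere.

```python
import hashlib

def op_return_commitment(document: bytes) -> bytes:
    # Commit to the document by hash: only the 32-byte SHA-256 digest
    # goes on-chain; the full document can live anywhere (IPFS, your
    # own server, a timestamping service, ...).
    digest = hashlib.sha256(document).digest()
    # 0x6a = OP_RETURN, 0x20 = "push the next 32 bytes"
    return b"\x6a\x20" + digest

script = op_return_commitment(b"...an arbitrarily large document...")
assert len(script) == 34  # a 32-byte commitment fits even the old 40-byte limit
```

Anyone holding the document can later re-hash it to prove it matches the on-chain commitment, which is all most of these use cases actually needed.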
And the concept of generally appointing people to judge how others are using Bitcoin is repulsive to the premise that Satoshi set out for the system-- Bitcoin shouldn't have and doesn't need "experts" passing judgement on other people's ability to transact. Fortunately, Satoshi designed a system where the ability to do so is fairly limited and fragile, for better or worse. There is no such thing as a freedom without negatives, and the fact that Bitcoin's ability to filter out traffic that is willing to pay is limited is among them. Moreover, data embedding can hide so that it is indistinguishable from other uses-- in particular by shoving the data into outputs as fake addresses, which is hardly any worse for the embedder but much worse for Bitcoin, and just not blockable through any amount of expert judgement short of only allowing Bitcoin to be sent to approved, whitelisted, KYCed addresses!
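
To illustrate why (a sketch of the well-known disguise, with a helper name of my own): a standard-looking P2WPKH output is just a 20-byte witness program, and since a genuine witness program is itself a hash of a key nobody else can see, twenty arbitrary bytes dropped in its place are cryptographically indistinguishable from a real address.

```python
def fake_p2wpkh(data20: bytes) -> bytes:
    # 0x00 = OP_0 (segwit v0), 0x14 = "push the next 20 bytes".
    # A real P2WPKH program is a hash of a public key, so no filter can
    # tell these 20 arbitrary bytes apart from a genuine one.
    assert len(data20) == 20
    return b"\x00\x14" + data20

# Coins sent to such an output are unspendable, and unlike OP_RETURN
# outputs (which never enter the UTXO set) this junk lingers in every
# node's UTXO set forever.
script = fake_p2wpkh(b"exactly twenty bytes")
```

Which is exactly why pushing embedders off OP_RETURN is counterproductive: the evasion is strictly worse for every node than the thing being filtered.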

So sure, we can say that blocking stuffed-in outside data isn't a restriction on people's freedom to actually transact, but as far as we know there is no way to eliminate the ability to (ab)use Bitcoin for outside data that is willing to pay which wouldn't be equally (in)effective against transactions.
Please don't mistake this point for the argument that "blocking monkey jpegs is CeNSoRShIP!". While I wouldn't *completely* dismiss the merits of that argument, it's weak at best and entirely not my point. My point is that any tool that DOES block monkey jpegs would be no less effective for some state-mandated blacklist. The fact that filtering has limited effectiveness is *good news*, and we would be very foolish to muddy the waters by playing an extended game of whack-a-mole that gives anyone ideas, that writes out a roadmap for actually censoring transactions to whatever extent they can be, or that sticks Bitcoin developers or miners in a position of being punished for failing to implement or make effective some form of censorship that various powerful entities demand. Even though such actual censorship would ultimately be ineffectual, that doesn't mean it couldn't cause tremendous harm.
In general, all protocols that embed data should want to limit its byte size, and all have a significant incentive to do so that can't be escaped: the resources available are limited, and there is enough traffic to create meaningful fees. This was far less the case at the time this non-standardness policy was initially set-- in fact the minimum feerate has increased by a factor of 171 in real terms since then. Blocks at the time contained on average only 16% of their limit, so there was no market-produced level of fees: absent non-consensus rules it would have cost literally nothing to stuff in lots of trash. Today blocks are consistently at full capacity.
When the limit was established, Bitcoin wouldn't even have block-file pruning for another year, so you couldn't participate without keeping every piece of junk shoved into the chain on your drive. Today there are options designed and more or less implemented (though not included in Bitcoin Core) that let people bring up a node without ever downloading the historical prunable data at all. So functional capacity limits moderate the data-storage (ab)use, and ultimately the answer may be to just not care about it at all, because you aren't storing the data and you don't have to download the past stuff to bring up a node. Obviously not validating every bit of history has its own costs (there are ZKP proposals that might eliminate even those, but they're still very theoretical), yet it is an *actual* solution to much of the data-embedding harm, and one that doesn't require building ineffectual censorware or get mooted by embedders disguising their data.
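
For concreteness, the pruning that has existed in Bitcoin Core since then is already a one-line configuration; such a node still downloads and fully validates everything, but the stored junk is bounded:

```
# bitcoin.conf -- run a pruned full node.
# All blocks are still downloaded and validated, but raw block data
# beyond this disk budget (in MiB, minimum 550) is deleted afterwards,
# so embedded junk doesn't accumulate on your drive.
prune=550
```

What pruning doesn't remove is the one-time cost of downloading and validating the history, which is the gap the sync options described above aim to close.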