Postel's Law certainly has led to a lot of problems, but is it really responsible for protocol ossification? Isn't the problem the opposite, i.e. that middleboxes are too strict in what they accept (say, only the HTTP application protocol, or only the TCP and UDP transport protocols)?
Overly strict and overly liberal both lead to ossification. That's merely the observation that buggy behavior in either direction can potentially come to be relied on (or to be unpredictably forced on you, in the case of middleboxes filtering your traffic).
I'd only expect security issues to result from being overly liberal, but (1) I wouldn't expect it to be very common, and (2) I'm not at all convinced that's a compelling argument to reduce the robustness of an implementation.
"Overly" here refers to restrictions that exceed the relevant standard. An extensibility mechanism is useless if a nonzero fraction of the network filters out messages that make use of it in certain ways.
Ossification comes from the Latin os, ossis: bone. Turning into bone; it stops being flexible. Common behavior becomes the de facto specification. There's stuff that's allowed by the specification but not expected by implementations, because things have always worked like this.
It's not related to open source software. The seemingly matching prefix is coincidence :-)
Pretty much: when something in the spec could in theory change but in practice never does, software and hardware get built around the assumption that it never will.
For example, in networking you can have packets sent using TCP or UDP, but in principle any number of transport protocols could be used. In practice, for decades it was literally only ever those two. Then when QUIC came about, it couldn't be implemented at the layer it was meant for, because all the routers and software weren't built to accept anything other than TCP or UDP.
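To make that concrete, here's a toy sketch (not any real router's code) of the ossified check, assuming we're handed raw IPv4 header bytes. The protocol numbers are the real IANA assignments; everything else is illustrative:

```python
# IANA-assigned values of the IPv4 header's 8-bit protocol field:
# TCP is 6, UDP is 17. SCTP (132) and others exist, but many middleboxes
# drop anything that isn't one of the two they grew up with.
TCP, UDP = 6, 17

def middlebox_forwards(ipv4_packet: bytes) -> bool:
    """Caricature of an ossified middlebox: forward only TCP and UDP."""
    protocol = ipv4_packet[9]  # the protocol field is the byte at offset 9
    return protocol in (TCP, UDP)
```

Which is part of why QUIC ended up running over UDP (protocol 17): it was the practical way to get a new transport through boxes like this.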
There's been a bunch of thought put into how to stop this, like making sure anything that can change regularly does, or using encryption to hide everything from routers and software that might want to inspect and tamper with it.
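The "anything that can change, regularly does" idea is what TLS calls GREASE (RFC 8701): senders sprinkle in reserved, meaningless code points so that any receiver that chokes on unknown values breaks immediately instead of years later. A minimal sketch of the idea; the GREASE values are the real reserved ones, but the helper function is hypothetical (real TLS stacks do this internally):

```python
import random

# RFC 8701 reserves sixteen values of the form 0xNANA (0x0A0A, 0x1A1A,
# ..., 0xFAFA) that carry no meaning and must be ignored by compliant peers.
GREASE_VALUES = [((n << 4) | 0x0A) * 0x0101 for n in range(16)]

def cipher_suites_with_grease(real_suites: list[int]) -> list[int]:
    """Prepend one random GREASE value, so a peer that hard-fails on
    unknown cipher suites breaks today, not when a real new suite ships."""
    return [random.choice(GREASE_VALUES)] + real_suites
```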
Literally it means that something is slowly hardening into bone (or stone, like fossilized dinosaur bones). Protocols and standard libraries suffer from this in a figurative sense.
The trouble is it fails to specify what you're supposed to be liberal with.
Suppose you get a message that violates the standard. It has a length field for a subsection that would extend beyond the length of the entire message. Should you accept this message? No, burn it with fire. It explicitly violates the standard and is presumably malicious or a result of data corruption.
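In code, that check is cheap. A sketch, assuming a made-up wire format of a 2-byte big-endian length followed by that many bytes of payload:

```python
def parse_section(message: bytes, offset: int) -> bytes:
    """Extract one length-prefixed subsection, rejecting impossible lengths."""
    if offset + 2 > len(message):
        raise ValueError("truncated length field")
    length = int.from_bytes(message[offset:offset + 2], "big")
    end = offset + 2 + length
    if end > len(message):
        # The subsection claims to extend past the end of the whole
        # message: malicious or corrupted. Burn it with fire.
        raise ValueError("section length exceeds message length")
    return message[offset + 2:end]
```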
Now suppose you get a message you don't fully understand. It's a DNS request for an SRV record, but your DNS cache was written before SRV records existed. Should you accept this message? Yes. The protocol specifies how to handle arbitrary record types: the length field is standard regardless of the record type, and you treat the record contents as opaque binary data. You can forward it upstream and even cache the result that comes back without any knowledge of the record format. If you reject this request because the record type is unknown, you're the baddies.
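A sketch of why this works, using the actual DNS resource-record layout from RFC 1035. The fixed fields have the same shape for every record type; the function name and the assumption that the NAME field has already been parsed are mine:

```python
def read_rr_body(packet: bytes, offset: int) -> tuple[int, bytes, int]:
    """Read the fixed fields and opaque RDATA of one DNS resource record.

    Works for record types this code has never heard of (SRV, HTTPS,
    whatever comes next), because the layout doesn't depend on the type:
    TYPE (2 bytes), CLASS (2), TTL (4), RDLENGTH (2), then RDATA.
    """
    rtype = int.from_bytes(packet[offset:offset + 2], "big")
    rdlength = int.from_bytes(packet[offset + 8:offset + 10], "big")
    rdata = packet[offset + 10:offset + 10 + rdlength]  # opaque, never parsed
    return rtype, rdata, offset + 10 + rdlength         # offset of next record
```

Cache the type plus the opaque rdata and you've handled SRV correctly without knowing what SRV is.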
I would say the proper way to apply Postel's law is to be liberal about reasonable interpretations of standards. Internet standards are just text documents written by humans, and often they are underspecified or have multiple plausible interpretations. There is no IETF court that gives a canonical interpretation (well, the appropriate working group could make a revision of the standard, but that is usually a multi-year effort). So unless we want to break up into multiple non-interoperable implementations, each strictly adhering to its own interpretation, we should be liberal about accepting plausible interpretations.
There are many cases where the RFC is not at all ambiguous about what you're supposed to do, and then some implementation doesn't do it. What should you do in response to this?
If you accept their garbage bytes, things might seem less broken in the short term, but then every implementation is stuck working around some fool's inability to follow directions forever. The protocol now contains an artificial ambiguity: the bytes they put there mean both what they're supposed to mean and what that implementation erroneously uses them to mean, and it might not always be detectable which case it is. Which breaks things later.
Whereas if you hard-reject explicit violations of the standard, then things break now, and the people doing the breaking are subject to complaints and required to be the ones who stop doing that, rather than having their horkage silently and permanently lower the signal-to-noise ratio by another increment for everyone else.
One of the main problems here is that people want to be on the side of the debate that allows them to be lazy. If the standard requires you to send X and someone doesn't want to do the work to be able to send X then they say the other side should be liberal in what they accept. If the standard requires someone to receive X and they don't want to do the work to be able to process X then they say implementations should be strict in what they accept and tack on some security rationalization to justify not implementing something mandatory and thereby break the internet for people who aren't them.
But you're correct that there is no IETF court, which is why we need something in the way of an enforcement mechanism. And what that looks like is willingly causing trouble for the people who violate standards, instead of the other side covering for their bad code.
> If you accept their garbage bytes, things might seem less broken in the short term, but then every implementation is stuck working around some fool's inability to follow directions forever. The protocol now contains an artificial ambiguity: the bytes they put there mean both what they're supposed to mean and what that implementation erroneously uses them to mean, and it might not always be detectable which case it is. Which breaks things later.
And, if your project is on GitHub, gets your Issues page absolutely clowned on because you're choosing to do the right thing technically and the leeching whiners shitting up the Issues don't want to contribute a goddamn thing other than complaints, and they definitely don't want to go to the authors of the thing that doesn't work with your stuff and try and get that fixed either.
It's a description of how natural language is used, so what you'd expect is constant innovation, with protocols naturally developing extensions that can only be understood within local communities, even though they aren't supposed to.
Something like "this page is best viewed in Internet Explorer" as applied to HTML.