I frequently get into this argument with people about how Postel's law is misguided. Being liberal in what you accept comes at _huge_ costs to the entire ecosystem and there are much better ways to design flexibility into protocols.
> Being liberal in what you accept comes at _huge_ costs to the entire ecosystem
Why do you believe that?
Being liberal in what you accept doesn't mean you can't do input validation, or that you're forced to pass through unsupported parameters.
It's pretty obvious: you validate the input that's relevant to your own case, you don't throw errors when you stumble upon parameters you don't support, and you ignore the irrelevant fields.
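To make that concrete, here's a minimal sketch of that tolerant-reader style in Python. The payload shape and field names are hypothetical, just for illustration: validate the fields you actually depend on, reject the message if those are malformed, and ignore everything else.

```python
def read_order(payload: dict) -> dict:
    """Validate only the fields this service uses; ignore the rest."""
    # Reject if the fields we depend on are missing or malformed...
    order_id = payload["order_id"]        # KeyError if absent
    quantity = int(payload["quantity"])   # ValueError if not a number
    if quantity <= 0:
        raise ValueError("quantity must be positive")
    # ...but don't error on fields we don't recognize.
    return {"order_id": order_id, "quantity": quantity}

# An unknown field added by a newer client is simply ignored:
read_order({"order_id": "A1", "quantity": "3", "gift_wrap": True})
# -> {'order_id': 'A1', 'quantity': 3}
```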
The law is "be conservative in what you send, be liberal in what you accept". The first one is pretty obvious.
How do you add cost to the entire ecosystem by only using the fields you need to use?
The problem with Postel's law is that people apply it to interpreting Postel's law. They read it as encouraging you to accept any input, and trying to continue in the face of nonsense. They accept malformed input & attempt to make sense of it, instead of rejecting it because the fields they care about are malformed. Then the users depend on that behavior, and it ossifies. The system becomes brittle & difficult to change.
I like to call it the "hardness principle". It makes your system take longer to break, but when it does it's more damaging than it would have been if you'd rejected malformed input in the first place.
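Here's a hypothetical sketch of that failure mode in Python: a "liberal" parser that guesses at malformed numbers looks helpful at first, but every guess becomes behavior that callers can come to depend on.

```python
def parse_count_liberal(raw: str) -> int:
    # "Make sense of" malformed input: keep the digits, hope for the best.
    digits = "".join(ch for ch in raw if ch.isdigit())
    return int(digits) if digits else 0

parse_count_liberal("1,000")     # 1000 -- handy, so callers start relying on it
parse_count_liberal("12 or 13")  # 1213 -- silently wrong
parse_count_liberal("N/A")       # 0    -- silently wrong

def parse_count_strict(raw: str) -> int:
    # Rejecting up front fails fast, before anyone depends on a guess.
    return int(raw)  # raises ValueError on all three inputs above
```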
> They accept malformed input & attempt to make sense of it, instead of rejecting it because the fields they care about are malformed.
I don't think that's true at all. The whole point of the law is that your interfaces should be robust, and still accept input that may be nonconforming in some way but can still be validated.
The principle still states that if you cannot validate input, you should not accept it.
The state of HTML parsing should convince you that if you follow Postel's law in one browser, then every other browser has to follow it in the same way.
That's a truism in general. If you're liberal in what you accept, then the allowances you make effectively become part of your protocol specification; and if you hope for interoperability, then everyone has to follow that same specification, which now has to include all of the unofficial allowances you (and other implementors) have paved the road to hell with. If that's not the case, then you don't really have compatible services; you just have services that coincidentally happen to work the same way sometimes, and fail at other times in possibly spectacular ways.
I have always been a proponent of the exact opposite of Postel's law: if it's important for a service to be accommodating in what it accepts, then those accommodations should be explicit in the written spec. Services MUST NOT be liberal in what they accept; they should start from the position of accepting nothing at all, and then only begrudgingly accept the inputs the spec tells them they have to, and never more than that.
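A minimal sketch of that stance in Python, assuming a hypothetical two-field spec: anything the spec doesn't explicitly allow, unknown fields included, is rejected outright.

```python
ALLOWED_FIELDS = {"order_id", "quantity"}  # the (assumed) written spec

def read_order_strict(payload: dict) -> dict:
    """Accept exactly what the spec allows -- nothing more, nothing less."""
    unknown = set(payload) - ALLOWED_FIELDS
    if unknown:
        raise ValueError(f"fields not in spec: {sorted(unknown)}")
    missing = ALLOWED_FIELDS - set(payload)
    if missing:
        raise ValueError(f"required fields missing: {sorted(missing)}")
    return {"order_id": payload["order_id"],
            "quantity": int(payload["quantity"])}

# A field outside the spec is an error, not something to route around:
# read_order_strict({"order_id": "A1", "quantity": "3", "gift_wrap": True})
# -> ValueError: fields not in spec: ['gift_wrap']
```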
HTML eventually found its way there after wandering blindly in the wilderness for a decade and dragging all of us behind it kicking and screaming the entire time; but at least it got there in the end.
> The state of HTML parsing should convince you that if you follow Postel's law in one browser, then every other browser has to follow it in the same way.
No. Your claim expresses a critical misunderstanding of the principle. It's desirable for a browser to be robust and support HTML that is broken but still perfectly parseable. Otherwise it fails to be even usable when dealing with anything but perfectly compliant documents, which, mind you, means essentially none at all.
But just because a browser supports broken documents, that doesn't make them less broken. It just means that the severity of the issue is downgraded, and users of said browser have one less reason to migrate.
The reason the internet consists of 99% broken HTML is that all browsers accept that broken HTML.
If browsers had conformed to a rigid specification and only accepted valid input from the start, people wouldn't have produced all that broken HTML and we wouldn't be in the mess we are in now.
There are different interpretations about what "being liberal" means.
For example, some JSON parsers extend the language to accept comments and trailing commas. That is not a change that creates vulnerabilities.
Other parsers extend the language by accepting duplicated keys and disambiguating them with some arbitrary rule. That is a vulnerability factory.
Being flexible by creating a well-defined superlanguage is completely different from doing it with an ill-defined one that depends on heuristics and implementation details to evaluate.
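To illustrate with Python's standard json module: CPython's parser silently keeps the last duplicate key (RFC 8259 leaves the behavior undefined), so two implementations that disambiguate differently can be made to disagree about the same document. A parser can refuse the ambiguity instead:

```python
import json

doc = '{"role": "user", "role": "admin"}'

# CPython quietly keeps the last duplicate -- the first value vanishes:
print(json.loads(doc))  # {'role': 'admin'}

# Rejecting the ambiguity outright via object_pairs_hook:
def reject_duplicates(pairs):
    keys = [key for key, _ in pairs]
    if len(keys) != len(set(keys)):
        raise ValueError(f"duplicate keys in object: {keys}")
    return dict(pairs)

try:
    json.loads(doc, object_pairs_hook=reject_duplicates)
except ValueError as err:
    print(err)  # duplicate keys in object: ['role', 'role']
```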
My counterargument is that the entire web exists because of Postel's law. Without it, HTML would just be another obsolete, boring document format from the 1980s.
I agree that there are better ways to design flexibility into protocols, but that requires effort, forethought, and most of all imagination. You might not imagine that your little scientific document format would eventually become the world's largest application platform, and plan accordingly.