
Re: Best practice for data encoding?

2006-06-06 14:13:48
On Jun 06 2006, at 20:55 , Hallam-Baker, Phillip wrote:

> > There seem to be a lot of ad-hoc ASN.1 decoders out there that
> > people have written as part of some other protocol, instead of
> > using an off-the-shelf compiler/encoder/decoder;
>
> That's because most of the off-the-shelf compilers/encoders have
> historically been trash.
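To make that concrete: a hand-rolled decoder is typically a few dozen lines like the sketch below, a minimal DER tag-length header parser (my own illustration, not from any particular protocol stack; the function name and interface are invented). Note that it deliberately skips high-tag-number form and indefinite lengths, which is exactly the kind of corner ad-hoc decoders tend to get wrong.

```c
#include <stddef.h>
#include <stdint.h>

/* Sketch of an ad-hoc DER decoder: parse one tag-length header.
   Handles single-byte tags and definite lengths only.
   Returns bytes consumed, or 0 on error; writes tag and value length out. */
static size_t der_read_header(const uint8_t *buf, size_t len,
                              uint8_t *tag, size_t *vlen)
{
    if (len < 2)
        return 0;
    *tag = buf[0];
    if ((*tag & 0x1f) == 0x1f)      /* high-tag-number form: punt */
        return 0;
    uint8_t first = buf[1];
    if (first < 0x80) {             /* short form: length fits in 7 bits */
        *vlen = first;
        return 2;
    }
    size_t nbytes = first & 0x7f;   /* long form: next n bytes hold the length */
    if (nbytes == 0 || nbytes > sizeof(size_t) || len < 2 + nbytes)
        return 0;
    size_t v = 0;
    for (size_t i = 0; i < nbytes; i++)
        v = (v << 8) | buf[2 + i];
    *vlen = v;
    return 2 + nbytes;
}
```

For example, the header `30 03` (a SEQUENCE of three bytes) consumes two bytes, while `04 82 01 00` (an OCTET STRING of 256 bytes, long-form length) consumes four. Every protocol that embeds a fragment like this re-decides the same edge cases, which is where the bugs come from.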

Of course, the interesting question is: why is that so?

Needless complexity makes such a tool hard to build, so a great one is less likely to be written. Needless complexity also has the interesting effect of turning off the more brilliant implementers, so implementations are more likely to be written by second-rate coders. And needless complexity means the resulting tool will be heavyweight, all the more so because the complexity drains energy that could have gone into streamlining the implementation; implementers factor this into their make-or-buy decisions and prefer to roll their own (and create bugs).

That's one set of reasons.
The other reason is that with some kinds of formal standardization, there is an interesting non-technical effect: people suddenly think they can make money off the standard itself, as opposed to off the market created by the standard, which is the economic purpose of all standardization. In the implementation space, this leads to a fragmentation of efforts: instead of one focused open-source implementation, there are likely to be multiple "commercial" ones. The more insidious effect is that suddenly there is an incentive to add even more needless complexity to create barriers to entry to the implementation market. (Or, at least, to create enough complexity to secure a thriving consulting market for the people doing the standardization -- a kind of trapdoor function.)

Example directly pertinent to this discussion:
It has been claimed that ASN.1 PER was created to ensure the health of a company trying to dominate the ASN.1 implementation market.
The company appears to have been quite successful for a while.
(They also pretty much single-handedly killed dependent standards like H.323 in the process, not least because their dominant implementation was itself flawed.)

Of course, many of us see this effect all the time in all kinds of standards development, including the IETF. It has to be actively, consciously fought every day, or it will dominate. Look up Mancur Olson's "The Rise and Decline of Nations" and Carlo Cipolla's excellent summary "The Basic Laws of Human Stupidity" (we are talking about type "B2" or "Bs" here) for some pertinent political/economic theory.

Gruesse, Carsten


_______________________________________________
Ietf mailing list
Ietf(_at_)ietf(_dot_)org
https://www1.ietf.org/mailman/listinfo/ietf
