Hallam-Baker, Phillip said:
for some, "let the market decide" is a religious statement. it's
generally based on an unexamined faith in market conditions as an
effective way of making a good choice among competing technologies.
I don't accept the ideological case for or against free markets.
My point here is limited to the case in which a working group is unable
to come to consensus over two disjoint proposals and each group refuses
to compromise with the other.
Agreement would certainly be the best outcome in the case where the
differences are due to personality issues. But there are also cases
where one proposal is in fact distinctly inferior, usually because its
adherents have bought into some obsolete dogma or other.
For example, there is no way to negotiate a compromise between die-hard
adherents of the end-to-end security principle and proponents of an
edge-based security system. The two architectural views are entirely
incompatible and cannot possibly be reconciled.
actually I disagree with you on that point (I agree with most of the rest
of the above). there is at least sometimes a role for perimeter-based
security in addition to end-to-end security. the trick is to understand
the strengths and weaknesses of each as they apply to a realistic threat
model, not to assume a priori that either one can exclusively do the job.
in my experience the disagreement between die-hard adherents usually
amounts to an underlying disagreement about the threat model - where both
sides may have oversimplified it.
What I have observed in these divisions is that it is actually quite
rare to have two factions of implementers. What is much more common is
that you have a group of folk who are building something and another
group of rock throwers who won't build much more than a bunch of
prototype code that only works with their own system.
yes, but often this is because it's much easier to build something that
implements a naively simple version of a protocol than to design and
implement a protocol that will actually work well in a realistic range of
scenarios that will be encountered in wide-scale deployment. that's why
"running code" by itself isn't worth much anymore.
In other words, letting the market decide comes down to who has the best
code and the best deployment strategy.
depends on what you mean by "best". if you mean the strategy that gets a
lousy product out into the market in the shortest amount of time and
attempts to lock in customers, you're right. if you mean the strategy
that provides the most long-term benefit to the community, you're wrong.
there's no substitute for engineering.
we would never consider building a bridge, building, ship, or large
aircraft without careful understanding of the problem to be solved,
multiple design/analysis/feedback cycles, etc. the investment in these is
quite obviously so great that we want to minimize the potential to invest
that much in a poor design. and yet protocol designers will happily invest
similar sums - or even more - of their customers' money in poor designs.
of course, sooner or later the market will probably figure out that
investing money in half-baked protocols or implementations of those
protocols is a poor idea. the market does learn, it just takes so long to
do so that its errors are very expensive for everyone.
IETF mailing list