Keith,
>> These days, for a protocol specification to be of "reasonable use" on a
>> wide scale it needs to avoid causing harm.
> First, something can be of reasonable use while still causing harm.
> Fossil-based fuels prove that.
Interesting analogy. Though I'll note that there are different approaches
to the use of fossil fuels in different parts of the world. One is "use as
much as you can afford and we'll do our best to keep prices low for the
whole country"; another is "tax the worst wastage of fossil fuels and use
the taxes to fund public transportation, which has less impact on the
environment". The latter is at least arguably saner.
In other words, if some practice causes harm when that practice is
widespread, then _some_ kind of caution or restraint might be in order.
Pretending that the problem doesn't exist doesn't seem wise.
> And while I agree that there are certain
> areas where causing harm to others needs to be considered (such as
> UDP-based protocols that lack well-known congestion avoidance
> algorithms), we as a community cannot be so risk averse that we drive
> development elsewhere.
I disagree with the statement as written, but more importantly, with the
implications of the statement. The IETF cannot possibly be a home for all
protocol development on the Internet; there are simply too many protocols.
After all the Internet is supposed to be a general-purpose communications
framework that can support an arbitrary number of applications.
So a lot of protocol development will always be done outside of IETF,
and this is a Good Sign. If we ever get to the point that protocol
development can only be done within IETF (perhaps because the network
imposes so many constraints that IETF is the only place they can be
sorted out), we will have failed miserably.
Furthermore, there will always be vendors who prefer to "let the market
decide" which protocol will be used rather than to "let IETF decide", for
reasons that have nothing to do with IETF being too risk averse. Any vendor
that has a substantial lead in market share will in general prefer to deploy
first and later insist that IETF (or another standards-making body) either
rubber-stamp or fix the bugs in its "de facto" standard - thus ensuring
that it keeps the competition in catch-up mode. There's nothing new
about that; it's been happening for years.
I'd also argue that much of the value that IETF adds (if it is doing its
job right) consists of risk minimization - specifically, minimizing the
risk that customers who invest in deploying IETF standards will be forced
to completely discard that investment within a short time.
Now, clearly there's such a thing as being too risk averse, but I don't
think that's our main problem. I am more concerned that we don't pay
attention to some substantial risks until very late in our processes -
when it's too late to fix the problems.
> Similarly, SOCKS went quite far before the IETF ever got a look at it.
> Why? Because we are no longer viewed as a place where development can
> seriously take place. Risk averse.
Maybe the problem is not just being risk averse, but our development style.
Something tells me that having random, unstructured conversations for
months on end and gaining consensus by exhaustion are not good ways
to do protocol development. And only after the WG is exhausted do we
do any serious external review. But it's not fair to blame the reviewers
for being too risk averse when the WG failed to pay attention to those
risks in the first place.
Now the *perception* might be that we're too risk averse, just because
we take so long to get protocols out the door. But while there are
lots of ways to speed up the process, to simply be less risk averse is
to fail to do our jobs.
> You know that thing about running
> code? Taken too far we fail what I think part of our mission is, which
> is to be a place to collaborate, because everyone will have shown up
> with their respective running code, only to fight over whose running
> code (if anybody's) will become the standard. See, for instance, the
> XMPP/IMPP wars.
Those wars would have cropped up anywhere else that didn't simply
cave in to pressure from one faction. You can't force people to agree,
particularly when they think they gain a competitive advantage from
not agreeing.
>> Publishing crap dilutes the value of the RFC series, and makes it more
>> difficult for the public to recognize the good work that IETF does. It
>> also costs money which could be better put to other uses.
> This was never the series' intent.
Face it, there's no way that the RFC series could have continued to be
the Engineering Notebook for the Internet for all this time. The Internet
has long since become too large and too diverse. The RFC series structure
could never have accommodated it. Maybe something like a Wiki, but even
that's a stretch. For that matter, no matter what the structure, there
would always be people who didn't want to go along with that. Heck, we
can't even get everyone to agree on the structure of domain names, and
those are far, far simpler and with obvious advantages to uniformity.
The truth is that we've long since abandoned the original intent of the
RFC series, and appropriately so. Except as a historical note the original
intent is scarcely relevant now.
> We've attempted to warp it into
> this, and the result has been The Official Dogma, with a corresponding
> lack of development within the IETF.
Oh, BS. The change in the RFC series hasn't hindered development a whit.
If you don't want to jump through the minimal hoops required to get your
protocol published as Informational or Experimental, there are plenty of
other ways to publicly document your protocol.
> If we want to allow for REAL
> innovation WITHIN the IETF, then you have to let some crap through, and
> you have to trust the RFC Editor and others to hold the bar at some level.
Non sequitur.
--
Power corrupts; Powerpoint corrupts absolutely. - Vint Cerf