
What do we mean when we standardize something?

2013-05-29 12:25:22
Hi.  A number of recent discussions, specifically including the
Last Calls on DKIM and on standardizing RRTYPEs, and even to
some extent the meeting location ones, have started me thinking
that perhaps we need to review what business the IETF is
actually in and to be sure that we still agree about it.  The
note that follows is an attempt to take the isolated parts (and
symptoms) of those discussions up a level and treat them as a
general issue.

The key issues go to the very nature of standardization.  There
are two fundamentally different models for what standards are
about, each with many variations.  Each is legitimate in its
own way and each has its advocates, but they are different.
One is a focus on quality: an engineering consensus on the best
technical solution given physics or consensus goals and
boundary conditions.  The other is sometimes called "consensus
of industry practice", in which a standard reflects what is
already deployed, possibly picking among different implemented
options in a few cases but mostly requiring that a common model
emerge before standards are issued.

Two variations on the second theme were extremely common in the
earlier days of computer and telecommunications standardization
but are less popular today (at least in theory) due to IPR and
antitrust concerns.  Both start with a single vendor (or closed
consortium) implementation.  That implementation might then be
offered for standardization (sometimes with a "no major
changes" restriction) and adopted more generally or endorsed by
traditional SDOs.  Or that single implementation might be
reverse-engineered by others and then the result treated as
common industry practice.

Despite the occasional claim that a strange (to me) idea called
"anticipatory standardization" is a variation on the second
model, the two cannot be combined: the second model presupposes
existing implementation and deployment, so standards issued in
their absence, or prior to them, are a contradiction in terms.
A pure consensus of existing practice is also basically
incompatible with innovation of any type, since it can only
ratify what has already been built.

The IETF has traditionally chosen the first model, and a great
deal of our thinking and public rhetoric is based on it.  Even
when we have adopted proposals, sometimes implemented ones,
from elsewhere, we have insisted on change control and usually
taken that seriously.  One could claim that an adaptation of
the second model would make the Internet more uniform, but a
consensus of existing practice cannot aspire to "working
better" except in the narrow sense of substitutability of
conforming products.

It is interesting that the multi-stage model of RFC 2026
(particularly the original "Proposed Standard" definition), the
IEEE's "Trial-Use" standards (the current Section 5.7 of their
Standards Board Operations Manual,
http://standards.ieee.org/develop/policies/opman/sect5.html#5.7,
makes instructive reading), and the ISO "Technical
Specification" all try to combine the two models.  Each allows
a preliminary specification --one intended for trial
implementation but not deployment-- to be designed on an
engineering basis and then standardized only when
implementations exist and can be compared (our original "Draft
Standard").  In theory, true standardization (including our
original definition of "Internet Standard") moves forward only
when both a common industry practice and a consensus about
value emerge.  We know how well that has worked for us in
practice.  While there have been exceptions, I don't believe it
has worked much better for other SDOs.

If we are not going to move significantly in the direction of
"consensus of industry practice", then it seems to me that we
need to be careful about the arguments that are made (or at
least those that are considered legitimate) in support of
advancement into or on the standards track.  

For example, if we agree on a relaxed registration procedure
for some protocol parameters, it is not reasonable to turn
around later and ask that those parameters be standardized,
unchanged, because they are already registered (and maybe
deployed).  For standardization, the IETF generally needs not
only change control but also an opportunity to comment on and
affect the actual design and specification of whatever is being
standardized.

Similarly, we sometimes hear it argued that we should accept a
specification for Proposed Standard unchanged because it has
been extensively developed and reviewed elsewhere.  That may be
reasonable in some cases, although I'd hope we wouldn't make it
a common practice.  But, if a specification adopted for
Proposed Standard on that basis is then proposed for
advancement to Internet Standard, I think the review should be
comprehensive --perhaps even more comprehensive than the usual
such review-- because the Internet Standard is unambiguously an
IETF product and recommendation, not that of the other body.  If
we allow a specification to advance to Proposed Standard
without an open and critical review because it was developed
and reviewed comprehensively by outside expert organizations
and then allow the Last Call review for Internet Standard to be
constrained by existing deployment, we would have gone into the
rubber-stamping business that we regularly say is incompatible
with our standardization model.  That doesn't mean, of course,
that changes during review for Internet Standard are likely to
be a good idea.  Those who propose changes to the spec, and
especially to the protocol, should have to convince the
community that those changes are important enough to overcome
the costs, including those of incompatibility with deployed
versions.  But I don't think we can reasonably exclude them
from raising the issues and making those arguments.

thanks,
   john