
Re: [Geopriv] Irregularity at the GEOPRIV Meeting at IETF 68

2007-04-20 12:46:46
(1) Deadlock is apparent but not real: the WG actually has rough
consensus, but the minority positions of a few people are being 
expressed loudly and aggressively enough to make the consensus  
less obvious and block progress.  This calls, IMO, for some     
relatively aggressive behavior on the part of WG leadership and,
if necessary, ADs.  Letting the dissenters "win" by publishing  
their approach as co-equal with the agreed-upon better technical
solution is bad for the Internet and bad for the IETF.
  
This approach may or may not work, depending on who the dissenters are.
Part of the problem with the loss of the multi-stage standards process
over the years is that the importance of "running code" has been
diminished within the IETF standards process, while in the marketplace
the importance of implementation remains as strong as ever.
   
As a result, if the dissenters represent all the likely implementers,
refusing to publish their protocol will only result in approval of a
"bogo-standard" with little industry support, along with interoperability
problems caused by the lack of documentation for the approach that is
actually widely implemented.  For example, would it have served the interest of
the Internet community to have suppressed the publication of the
Ethernet specification in deference to IEEE 802 SNAP encapsulation? 
While the Ethernet proponents were in the minority within IEEE 802
at the time, their approach quickly became a de facto standard.

As a result, I do not believe that a determination of "rough
consensus" is sufficient; "running code" also needs to be taken
into account. 

(2) Deadlock is between solutions to different problems.  What  
is needed is to move beyond arguments about which problem is the
"correct" one to a clear definition of each of the problems and 
ways to determine which one is relevant to a particular case,
followed by protocol options, or separate protocols, for selecting
the relevant problem and the approach that follows from it.

This scenario seems most relevant to the GEOPRIV WG, where
the argument seems to be about whether location is best handled on the
host or in the network, and which protocol proposals meet "the
requirements".  Rather than attempting to find a single "best"
solution to a single set of requirements, it would probably have been
better to recognize that in fact the two approaches may not be
solving the same problem. 

(3) Deadlock is due to having competing approaches to the same 
problem with no clear technical justification for choosing one  
rather than another.  Publishing all approaches and letting the 
marketplace decide almost guarantees non-interoperable solutions
and is, IMO, a disservice to everyone involved.  The WG needs to
make a choice as to which one to recommend, even if it has to   
admit (and document) the fact that the solutions are equivalent 
and even if the choice is made at random, by delegation to a   
subcommittee or the AD, etc.   If it cannot make or agree to a 
choice, then, IMO, the WG should be shut down and _at most_
documents be published as informational descriptions of the
different approaches.

I would disagree that "flipping a coin" in this case is likely to 
provide better interoperability either in the short or long term, 
since this takes into account neither consensus nor running code. 

One of the things we learn from the history of networking is that
while competing approaches may persist in the short term, in the
long term a single winner typically emerges.  By suppressing
all approaches in the interest of "interoperability" we merely
delay the market-sorting process, prolonging the period of confusion. 

Even worse, the IETF track record does not suggest a strong ability
to pick the eventual market leader, providing a significant probability
that the IETF process will actually retard the development of
market consensus.  For example, an examination of initial track 
designations (Informational, Experimental, Standard) does not 
demonstrate a strong correlation with eventual success.  

Part of the problem here is that the IETF process, while well suited
to the evolution of existing protocols, does not do a good job of
developing innovative solutions to new problems.
In such situations, the protocols most likely to succeed are those
that "only just work": proposals that are simple to implement and
include only the bare minimum functionality necessary to solve
the problem.  It is exactly these kinds of proposals that are most likely
to fare poorly in the IETF review process.  For further discussion, see
Mark Handley's paper "Why the Internet Only Just Works":
http://www.cs.ucl.ac.uk/staff/M.Handley/papers/only-just-works.pdf

If we look at the protocols that are at the core
of much of what we do on the Internet today, many of them faced an
uphill battle in the IETF, and were only accepted once their widespread
implementation became too obvious to ignore (e.g. BGP). 

As a result, one may conclude that the IETF's strength lies not in the
process for publication of WG work items, but rather in its ability
to embrace approaches developed outside the IETF, or which had
previously been sidelined within the IETF.   This is in part what
makes the Independent Submission process so important.

_______________________________________________
Ietf mailing list
Ietf@ietf.org
https://www1.ietf.org/mailman/listinfo/ietf
