
Re: Last Call: <draft-housley-two-maturity-levels-08.txt> (Reducing the Standards Track to Two Maturity Levels) to BCP

2011-08-03 16:20:32
I appreciate this exchange here. I now have a better idea of the draft and your intentions, and I have a few comments.

What I have noticed of late is that fast-track RFCs are coming out of nowhere, very fast, and are sometimes indirectly related to a WG without being a WG chartered work item, which may have an unforeseen influence on the WG's end results. While not all WGs are the same, this is what I experienced in the DKIM WG. The problem comes when there is little or no vetting of this external work.

For the most part, once work gets an RFC, or even just a published I-D, it also comes with an impulse to label it a "standard" and "throw the book" at others: "you are not following the standard." While they are technically wrong, it happens, and it puts others in a defensive position. But it also shows why there is derivative work, or why an RFC does not get followed 100%.

I guess, if anything, if we are going to allow for faster maturity, we probably need some guidelines (if none are already in place) on how non-WG RFC production could influence a current WG.

--
Hector Santos, CTO
http://www.santronics.com
http://santronics.blogspot.com

Russ Housley wrote:
SM:

From Section 2.1:

"no existing published requirements are relaxed".

Are these published requirements BCPs?

Yes.


From Section 2.2:

 'This maturity level is a merger of Draft Standard and Standard as
  specified in RFC 2026 [1].  The chosen name avoids confusion between
  "Draft Standard" and "Internet-Draft".'

Shouldn't that be "Internet Standard" instead of "Standard"?

Look at RFC 2026.  The three levels defined there are Proposed Standard, Draft 
Standard, and Standard.

The draft-housley-two-maturity-levels proposes two levels: Proposed Standard 
and Internet Standard.


  "The request for reclassification is sent to the IESG along with an
   explanation of how the criteria have been met.
   The criteria are:

  (1) There are at least two independent interoperating implementations
      with widespread deployment and successful operational experience.

  (2) There are no errata against the specification that would cause a
      new implementation to fail to interoperate with deployed ones.

  (3) There are no unused features in the specification that greatly
      increase implementation complexity."

The document that has been the subject of innumerable messages highlights how 
difficult it can be to reclassify an RFC.  Moreover, it amplified the divide 
between application folks and operators.  The IESG could have used the review 
clause from RFC 2026 and:

 (i)  requested an implementation report from the people in favor of the
      proposed standard; and

 (ii) a statement about deployment to determine whether there are operational
      issues that have to be addressed.

I don't know whether application folks and operators agree that cross-area 
work requires mutual understanding.

The creation of interoperable protocol implementations requires clear 
specifications.  Interoperability does not mean that the protocol would not 
have unintended effects on the network.  That is where operational experience 
comes in.  It can serve as valuable input to improve a specification.  For what 
it is worth, approximately 75 implementation reports have been submitted 
since 1996.

I am well aware of the implementation reports.  The premise here is that the 
protocol specification is "good enough" if there are at least two interoperable 
implementations and the protocol is widely deployed.  The implementation report 
would become optional.


A two-step maturity level folds the two different classes of issues into one.  
Quoting RFC 5657, which this draft unfortunately does not reference:

 "Moving documents along the standards track can be an important signal
  to the user and implementor communities, and the process of
  submitting a standard for advancement can help improve that standard
  or the quality of implementations that participate."

During a discussion on another mailing list, it has been mentioned that such an 
effort has a cost.  Lumping all the issues together can only increase that cost.

People are not doing many implementation reports.  As you say above, there are 
only about 75 of them.  How many protocols are documented in RFCs?  That is a 
very low percentage in my view.

So, I see the cost quite differently.  Most protocols are published as Proposed 
Standards, and they are never advanced.  I'm seeking a process where 
implementation and deployment experience actually improves the protocol 
specifications.  Today, that rarely happens, and when it does, the document 
recycles at Proposed Standard and the reader does not know whether it happened 
or not.


Strangely, this draft argues for measuring "interoperability through widespread 
deployment of multiple implementations from different code bases".  It will be more 
difficult to remove features once implementations are widely deployed.  Keeping the 
feature fight within the Draft Standard discussion can reduce the level of controversy 
at the last step.  As a side note, it would be less costly if feature evaluation was 
based on implementations instead of what people would like to keep (only one 
implementation of a feature).

This is an argument for the status quo.  We have decades of experience with 
that not working.  That is essentially an argument for a single maturity level; 
that is how the process is really working today.


Once a Proposed Standard is published, it is expected that people will go and 
write code, if they have not done so yet, and evaluate whether their 
implementation can interoperate with other implementations.  I don't see 
anything in this draft that encourages that.

Of course that is what we want.  Even better if the running code is used to 
make decisions before the protocol becomes a Proposed Standard.


In the RFC 5000 range, there are 7 Internet Standards, 13 Draft Standards and 537 
Proposed Standards.  One of the arguments for this draft is that it reduces the number of 
IESG evaluations, and other review cycles, from three to two.  Basically, this draft is 
to adjust the environment so that the IESG can review everything.  It does not reduce the 
barrier of "intended proposed standards" to the RFC 2026 level.  It does not 
offer any incentive to advance documents along the Standards Track.

This document is not about IESG review time, except for the elimination of the 
requirement for annual reviews, which are not done anyway.  If that is what you 
get from the document, then I have done a very poor job in writing it.  That is 
not the point at all.

Russ
_______________________________________________
Ietf mailing list
Ietf@ietf.org
https://www.ietf.org/mailman/listinfo/ietf