
Re: I-D ACTION:draft-klensin-iana-reg-policy-00.txt

2005-07-13 15:21:46
   I generally support John Klensin's proposal. (Of course, the devil is
in the details...)

   I have not replied to previous messages, because I really didn't have
anything to add to the non-discussions there. But Hans raises some points:

Hans Kruse <kruse(_at_)ohiou(_dot_)edu> wrote:

For limited option spaces, "stewardship" is needed, but what does this 
mean? 

   "Stewardship" should mean actual, effective, _use_ of a resource.

The document(s) being revised/created should give some guidance to
the community on how to use "percentage of space free" and/or "rate 
of consumption" to make reasonably consistent choices.

   I don't disagree; but we're not ready for that discussion.

Then there is "Technical Review"; I actually agree that a level of
such review is required, but it must be LIMITED (unless of course the 
protocol itself is being IETF reviewed).  In a case like the one that 
triggered this debate, the review needs to include only:

1. If the option appears in a packet, will there be any possible 
negative impact on a network element that has no code to process the 
option?

   This is a genuine issue in the proposal by Dr. Roberts...

   But, IMHO, it's far better to _review_ that issue, and document the
negative impacts (which we have failed to do for the issue at hand).

2. If option space is limited, does the documentation suggest that 
deployment will actually take place (let's not assign limited codes to 
high school science projects)?

   When the option space _is_ limited, John Klensin calls for
"initiation of a plan to eliminate the scarcity" before restricting
assignments to "those required to meet the needs of IETF protocol
development". I quite agree (though I'm prepared to allow for using
judgment short of the "restriction" John described).

   Hans is introducing a new idea: making a judgment about how likely
it is that "deployment will actually take place". This is an example
of judgment short of the "restriction" John described. I find it
reasonable, though I wouldn't rule out other "reasonable" judgment.

I fundamentally disagree with "There's every reason that the same
standard should apply to specifications developed outside the IETF
exactly as to IETF documents" for the simple reason that it is 
non-enforceable. 

   I also fundamentally disagree, but primarily because it asks for
omniscience, and our supply of potential IESG members is limited
enough without adding "demonstrated omniscience" to the requirements.

   Among the possible "plans to eliminate scarcity" might be to
assign a range of values for limited-time experimental use: keeping
in mind that our primary purpose in registration should be (as
John Klensin well stated) "Clarity of Definition". We want to have
a way of knowing what an option is being used for: the value of
knowing this easily outweighs the risk of experiments outlasting
their allotted time.

Beyond stewardship of limited code point space, I see no
justification for the IETF having veto power over standards being 
developed to use public standards like IP.  The fact that such 
independent developments are possible is at the heart of the
success of the Internet.

   I agree with Hans here: a _large_ portion of the success of the
Internet has come from being able to develop new ideas without
waiting for the IETF _at_all_.

Do we really think there is or will be a rush to standardize IETF-like 
protocols in TIA, ITU, etc.? I don't think so unless the IETF really 
falls on its face as far as cranking out well-engineered solutions.

No, the little bit of "competition" will come in cases like this one, 
where a protocol is designed for a corner case in an organization with 
expertise for that corner case.  Again, as has happened here, the IETF 
probably does not have the right set of expertise to fully review the 
protocol, and it should not feel the need to do so.

   Understand, the IETF _does_ have the right set of expertise to
review the _risks_...

   But I've seen no point in discussing here the intents and benefits
of Dr. Roberts' proposal: we may or may not have the expertise to
understand it all, but that expertise is not on _this_ list; and
there's insufficient justification for trying to gather it onto
any particular list.

   We _do_ need some work, being done somewhere, on the issues Dr.
Roberts' proposal seeks to explore. We should welcome that work,
wherever it might be done.

It seems that you want a review of "is this protocol safe to deploy
on the Internet"?  I can see the reasoning behind that, but I think
the code point assignment review is the wrong place. 

   I partly agree with Hans here; but we're likely to get stuck
needing to review it at that point.

   What we do need (in that case) is to minimize the work of that
review, if it comes at an inconvenient time.

   The time-honored way to minimize the work of such a (mini) review
is to minimize the scope of what we claim to have reviewed. Thus,
if we limit ourselves to listing a few areas of concern, stating we
have not satisfied ourselves that these have been sufficiently
considered, we don't need to spend a lot of time explaining how
we reached that belief -- whereas if we state there _are_ problems
but decline to list them, we're opening ourselves to unending
discussions and unlimited research needs.

... purists in the IETF will congratulate themselves for having
fended off the dragon, while the Internet operators have to cope
with a "stealth" version of the very protocol the purists tried to
stop, instead of being able to filter on a known option number.

   In the case at hand, Dr. Roberts wants to collect hop-by-hop
actually-available-bandwidth information in order to bypass the
slow-start mechanism of TCP. This _is_ scary!!!

   But the fact is, Dr. Roberts is working in an environment where,
with or without this hop-by-hop information, people _are_ going to
bypass slow-start. :^(

   So, if we dissuade Dr. Roberts from collecting the information
he wants, we're _very_ likely to end up in a worse state than if we
allow him to collect it. :^( :^(

   Thus, IMHO, we've got a clear case where assigning the code, in
combination with listing some concerns, is our best option.

   (That said, I do not mean to criticise the IESG for not seeing
the "superior wisdom" of my insight. ;^)

What if, in the case above, the code gets assigned along with 
publication of an RFC that in fact says that the code in question 
"belongs" to another organization and represents a non-IETF
protocol that operators should filter unless they understand the
implications of carrying these packets...

   I recommend seriously considering such a possibility.
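   For concreteness, here is a minimal sketch of what "filtering on a
known option number" involves at the mechanical level: walking the IPv4
options area and testing for a particular option number. The option
number 26 below is purely a placeholder, not any real assignment, and
in practice such filtering would live in router ACLs or firewall rules
rather than in Python; this only illustrates that a _known_, registered
number makes the check trivial, while a "stealth" protocol offers
nothing to match on.

```python
def has_ip_option(packet: bytes, option_number: int) -> bool:
    """Return True if the IPv4 header in `packet` carries the given
    option number (the low 5 bits of the option-type octet)."""
    ihl = (packet[0] & 0x0F) * 4   # header length in bytes (IHL field)
    i = 20                         # options begin after the fixed 20-byte header
    while i < ihl:
        opt_type = packet[i]
        if opt_type == 0:          # End of Option List: stop scanning
            break
        if (opt_type & 0x1F) == option_number:
            return True
        if opt_type == 1:          # NOP is a single-byte option
            i += 1
        else:                      # all other options carry a length octet
            i += packet[i + 1]
    return False

# Example: a 24-byte header (IHL=6) carrying hypothetical option 26.
header = bytes([0x46]) + bytes(19) + bytes([26, 4, 0, 0])
print(has_ip_option(header, 26))   # the registered number is easy to match
```

An operator who knows the number can drop or rate-limit such packets at
the border; without a registered number, the same traffic is
indistinguishable from anything else.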

Now the Internet is actually safer, and there is an incentive for
authors of protocols that were intended for wider use, because
they will actually have to run the entire protocol through the IETF
to get off the "black-list".

   :^)

--
John Leslie <john(_at_)jlc(_dot_)net>

_______________________________________________
Ietf mailing list
Ietf(_at_)ietf(_dot_)org
https://www1.ietf.org/mailman/listinfo/ietf