
Re: "why I quit writing internet standards"

2014-04-17 07:16:52

On Apr 14, 2014, at 11:57 AM, David Meyer <dmm(_at_)1-4-5(_dot_)net>
wrote:

On Mon, Apr 14, 2014 at 8:08 AM, George, Wes 
<wesley(_dot_)george(_at_)twcable(_dot_)com> wrote:
I’m surprised that no one has sent this out yet:
http://gigaom.com/2014/04/12/why-i-quit-writing-internet-standards/

"Summary: After contributing to standards organizations for more than seven
years, engineer Vidya Narayanan decided it was time to move on. Although she
still believes that these organizations make the Internet a better place,
she wonders about the pace of change versus the pace of organizations."

My thoughts-

There are some nuggets of truth in what she says in this article, and in
some of the comments. I think that the problems are real, so there’s value
in taking the criticism constructively, despite the fact that the author
chose to focus on the problems without suggesting any solutions.

"while the pace at which standards are written hasn’t changed in many years,
the pace at which the real world adopts software has become orders of
magnitude faster."
…
"Running code and rough consensus, the motto of the IETF, used to be
realizable at some point. … In the name of consensus, we debate frivolous
details forever. In the name of patents, we never finish."
…
"Unless these standards organizations make radical shifts towards
practicality, their relevance will soon be questionable."

I don’t have too many big ideas about how to fix these problems, but I’ll at
least take a crack at it to spur discussion. My paraphrase of the problem and
some discussion follows.

- We’ve lost sight of consensus and are too often derailed by a vocal
minority of those willing to endlessly debate a point.

Part of the solution to that is reiterating what consensus is and is not, as
in draft-resnick-on-consensus, so that we don’t confuse a need for consensus
with a need for unanimity. Part of the solution is IETF leadership
helping to identify when we have rough consensus encumbered by a debate that
will never resolve itself, without quieting actual disagreement that needs
continued discussion in order to find a compromise. I don’t have good
suggestions on how to make that second half better.

- We don’t have nearly enough focus on running code as the thing that helps
ensure we’re using our limited cycles to get the right things out expediently,
either getting the design right the first time or failing quickly and
iterating to improve.

The solution here may be that we need to be much more aggressive in expecting
any standards-track document to have running code much earlier in the process.
The other part of that is to renew our focus on actual interoperability
standards work, probably by charter or in-group feedback, and to shift focus
away from BCP and informational documents. Perhaps when considering whether to
proceed with a given document, we need a test as to whether it is actively
helpful/needed, and to ensure that we know what audience would be looking at
it, rather than simply ensuring that it is “not harmful” and mostly within the
WG’s chartered focus.

My friend @colin_dixon pointed this out to me yesterday, and I've been giving
it quite a bit of thought since then (I have a nascent blog post on how open
source and standards orgs might productively and efficiently work together, a
follow-up to
http://www.sdncentral.com/education/david-meyer-reflections-opendaylight-open-source-project-brocade/2014/03).

What I can say is that after seeing the kind of progress that several open
source communities make (they do epitomize the best of the IETF's running
code/rough consensus ethic), one does have to wonder whether traditional
standards making is obsolete or in dire need of a makeover. What is needed,
IMO, is a reimagining of how the standards process interacts with the open
source movement, specifically focused on how they can complement one another.

--dmm

        I think it's the latter: a makeover, or at least an adjustment, is needed in
how we work within this organization. The pendulum has clearly swung so far in
the direction of over-caution, over-thinking, and over-design (i.e.:
requirements, problem statements, architectures and frameworks) that the real
cost is too high, in terms of real dollars invested in the process as well as
real product development delay, where people have to wait for the process to
maybe put something out years from its onset. We also need to directly address
interference in the process by individuals and organizations who will never
implement the things they are here to "work" on and who engage only in theory.
In open source, code is the coin of the realm, so it's very much a "put up or
shut up" model; opinions don't count for much, if anything. So if what you
bring to the party is a code patch that everyone agrees on, that's cool, but
if you just say "hey, I'd like to add option X" or "let's do this another
way", you really are out in the cold. We need to get back to a model where the
opinions of those implementing things and those actually deploying those
things are what matters.

        Until then, the most important observation is that the process has ground
nearly to a halt in many areas, which has resulted in a severe reduction in
the speed of innovation. The market has responded and gone elsewhere. That is
a fact, and denying it is unrealistic. Take note: many of us have seen and are
involved in real examples of this, even from traditional service providers,
who are now actively demanding open source-based products and/or are trying to
launch their own agile, open source-based initiatives to foster rapid,
collaborative AND interoperable product creation in lieu of the IETF process.
But we need to get back to running code and rough consensus driving the
process, rather than how things are now.

        --Tom




