
Re: IONs & discuss criteria

2008-03-08 08:04:17
On 3/7/2008 11:18 AM, Thomas Narten wrote:
Lakshminath Dondeti <ldondeti@qualcomm.com> writes:

I have reviewed documents as a Gen-ART reviewer (during Brian's tenure, I 
think) and as a SecDir reviewer, and have also provided IETF LC comments 
on some documents.  As a reviewer, I am not sure whether I was expecting 
answers all those times.  I am pretty sure I have not always stated 
whether or not the answers were satisfactory.

On this one point.

IMO, one of the biggest causes of problems (and one of the most
under-appreciated process weaknesses) in the IETF (and any
consensus-based organization, for that matter) is poor handling of
review comments.

The ideal way to deal with them is to always respond, and to get an "I
am satisfied with your response" to close the thread.

"Ideal" being the keyword though.  Not everyone, for any number of 
reasons, including cultural reasons, will come out and state "all 
clear."  It is also asking too much to ask the reviewer to get into a 
debate with the authors.  It also fosters an environment where the 
reviewer starts becoming an authority.

Anything else
runs the risk of:

 - author says "I dealt with those comments in the revised ID", with
   the reviewer saying "Nope, I was ignored".

 - author says "I dealt with those comments in the revised ID", and
   this actually was the case, except that they accidentally missed
   one or two important issues. Reviewer is left wondering "was I
   blown off, or was this just an oversight, or..."

 - author thinking they are "done", because they "responded on the
   list". But no changes appear in the document, and/or the reviewer is
   still not happy.

 - reviewer having invested a not-insignificant amount of time doing a
   quality review, feeling "what's the point?", which doesn't help to
   motivate a volunteer organization. This is especially problematic
   from the cross-area review perspective.

I am not sure this is the "right" kind of motivation here.  As one such 
reviewer, all I look for is thanks from the AD whose directorate or team 
I am serving on.  And they have always been thankful.  No, I do not 
constitute a "representative population"; I am just offering one data point.


Repeat the above several times, interspersed with long periods of time
where nothing happens on a document, and you now have an idea of why it
seems to take so long to get documents through the system.

Indeed.  What started out as a great idea -- I volunteered to be a 
GenART reviewer 3-4 years ago now -- is beginning to become yet another 
burden in the process.


One of the reasons I'm such a fan of issue trackers is that they tend
to remove a lot of the above problems by simply not allowing things to
fall through the cracks. Sure, trackers have overhead and are overkill
in some cases. But if one could somehow analyze the number of
documents that have been delayed for some time due to poor handling of
review comments...

Yeah, it would be interesting.  Although I wonder what we would do with 
that information.  Reviewers are not accountable for delays.  There is 
no expectation of a time commitment from them, for instance.
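
To make the "not falling through the cracks" point concrete, here is a 
minimal sketch, purely illustrative and not based on any actual IETF or 
Gen-ART tooling (the class names and states below are assumptions), of 
what tracking each review comment as an explicit issue could look like:

# Hypothetical sketch only -- not real IETF tooling.  Models each review
# comment as a tracked issue with an explicit state, so a document cannot
# be considered "done" while any comment remains unresolved.

from dataclasses import dataclass, field
from enum import Enum


class CommentState(Enum):
    OPEN = "open"            # reviewer raised it, no author response yet
    ANSWERED = "answered"    # author responded, reviewer has not confirmed
    RESOLVED = "resolved"    # reviewer confirmed "I am satisfied"


@dataclass
class ReviewComment:
    reviewer: str
    summary: str
    state: CommentState = CommentState.OPEN


@dataclass
class DocumentReview:
    draft_name: str
    comments: list[ReviewComment] = field(default_factory=list)

    def unresolved(self) -> list[ReviewComment]:
        # These are the comments that would otherwise fall through the cracks.
        return [c for c in self.comments if c.state is not CommentState.RESOLVED]

    def ready_to_advance(self) -> bool:
        # Advance only when every comment has been explicitly closed out.
        return not self.unresolved()


if __name__ == "__main__":
    review = DocumentReview("draft-example-foo-03")
    review.comments.append(
        ReviewComment("gen-art reviewer", "Security considerations unclear"))
    review.comments[0].state = CommentState.ANSWERED  # author replied on-list
    print(review.ready_to_advance())  # False: reviewer never said "satisfied"

The only point of the sketch is the explicit RESOLVED state: a document 
advances only once the reviewer has confirmed satisfaction, which is the 
"close the thread" step described above.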


Thomas

