Lakshminath Dondeti <ldondeti(_at_)qualcomm(_dot_)com> writes:
I have reviewed documents as a Gen-ART reviewer (during Brian's tenure I
think), sec-dir reviewer and also provided IETF LC comments on some
documents. As a reviewer, I am not sure whether I was expecting answers
all those times. I am pretty sure I have not always stated whether or
not the answers are satisfactory.
On this one point:
IMO, one of the biggest causes of problems (and one of the most
under-appreciated process weaknesses) in the IETF (and in any
consensus-based organization, for that matter) is poor handling of
review comments.
The ideal way to deal with them is to always respond, and to get an "I
am satisfied with your response" to close the thread. Anything else
runs the risk of:
- author says "I dealt with those comments in the revised ID", with
the reviewer saying "Nope, I was ignored".
- author says "I dealt with those comments in the revised ID", and
this actually was the case, except that they accidentally missed
one or two important issues. Reviewer is left wondering "was I
blown off, or was this just an oversight, or..."
- author thinking they are "done", because they "responded on the
  list". But there are no changes in the document, and/or the reviewer
  is still not satisfied.
- reviewer, having invested a not-insignificant amount of time doing a
  quality review, left feeling "what's the point?", which doesn't help
  motivate a volunteer organization. This is especially problematic
  from the cross-area review perspective.
Repeat above several times and intersperse with long periods of time
where nothing happens on a document. You now have an idea of why it
seems to take a long time to get documents through the system.
One of the reasons I'm such a fan of issue trackers is that they tend
to remove a lot of the above by simply not letting issues fall through
the cracks. Sure, trackers have overhead and are overkill in some
cases. But if one could somehow analyze the number of documents that
have been delayed for some time due to poor handling of review
comments on IETF mailing lists, I suspect the case for trackers would
be compelling.