
Re: Outstanding reviewers (was: Proper credit for work done -- on finding chairs)

2013-10-19 16:35:03
At 12:17 19-10-2013, Joe Abley wrote:
Not to contradict you at all (I like where you're going with this) but
I've noticed that the diligence with which authors maintain the
acknowledgements sections of their documents is quite variable.

Yes.

I realise it's not structured data, but it's what we have; it would be
good, I think, if working group chairs, document shepherds and
responsible ADs asked specifically about the text in that section and
encouraged/reminded authors to make sure it is complete.

Yes.

This does not get us where you infer we might be aiming, but I think
it would be a step in the right direction.

I was not thinking of the Acknowledgements section. There is a document shepherd write-up. That write-up can contain information about a draft, including, among other things, the depth or breadth of the reviews. A document shepherd can include the names of some reviewers and what they did.

At 12:35 19-10-2013, Brian Trammell wrote:
In my previous message on the subject
(http://www.ietf.org/mail-archive/web/ietf/current/msg83204.html) I'd
considered using the datatracker to improve the "assignment" or at least
tracking the coverage of reviews of documents. I'd thought of this
primarily as a method to reduce final review workload (in the context of
CHANGING THE JOB), but this would be another very good reason to add
reviewer information to the datatracker.

I read that message. :-)

This is a problem with any approach to review tracking using the
datatracker -- if it's not done consistently nobody will come to rely on
it. More automated tool support would help with consistency.

Yes.

I think that it is better to let people decide whether they want to use the feature or not.

A first pass at this would simply note reviewer, date, document
revision, and sections covered, along with the content of the review
(which, if incoming references to mailing list archives are considered
to be stable, as they should be, could in most cases simply be a URL).
That would be sufficient for identifying reviewers, and would probably
do a lot to support triage of well-reviewed from poorly-reviewed
documents before they get sent up to the IESG.

I would keep it simple; I only need the email address and the link to the public message to generate a report. The information is included for transparency reasons. I did not mention sections covered to avoid complexity.
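
To make the shape of that data concrete, here is a minimal sketch in
Python. The record and field names are assumptions for illustration
only, not the datatracker's actual schema; it follows the simpler
approach above (reviewer email address plus a stable link to the public
message), and leaves out the optional "sections covered" field.

    # Minimal sketch, not the datatracker's data model: one record per
    # review, grouped by reviewer email to produce a simple report.

    from dataclasses import dataclass

    @dataclass
    class ReviewRecord:
        reviewer_email: str   # who did the review
        draft_name: str       # e.g. "draft-example-foo" (made-up name)
        revision: str         # e.g. "03"
        review_url: str       # stable link to the archived message

    def reviewer_report(records):
        """Group review links by reviewer email address."""
        report = {}
        for r in records:
            report.setdefault(r.reviewer_email, []).append(
                (r.draft_name, r.revision, r.review_url))
        return report

    # Example usage with made-up data:
    records = [
        ReviewRecord("reviewer@example.com", "draft-example-foo", "03",
                     "http://www.ietf.org/mail-archive/web/ietf/current/msg00000.html"),
    ]
    for email, reviews in reviewer_report(records).items():
        print(email)
        for name, rev, url in reviews:
            print("  %s-%s: %s" % (name, rev, url))
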

Regardless of how the recognition gets disseminated, there's a lot of
value in starting to collect the data.

Yes.

Regards,
-sm