I would posit there is another dynamic we are missing. Much of my
academic work on Cyclomatic Complexity is about relating the complexity of
code to latent defects in programs AFTER PEOPLE THINK THEY HAVE FOUND ALL OF
THE BUGS.
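For readers unfamiliar with the metric: McCabe's cyclomatic complexity for a single-entry, single-exit function reduces to one plus the number of decision points. A minimal sketch in Python, using the standard `ast` module (the set of node types counted here is an assumption for illustration; real tools count more constructs):

```python
import ast

# Assumption: count these AST node types as decision points. This is a
# simplified set chosen for illustration, not an authoritative list.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                  ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES)
                   for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(x):
        pass
    return "positive"
"""
# Two `if` nodes (the elif is a nested If) plus one `for` give
# 3 decision points, so complexity is 4.
print(cyclomatic_complexity(sample))
```

The research claim above is that functions scoring high on this metric retain more latent defects even after testing is considered complete.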
I think the analogy here is that AD review, cross-area review, and sacrifices
to Amaterasu will not find ALL of the bugs in a specification. The best we can
hope for is to follow best practices that keep the bugs at an acceptably low
level.
On Dec 23, 2015, at 8:46 PM, Ted Lemon <Ted.Lemon@nominum.com> wrote:
Dave Crocker wrote:
AD review appears to miss far more than it catches. That seems to be
difficult for some folk to accept, and so they point to the individual
exceptions rather than the longer pattern of misses. (By way of some
very strategic examples, consider IPv6 and DNSSec and email security and
key management and...)
Do you have data on this? This doesn't match my experience. I've found AD
review to be very helpful. It's certainly inconvenient, but I've seen AD
reviews catch lots of things that were worth catching. I would not go so
far as to claim that my anecdote is more valid than yours, but I think that
if we are going to make claims like this, we should probably back them up
with data. The current AD review process didn't just happen.