
Re: A proposal for a scientific approach to this question [was Re: I'm struggling with 2119 language again]

2013-01-08 08:45:53
The reasons have been discussed, or at least alluded to, previously in this thread. The short answer is that we have been there and done that: 30 years ago.

All those tools were developed and used successfully in the 80s. I know of cases where doing the formal specification alongside the design phase caught lots of problems. However, there are two central problems. First, in general, programmers are lazy and just want to code. ;-) Using a formal method is a lot more work. Second, the complexity of the formal statements that must be written down is greater than that of the code, so there is a higher probability of mistakes in the formal description than in the code. Admittedly, if those statements are made, one has a far greater understanding of what one is doing.
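
To make the second point concrete, here is a small, purely illustrative sketch in Python (my own, not anything proposed in this thread): the implementation of an RFC 1982-style wrapping sequence-number comparison is two lines, while the properties one would have to write down to specify it take considerably more, even when stated as executable checks.

SERIAL_BITS = 3
MOD = 1 << SERIAL_BITS          # sequence numbers live in 0..MOD-1

def seq_lt(a: int, b: int) -> bool:
    """RFC 1982-style 'less than' for wrapping sequence numbers."""
    return a != b and (b - a) % MOD < MOD // 2

# The implementation is two lines.  Stating what it is supposed to
# guarantee takes more text: the relation is irreflexive, asymmetric,
# total except at the midpoint, and deliberately undefined (neither
# a < b nor b < a) when the operands are half the space apart.
def check_properties() -> None:
    for a in range(MOD):
        assert not seq_lt(a, a)                          # irreflexive
        for b in range(MOD):
            assert not (seq_lt(a, b) and seq_lt(b, a))   # asymmetric
            if (b - a) % MOD == MOD // 2:
                assert not seq_lt(a, b) and not seq_lt(b, a)
            elif a != b:
                assert seq_lt(a, b) != seq_lt(b, a)      # exactly one holds

if __name__ == "__main__":
    check_properties()
    print("all stated properties hold for", MOD, "sequence numbers")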

Once you have both, there is still the problem that if a bug or ambiguity shows up, neither the code nor the formal spec nor a prose spec can be taken to be the right one. What is right is still in the heads of the authors. All of these are merely approximations. One has to go back and look at all of them and determine what the right answer is. Of course, the more things one has to look at, the better. (For a bit more, see Chapter 1 of PNA.)

John

Which raises the obvious question:  Why do we not write protocol specs
in a formal specification language instead of struggling with the
ambiguities of natural language?

Theorem provers and automated verification tools could then be brought
to bear on both specifications and implementations.
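
As a minimal sketch of the flavour of such checking (explicit-state exploration, the model-checking end of that spectrum rather than theorem proving), the Python below exhaustively explores a toy alternating-bit protocol over bounded lossy FIFO channels and checks one safety invariant. The model, names, and bounds are assumptions made purely for illustration, not anything agreed in this thread.

from collections import deque

N   = 3   # items the sender wants to deliver
CAP = 2   # bound on messages in flight per channel (keeps the model finite)

# A global state: (s_idx, s_bit, r_cnt, r_bit, data, acks)
#   s_idx  next item the sender is trying to deliver
#   s_bit  sender's current alternating bit
#   r_cnt  items the receiver has accepted so far
#   r_bit  bit the receiver expects next
#   data   FIFO of (bit, idx) data messages in flight
#   acks   FIFO of ack bits in flight
INIT = (0, 0, 0, 0, (), ())

def successors(st):
    s_idx, s_bit, r_cnt, r_bit, data, acks = st
    # sender (re)transmits the current item
    if s_idx < N and len(data) < CAP:
        yield (s_idx, s_bit, r_cnt, r_bit, data + ((s_bit, s_idx),), acks)
    # the head data message is lost, or delivered to the receiver
    if data:
        (bit, idx), rest = data[0], data[1:]
        yield (s_idx, s_bit, r_cnt, r_bit, rest, acks)           # lost
        new_acks = acks + (bit,) if len(acks) < CAP else acks    # ack may be dropped
        if bit == r_bit:                                         # new item accepted
            yield (s_idx, s_bit, r_cnt + 1, r_bit ^ 1, rest, new_acks)
        else:                                                    # duplicate ignored
            yield (s_idx, s_bit, r_cnt, r_bit, rest, new_acks)
    # the head ack is lost, or delivered to the sender
    if acks:
        bit, rest = acks[0], acks[1:]
        yield (s_idx, s_bit, r_cnt, r_bit, data, rest)           # lost
        if bit == s_bit:                                         # sender advances
            yield (s_idx + 1, s_bit ^ 1, r_cnt, r_bit, data, rest)
        else:                                                    # stale ack ignored
            yield (s_idx, s_bit, r_cnt, r_bit, data, rest)

def invariant(st):
    s_idx, _, r_cnt, _, _, _ = st
    # the receiver is never behind what the sender believes is
    # acknowledged, and never more than one item ahead of it
    return r_cnt in (s_idx, s_idx + 1)

seen, frontier = {INIT}, deque([INIT])
while frontier:
    st = frontier.popleft()
    assert invariant(st), f"invariant violated in state {st}"
    for nxt in successors(st):
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)
print(f"invariant holds in all {len(seen)} reachable states")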



Dick
--
