On Monday, March 16, 2015, Derek Atkins <warlord(_at_)mit(_dot_)edu> wrote:
On Friday, March 13, 2015 6:20 PM, Falcon Darkstar Momot wrote:
I feel like perhaps this type of exhaustive testing is neither necessary
nor expected, and that a few end-to-end tests designed to exercise edge
cases could be combined with more exhaustive unit tests to achieve
The difficulty, as always, is proving that an actual implementation is
modular. In the case of OpenPGP, it really isn't: A lot of data has to
get carried between each stage to ensure conformance with the
Having implemented it myself, I disagree completely. It is absolutely
possible to create a modular implementation. See my USENIX Security
talk on the PGP Message Processing Pipeline from.... 1996??
Well, first: You're Derek Atkins. Not everyone is as good a coder.
Second: It is hard to *prove* modularity, because of how complicated the
semantics are. (I have been trying, off-and-on, using Frama-C, for a while.)
Protocol modularity is not evil.
Modularity is neutral. "Agility", as folks like to call it, is evil.
Well, it's a damn good thing we've had agility; otherwise we'd have been
stuck with 3DES, SHA1 (or MD5!!), and probably either RSA or maybe
No: The reason there hasn't been any urgency to fix things like the CFB
mode problems, or the MDC, is that the current standard is too flexible.
I hate to use TLS as an exemplar of anything, but they have done a much
better job in this regard recently.
Where it is easy to provide flexibility, it is rarely useful. Where it is
useful, it is rarely feasible to provide. (E.g., a parameter to choose SIV,
encrypt-then-MAC, or robust AE.)
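To make the contrast concrete: the alternative to a negotiable AE parameter is to fix a single composition in the protocol. Below is a minimal sketch of one fixed encrypt-then-MAC construction, using only the Python standard library; the stream cipher here is a toy stand-in (SHA-256 used as a counter-mode keystream) chosen purely so the example is self-contained, not anything any real OpenPGP implementation uses.

```python
import hashlib
import hmac


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy stand-in stream cipher: SHA-256 over (key || nonce || counter).
    # Illustration only -- a real protocol would fix e.g. AES-CTR here.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")
        ).digest()
        counter += 1
    return out[:length]


def encrypt_then_mac(enc_key: bytes, mac_key: bytes,
                     nonce: bytes, plaintext: bytes):
    # Encrypt first, then MAC the nonce and ciphertext (EtM order).
    ks = _keystream(enc_key, nonce, len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, ks))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag


def decrypt(enc_key: bytes, mac_key: bytes,
            nonce: bytes, ct: bytes, tag: bytes) -> bytes:
    # Verify the tag in constant time before touching the ciphertext.
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    ks = _keystream(enc_key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))
```

The point of the sketch is structural: with the composition fixed, there is no mode parameter for an implementation to get wrong, whereas an "agile" design would have to dispatch on a negotiated suite identifier at exactly the spot where `encrypt_then_mac` hard-codes one choice.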
openpgp mailing list