
Re: [openpgp] "OpenPGP Simple"

2015-03-22 13:04:40
In this case the security problem was created by an unjustified assumption that
the relying party would verify the canonical encoding.

So no, this is a problem the canonicalists caused. 
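
As a rough Python sketch of that failure mode (toy code, hand-built
TLV bytes, made-up names; not the code of any real implementation):

    import hashlib

    # INTEGER 127, minimal short-form length -- the canonical (DER) form.
    canonical = bytes([0x02, 0x01, 0x7F])
    # Same value, long-form length -- legal BER, but not canonical.
    recoded   = bytes([0x02, 0x81, 0x01, 0x7F])

    def fingerprint(blob):
        return hashlib.sha256(blob).hexdigest()

    def canonicalize(blob):
        # Toy re-encoder: parse tag/length/value and re-emit with a
        # short-form length (assumes the value is under 128 octets).
        tag = blob[0]
        if blob[1] & 0x80:                      # long-form length
            n = blob[1] & 0x7F
            length = int.from_bytes(blob[2:2 + n], 'big')
            value = blob[2 + n:2 + n + length]
        else:                                   # short-form length
            value = blob[2:2 + blob[1]]
        return bytes([tag, len(value)]) + value

    blacklist = {fingerprint(canonical)}

    # Keying the blacklist on the bytes as received is bypassed by
    # re-encoding the same value:
    print(fingerprint(recoded) in blacklist)                # False
    # Verifying/normalizing to the canonical encoding first -- the
    # assumption in question -- restores the intended property:
    print(fingerprint(canonicalize(recoded)) in blacklist)  # True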

Sent from my difference engine


On Mar 22, 2015, at 12:17, Gregory Maxwell <gmaxwell(_at_)gmail(_dot_)com> 
wrote:

On Sun, Mar 22, 2015 at 3:48 PM, Peter Gutmann
<pgut001(_at_)cs(_dot_)auckland(_dot_)ac(_dot_)nz> wrote:
This issue has been known for a long, long time (well, I guess not by the
OpenSSL authors :-)

Yes, it was known by me in advance of that CVE as well.

In other words the PKIX approach is to decide on a wrong solution
(blacklists), and then to break other things (certificate IDs) in order to
perpetuate the wrong solution.

And yet, at the end of the day, users who thought they were secure
are left insecure.

How many years of compromises must people be subjected to before we,
as an industry, become mature enough to develop systems that remain
secure _in practice_ in the face of design and implementation errors,
by avoiding designs which have repeatedly resulted in breaks and by
defending in depth?

We cannot know in advance what procedures and protocols people will
build in the future. If our abstractions are less safe -- if they
have a large amount of surprising hidden behavior, such as
non-canonical encodings -- then the review burden for anyone
attempting to build on them becomes unreasonably large and the amount
of failure will increase. The true complexity of a modern application
is beyond what any one mind can fully grasp at one time; we all must
manage complexity by abstraction, and some designs lead to safer
abstractions than others.
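
To make that concrete with an example close to home (a sketch with
hand-built bytes per my reading of RFC 4880 section 4.2.1, not taken
from any implementation): old-format OpenPGP packet headers let the
same body length be written with one, two, or four length octets, so
one logical packet has several valid byte representations, and
anything that treats the raw bytes (or a hash of them) as the
packet's identity silently inherits that surprise.

    import hashlib

    TAG = 11                                   # literal data packet
    body = b'\x62\x00' + bytes(4) + b'hello'   # 'b' format, no name, time 0

    def old_header(tag, length_type, body_len):
        # Old format: 10TTTTLL, followed by 1, 2, or 4 length octets.
        octets = {0: 1, 1: 2, 2: 4}[length_type]
        return bytes([0x80 | (tag << 2) | length_type]) + \
               body_len.to_bytes(octets, 'big')

    # Three different byte strings, all the same packet:
    for lt in (0, 1, 2):
        enc = old_header(TAG, lt, len(body)) + body
        print(enc.hex(), hashlib.sha256(enc).hexdigest()[:16])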

That an approach taken by a downstream user of a cryptosystem design,
like blacklisting, was stupid and wrong may also be true, and it's
fine to say so when it is... But that fact does not excuse specifying
a protocol which is a footgun when the footgun could have been
avoided at little cost (or, in the case of BER, at lower cost, since
a complete BER implementation is very complex). People will do stupid
things from time to time; if our cryptosystems can only be secure
under completely perfect use then we might as well give up and go
home, because perfect use will not happen and demanding it at all
times is an unreasonable cost which can easily outweigh the benefits
of the tools.

Sometimes there is a trade-off where there is an exclusive choice
between a valuable feature and a footgun. In those cases it is often
reasonable to accept the footgun.

I have _never_ seen such an argument for overcomplete encodings,
other than compatibility with legacy systems (for cryptographic tools
that compatibility is inevitably lost for other reasons, like the
legacy systems being insecure). Overcomplete encodings massively
increase the review and testing burden (the usual response is to
simply fail to test sufficiently) and as a result they hide bugs.
They inherently increase the communication overhead when they are
used (not that it matters much; the contexts where they come up are
usually very inefficient to begin with), while subsetting them out
(as DER does to BER) hardly increases the overhead compared to
'optimal' use.
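
To put a number on 'overcomplete' (a sketch with hand-built bytes,
based on my reading of X.690, so treat it as an illustration rather
than a reference): the same INTEGER content admits ever-longer BER
length forms, while DER permits exactly the first, minimal one.

    def tlv(tag, value, length_octets=0):
        # length_octets == 0 -> short-form length (the DER form for
        # values under 128 octets); otherwise long form, padded out.
        if length_octets == 0:
            return bytes([tag, len(value)]) + value
        return bytes([tag, 0x80 | length_octets]) + \
               len(value).to_bytes(length_octets, 'big') + value

    value = bytes([0x7F])                      # INTEGER 127
    for n in (0, 1, 2, 4):                     # n == 0 is the DER form
        enc = tlv(0x02, value, n)
        print(enc.hex(), '-', len(enc), 'octets')

Accepting only the first form costs a receiver essentially nothing in
size or code; accepting all of them is pure decoder surface area to
review and test.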

Sadly, it is infeasible to uncover in advance every corner case in a
spec that will surprise people and contribute to vulnerabilities; but
in cases where we have /seen/ problems in the wild we should not
respond by blaming the victims who misused the fragile constructions.
Once we know a construction is fragile, we should avoid it where
possible.

_______________________________________________
openpgp mailing list
openpgp(_at_)ietf(_dot_)org
https://www.ietf.org/mailman/listinfo/openpgp
