
Re: [saag] [Cfrg] Further MD5 breaks: Creating a rogue CA certificate

2008-12-30 17:11:30

--On Tuesday, December 30, 2008 01:39:34 PM -0800 Eric Rescorla <ekr@networkresonance.com> wrote:

At Tue, 30 Dec 2008 12:53:06 -0800,
Paul Hoffman wrote:

At 1:33 PM -0500 12/30/08, Jeffrey Hutzelman wrote:
> This is a practical application of an approach that I remember being
> brought up during discussions about MD5 at a saag meeting some time
> ago.  I also recall someone mentioning at the time that many/most CAs
> were already issuing certificates with random rather than sequential
> serial numbers, which would have thwarted this particular attack.

Your recollection may be off. I believe I was the person who brought
up the serial number hack at the mic, and I'm pretty sure I said
"some", not "many" (and certainly not "most"!). When I looked at a
handful of popular CAs earlier this week, I only found a few who are
using randomization in their serial numbers.

So it's in my deck from SAAG at IETF 62.

http://www.ietf.org/proceedings/05mar/slides/saag-3.pdf

I don't know whether many or most do it. IMO everyone should.

I just checked my records, and shortly after that IETF, our internal CA started issuing certificates with SHA-1 signatures and randomized serial numbers, as a direct result of that discussion.



> RFC 5280 does not include this advice.  It is sound advice that was
> discussed in PKIX and other venues.  Perhaps a BCP is in order.

Man, that is really stretching the definition of "best".

For one, it is only needed in signatures that use known-attackable
hash functions. A "best practice" in that case is to use a better
hash function in the signature. Also, it relies on the software
using the random number being able to ensure that the result is a
positive ASN.1 INTEGER, which seems overly optimistic.

On the contrary, IMHO best practice is to take every reasonable measure to reduce the likelihood of an attack. In my book, that means both using a better hash function _and_ using randomized serial numbers, since the latter clearly helps when the hash is broken, and hash functions tend to become broken over time, sometimes before you realize it.
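To make that concrete, here's a rough sketch (Python, purely
illustrative) of how a CA might pick a serial number that is both
unpredictable and guaranteed to be a positive ASN.1 INTEGER whose DER
encoding stays within the 20 octets RFC 5280 allows:

    import secrets

    def random_serial(num_bits: int = 159) -> int:
        # 159 random bits fit in 20 octets with the sign bit of the
        # leading octet clear, so the DER-encoded INTEGER is always
        # positive and never longer than RFC 5280 permits.
        while True:
            n = secrets.randbits(num_bits)
            if n > 0:  # serialNumber must be a positive INTEGER
                return n

The point is simply that the serial number, which appears early in
the to-be-signed data, becomes unpredictable, so the attacker can't
precompute a colliding certificate body before submitting a request.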



If the IETF feels that adding randomization to signatures is
important, we should define randomized signature functions.

I think that's a very good idea.  However...

I certainly agree we should be encouraging the use of non-broken hash
functions. However, randomizing the SN seems like very cheap backward
compatible insurance against the fact that that's going to take a long
time.

"what he said".



Incidentally, the recently reported problems with CBC mode ciphers in SSH have gotten me thinking that in some situations a single REQUIRED algorithm isn't enough: if something goes wrong and you have to abandon that algorithm in a hurry, operators may be forced to choose between seriously compromising security and seriously compromising interoperability.

In the case of usages like certificates, where no live negotiation is possible and implementations may have to interoperate over a long period of time, I believe additional care is necessary. For example, I think it would be a good idea to define a composite signature function that uses more than one hash, each computed independently. This would likely make some attacks harder, but more importantly, it means that as long as at least one of the underlying hashes is strong enough, the signature remains usable.
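A minimal sketch of what I mean (Python; the particular pair of
hashes is just an example):

    import hashlib

    def composite_digest(tbs: bytes) -> bytes:
        # Two digests computed independently over the same
        # to-be-signed bytes.  A collision for the pair requires a
        # simultaneous collision in BOTH functions, so the pair is
        # at least as collision-resistant as the stronger of the two.
        return hashlib.sha1(tbs).digest() + hashlib.sha256(tbs).digest()

The algorithm identifier would have to name both hashes and relying
parties would have to implement both, but the payoff is that a break
in one hash doesn't force a flag-day migration.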

-- Jeff