Paul, Eric and everyone else,
> If the IETF feels that adding randomization to signatures is
> important, we should define randomized signature functions. We could
> start with NIST Draft SP 800-106
> I think that doing so is sending the wrong message: we should
> instead be encouraging the use of non-broken hash functions.
and Eric responded
> I certainly agree we should be encouraging the use of non-broken hash
> functions. However, randomizing the SN seems like very cheap backward
> compatible insurance against the fact that that's going to take a long
> time.
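To make the "very cheap insurance" point concrete, here is a minimal sketch in Python of generating an unpredictable certificate serial number (the 64-bit size and the helper name are illustrative choices, not anything mandated by X.509):

```python
import secrets

def random_serial(bits: int = 64) -> int:
    """Generate an unpredictable, positive certificate serial number.

    Unpredictability denies the attacker control over part of the
    to-be-signed input, which is what frustrates chosen-prefix
    collision attacks against the CA's signature.
    """
    # Force the top bit to 1 so the serial always has the full bit length.
    return secrets.randbits(bits) | (1 << (bits - 1))

serial = random_serial()
```

The cost is one call to a CSPRNG per certificate, which is why it works as a backward-compatible stopgap; the quote below explains why it is still only a stopgap.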
I believe that the best answer to the above arguments regarding the use of
randomized hashing vs. the "patch" of using random SNs is given by Sotirov et
al., the cryptanalysts who carried out the remarkable MD5-certificate attack
(see their website). They say:
*We do note however that this use of randomness in the serial number is a
workaround, made possible by lucky choices in the X.509 standard. It is a
bad idea in general to add randomness to a hash input when a possible
attacker is able to choose the input. A much more reliable, since designed,
solution is to use randomized hashing, see [HK]. Such a solution introduces
randomness as a "mode of operation" for hash functions, which is a much more
robust approach to the problem than relying on features that happen to be
present in existing standards for non-security reasons, or for no reason at
all.*
In this light, I disagree with Paul's statement:
> I think that doing so is sending the wrong message: we should
> instead be encouraging the use of non-broken hash functions.
The two things are not exclusive. We should do BOTH:
Adopt a randomized hashing technique (as a mode of operation for hash
functions) and continue our quest for better hash functions.
We must aim at the best possible hash function, but we cannot guarantee its
security in the long term. As stated in the above text by Sotirov et al,
randomized hashing is a more fundamental (I would call it "infrastructural")
approach. It strengthens digital signatures with any hash function and for
any digital signature application, not just certificates.
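To make "mode of operation" concrete, here is a simplified sketch in the spirit of the RMX transform of [HK]. This is an illustration only, not the exact SP 800-106 encoding (which specifies its own padding and length fields): the signer picks a fresh random value r, prepends it to the message, and XORs it into every message block before hashing, so the signer rather than the attacker controls the actual hash input.

```python
import hashlib
import os

BLOCK = 64  # SHA-256 message block size in bytes

def rmx_hash(message: bytes, r: bytes) -> bytes:
    """RMX-style randomized hash: H(r || (M1 xor r) || (M2 xor r) || ...).

    Sketch only; the real SP 800-106 randomization method uses a
    specific padding and length encoding.
    """
    assert len(r) == BLOCK
    h = hashlib.sha256()
    h.update(r)
    for i in range(0, len(message), BLOCK):
        # Zero-pad the last block (an illustrative choice, not SP 800-106's).
        block = message[i:i + BLOCK].ljust(BLOCK, b'\x00')
        h.update(bytes(a ^ b for a, b in zip(block, r)))
    return h.digest()

# The signer picks r at signing time and transmits it with the signature;
# the verifier recomputes rmx_hash with the same r.
r = os.urandom(BLOCK)
sig_input = rmx_hash(b"message to be signed", r)
```

The point of the mode is that an attacker who prepares colliding messages offline cannot predict r, so precomputed collisions on the underlying hash function no longer translate into signature forgeries.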
Let's take the example of HMAC: Its development in the mid-90's was motivated
to a large extent by the weaknesses found in MD5 by Dobbertin and others.
If we took the approach of "let's use a better hash function," we would have
adopted the key-append method
MAC(K,X) = SHA1(X||K)
which would have used SHA1, then widely believed to be a very good
collision-resistant hash function. However, had we done that, we would now
have a broken MAC, since the above design breaks given collisions on the
underlying hash function.
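The failure is structural: in an iterated (Merkle-Damgard) hash, two equal-length colliding messages leave the chaining value identical, so any common suffix, including the appended key, still collides. The following toy demonstration (a hypothetical 16-bit compression function, deliberately weak so a collision can be found by brute force) makes this concrete:

```python
import hashlib
from itertools import count

BLK = 4  # toy block size in bytes

def compress(state: int, block: bytes) -> int:
    """Toy 16-bit compression function (weak on purpose, so collisions
    are easy to find; real hashes fail the same way once collisions
    become findable)."""
    d = hashlib.sha256(state.to_bytes(2, 'big') + block).digest()
    return int.from_bytes(d[:2], 'big')

def md_hash(msg: bytes, state: int = 0) -> int:
    """Toy Merkle-Damgard iteration (length padding omitted for clarity)."""
    assert len(msg) % BLK == 0
    for i in range(0, len(msg), BLK):
        state = compress(state, msg[i:i + BLK])
    return state

def keyed_mac(key: bytes, msg: bytes) -> int:
    """The key-append construction MAC(K, X) = H(X || K)."""
    return md_hash(msg + key)

# Brute-force a collision in the weak hash: two distinct one-block
# messages with the same digest (an attacker needs no key for this).
seen = {}
for i in count():
    m = i.to_bytes(BLK, 'big')
    h = md_hash(m)
    if h in seen:
        x1, x2 = seen[h], m
        break
    seen[h] = m

key = b'secretk!'  # the attacker never sees this
assert x1 != x2 and md_hash(x1) == md_hash(x2)
# The collision carries through the appended key: a MAC forgery for free.
assert keyed_mac(key, x1) == keyed_mac(key, x2)
```

After the colliding blocks, both computations reach the same chaining state, so processing the secret suffix K produces the same tag for both messages. HMAC's keyed envelope avoids handing the attacker this direct collision-to-forgery path.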
I believe that the responsible course of action for the IETF and
SAAG is to adopt the standardization process started by NIST with SP 800-106
and create a deployment path that could accommodate a randomized hashing
approach as a mode of operation. This includes the consideration of randomized
hashing in protocol changes and in new protocol designs that support algorithm
agility.
No one in the world will think that by doing that we should keep using MD5,
or not pay attention to NIST's hash competition, or stop moving to better
hash functions.
The just-published attack indicates that it is time that we take seriously
the long-term security of digital signatures, and randomized hashing is the
best long-term insurance we know against future collision vulnerabilities.
You can find more details on the specific randomized hashing approach behind
NIST's document at http://www.ee.technion.ac.il/~hugo/rhash/
In particular, some of the documents on that site provide guidance
regarding implementation and deployment issues.