ietf-openpgp

Re: Suggested changes for DSA2

2006-03-28 02:14:58

On Mon, Mar 27, 2006 at 03:22:15PM -0800, "Hal Finney" wrote:

David writes:
For implementation of signature verification you can just take p and q
straight from the public key.  You don't need to guess since the key
has all the information you need.

With signatures, it is the verifier more than the signer who is vulnerable
and who needs to be protected.  The problem is that as the verifying
software it is my responsibility to provide some level of assurance to
the user about how strong this signature is.

Right, but it still boils down to whether or not the verifier trusts a
certain public key. Thus, the decision needs to be made on a per-key, rather
than per-signature basis. I am not arguing here; it is just a remark.
 
Right now at best we only report the key size.  I'd like to make sure that
q is as strong as p.  Otherwise we might see a 4096 bit key with a 160 bit
q, so it is really no stronger than a 1024 bit key.

That is not quite precise, either. Increasing the size of q and increasing the
size of p protect against two different attacks. A large q protects against
(optimized) brute-force or random guessing of the discrete logarithm, while a
large p protects against sieve methods. The relative strength of the two
attacks is not easily assessed. Sieve methods are getting better and better.
Thus, in the future it may very well happen that the balanced choice will be
4096 bits for p and 160 bits for q.
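To make the two cost estimates above concrete, here is a back-of-the-envelope
sketch (not anything from the draft) using the standard heuristics: a generic
Pollard-rho attack on the subgroup costs about sqrt(q) group operations, while
index-calculus/GNFS attacks on the modulus cost roughly L_p[1/3, 1.923]. The
constants are rough, the o(1) term is ignored, and memory costs are ignored
entirely, which is exactly the caveat discussed here:

```python
import math

def rho_bits(q_bits):
    """Generic (Pollard-rho) attack on the order-q subgroup: ~sqrt(q) steps."""
    return q_bits / 2

def gnfs_bits(p_bits):
    """Very rough asymptotic GNFS cost for a p_bits-bit modulus,
    L_p[1/3, 1.923] with the o(1) term dropped."""
    ln_p = p_bits * math.log(2)
    return 1.923 * ln_p ** (1 / 3) * math.log(ln_p) ** (2 / 3) / math.log(2)

def effective_bits(p_bits, q_bits):
    """The attacker simply takes the cheaper of the two attacks."""
    return min(gnfs_bits(p_bits), rho_bits(q_bits))

for p, q in [(1024, 160), (4096, 160), (4096, 256)]:
    print(f"|p|={p:5d} |q|={q}: ~{effective_bits(p, q):.0f} bits")
```

Under this crude model, (|p|=1024, |q|=160) and (|p|=4096, |q|=160) come out
at the same ~80 bits, since the 160-bit q is the cheaper target in both cases;
which attack improves faster over time is what shifts the balance point.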

As desirable as describing strength in some one-dimensional quantity is, it is
hardly possible. NIST's choice of matching modulus sizes and group orders
reflects a balancing of the time costs of state-of-the-art attacks, with no
regard to memory costs. (By the way, the same approach is reflected in
declaring 3DES as strong as a 112-bit cipher -- as if 2^62 bits of memory
were free.) This is a long-standing tradition in the mainstream crypto
community, but there is no universal consensus about it.

It is hard to report
to the user how strong a signature by that key should be considered to be.

Yes, it is. That is one reason not to reflect _our_ judgement (even if we
could ever come to an agreement) about it in the standard.

This problem goes away if we standardize on the q sizes that go with
certain p sizes.  That's what I'd like to do.  Any keys that break the
rules would be considered invalid. 

No, it won't go away. Moreover, why would you declare |p|=1024, |q|=160
valid but |p|=4096, |q|=160 invalid, when the second choice is clearly no
weaker than the first one?

Putting lower limits on both the modulus size and the group order makes more
sense, but that also does not merit more than a passing remark in the
standard. IMHO, of course.
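A minimal sketch of what such a lower-bounds check could look like on the
verifier's side, reading |p| and |q| straight off the public key. The function
name and the threshold values are purely illustrative assumptions, not taken
from the draft:

```python
def dsa_params_acceptable(p, q, min_p_bits=1024, min_q_bits=160):
    """Per-key policy check: enforce lower bounds on the modulus size and
    the subgroup order, rather than fixing valid (|p|, |q|) pairs.
    The thresholds are illustrative defaults, not mandated by the standard."""
    return p.bit_length() >= min_p_bits and q.bit_length() >= min_q_bits
```

This keeps the policy decision local to the verifier (per key, as noted
above) and never rejects a key merely because its p is larger than some
table entry anticipates.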

-- 
Daniel