On 31 dec 2014, at 08:46, Nico Williams <nico(_at_)cryptonector(_dot_)com>
wrote:
> The NFD case is obnoxious because even on those systems the input system
> tends to produce NFC... But anyways. When you have no canonical form
> for whatever reason, you can try form-insensitive matching.
Ok, got it (and yes, I have been bitten myself a few times on the NFD issues
with HFS+).
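As an aside, a minimal sketch of what "form-insensitive matching" can look like in practice (using Python's stdlib unicodedata; the function name is just illustrative): both sides are normalized to the same form before comparison, so strings that differ only in composition still match.

```python
import unicodedata

def form_insensitive_equal(a: str, b: str) -> bool:
    # Normalize both operands to NFC before comparing; this is one way
    # to match form-insensitively when no canonical form is stored.
    return unicodedata.normalize("NFC", a) == unicodedata.normalize("NFC", b)

# "é" as one precomposed code point vs. "e" + combining acute accent
precomposed = "\u00e9"
decomposed = "e\u0301"
print(precomposed == decomposed)                        # False: code points differ
print(form_insensitive_equal(precomposed, decomposed))  # True
```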
What I think is then needed is for this case:
1. A simple explanation of what you are really talking about
What is the requirement on whom regarding normalization/mapping/whatever?
2. An evaluation of whether the choice is the right one or not
The tricky part regarding the choice of normalization (together with the selection of
allowed code points) is of course whether false positives or false negatives are
the more troublesome outcome when trying to do matching.
I.e., say the matching algorithm is not defined. Is there then a greater risk of
false positives, and because of that possible attacks using some kind of
"hamming-distance"/"homograph" confusion (based on normalization)? If so, the
security considerations section must describe what should not be done, so as not
to increase the risk of such attacks.
Let me just be clear here: I am very much in favor of specifications that say
that "server side matching" should NOT do normalization, as that gives the most
freedom to the applications that use whatever mechanism is defined. But that,
to me, sets requirements on the "client side" to do the right thing (for example,
as in IDNA2008, only being allowed to use certain code points).
So, given your choice on server side matching, what are the requirements on
client side?
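(To illustrate the kind of client-side requirement I mean, here is a hypothetical sketch: the client normalizes to NFC and rejects disallowed code points before sending anything to the server. The allowlist predicate below is deliberately simplistic and illustrative only; a real IDNA2008 implementation derives the permitted set from RFC 5892, not from a hand-written check like this.)

```python
import unicodedata

def client_prepare(label: str) -> str:
    # Hypothetical client-side preparation: normalize to NFC, then reject
    # anything outside a crude illustrative allowlist (letters, digits, "-").
    nfc = unicodedata.normalize("NFC", label)
    for ch in nfc:
        if not (ch.isalnum() or ch == "-"):
            raise ValueError(f"disallowed code point: {ch!r}")
    return nfc

# Decomposed input comes out precomposed and accepted:
print(client_prepare("e\u0301tude"))  # -> 'étude'
```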
Patrik