If a sieve implementation is not subject to this bug, then most
likely it could support \0 pretty easily. Otherwise, some work is
required anyway to fix the bug.
I suggest you propose an extension. An implementation that follows
the current RFC should not be rendered incompatible.
I thought about an extension, too, but what should be the result
of the following three tests (as specified by the RFC), given
those comparisons, if we have the header
Subject: =?iso-8859-1?q?abc=00def?=
and the tests:
header :contains ["Subject"] ["abc"]
header :contains ["Subject"] ["def"]
header :matches ["Subject"] ["abc?def"]
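For illustration, a small Python sketch (an assumption of mine, not part
of the thread; fnmatch is used here only as an approximation of Sieve's
":matches" glob semantics) shows that once the encoded word is decoded,
all three tests should succeed:

```python
from email.header import decode_header
from fnmatch import fnmatchcase

# Decode the RFC 2047 encoded word; =00 is a Q-encoded NUL byte.
raw, charset = decode_header("=?iso-8859-1?q?abc=00def?=")[0]
subject = raw.decode(charset)          # 'abc\x00def'

print("abc" in subject)                 # :contains "abc"    -> True
print("def" in subject)                 # :contains "def"    -> True
print(fnmatchcase(subject, "abc?def"))  # :matches "abc?def" -> True
```

The "?" wildcard in ":matches" stands for exactly one character, which
here is the NUL in the middle of the decoded subject.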
An implementation that evaluates the second or third test as false is
broken, isn't it? To fix that, adding NUL support is required. It
would not shed a good light on Sieve if the above tests only evaluate
as true when the script begins with
require "no_NUL_bug";
And even that would require conforming implementations to artificially
introduce the bug when that extension is not selected.
If you want the "\0" escape, you can't express U+00EF followed by a
literal zero:
\uef\0 => U+00EF NUL
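The ambiguity can be sketched as follows (hypothetical syntax, my
assumption: a variable-length \u escape that greedily consumes up to
four hex digits, plus a \0 escape for NUL):

```python
import re

# Hypothetical unescaper: \uH..H (1-4 hex digits, greedy) and \0 -> NUL.
def unescape(s):
    s = re.sub(r'\\u([0-9a-fA-F]{1,4})',
               lambda m: chr(int(m.group(1), 16)), s)
    return s.replace(r'\0', '\x00')

print(unescape(r'\uef\0') == '\u00ef\u0000')  # True: U+00EF then NUL
print(unescape(r'\uef0') == '\u0ef0')         # True: greedy parse eats the zero
print(unescape(r'\uef0') == '\u00ef' + '0')   # False: U+00EF "0" is inexpressible
```

With this grammar there is simply no spelling for U+00EF followed by
the digit zero, which is the objection raised above.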
"\u0" is so short that "\0" is superfluous; the number of escapes
should be kept to a minimum.
Not entirely. RFC 3028 allows an implementation to omit UTF-8
comparisons, which means the input script may be in UTF-8 while the
implementation is not aware of that, and it makes no sense to use
anything but ASCII characters in that case. That allows a simple
implementation, and such an implementation probably would not
understand \u escapes, but it still requires \0 to encode the NUL
character.
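A minimal sketch of that point (hypothetical helper, my assumption):
an ASCII-only implementation with no \u support still needs some
escape to get a NUL into a string literal, and \0 fills that role.

```python
# Hypothetical unescaper for an ASCII-only Sieve implementation:
# no \u support, but \0 is still needed to write a NUL in a script.
def unescape_ascii(s):
    return s.replace(r'\0', '\x00')

needle = unescape_ascii(r'abc\0def')
print('\x00' in needle)   # True: the literal now contains a real NUL
```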
Michael