David Barr <barr(_at_)visi(_dot_)com> writes:
> It doesn't take much common sense to see those people putting 8-bit data
> in headers are precisely the ones who think a) 8-bits in headers work
> (for their purposes)
Yes.
> and thus b) would be most likely to adopt an 8-bit standard.
I don't see how this necessarily follows. In fact, I believe that the
people who are currently using only ASCII are more likely to adopt a UTF-8
standard than the people who are currently sending 8-bit data in headers,
which is exactly the opposite of your point (b).
> Most importantly they would likely be the *least* likely to accept going
> back to 7 bits and encoding if they already have experience that 8-bits
> work.
This is probably true, yes.
> Given the history of how many standards have evolved in the past, I
> can't see much reason to agree with you. Protocols are *full* of
> examples of this sort of evolution, whereby technically illegal (but
> largely working) practices evolved into an accepted form.
Yes. Done by standardizing what people are already doing.
Standardizing UTF-8 in headers is not standardizing what people are
already doing. There are considerably fewer people using UTF-8 in headers
than there are people who are using RFC 2047.
If you were talking about standardizing sending local 8-bit character sets
in headers, I could see how this argument would apply, but given that
we're talking about going from one workable but somewhat unpopular
standard to a brand-new standard that bears no resemblance to what people
are doing now, I don't believe the "evolution of standards by
standardizing technically invalid but existing practice" argument applies
at all.
--
Russ Allbery (rra(_at_)stanford(_dot_)edu)
<http://www.eyrie.org/~eagle/>