I guess one of the first questions should be: "Is some partitioning of the
Internet community such a bad thing?"...
If the "partition" intended for discussion is "@sign vs !path" addressing
conventions, Eric Allman and Peter Honeyman have left a discussion archive
on the subject. Arguably the universalist thesis understated the drawbacks
of anyone having the capability of addressing everyone anywhere. Clueless
users are only one possible policy model -- a point made by Peter then, and
equally valid now.
Personally I'm underwhelmed by the universalism advocated by the members
of the UNICODE Consortium: a single encoding scheme of necessity comes to
peripheral markets late in their adoption of computerized writing systems,
and their integration into a rationalized global system is not obviously a
boon to their pre-integration service models.
PS: I think it is without doubt that it is a Good Thing that we make
efforts to internationalize protocols ...
Even less satisfactory is the practice of generalizing ASCII (nee BCD) to
encodings with more than 256 code points via this universalist scheme and
no other. The advance from ASCII to ASCII-plus-UTF8 could just as well be
characterized as deprecating SJIS/GB/Big5/... (and their uses).
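To make the trade-off concrete: ASCII data survives the move to UTF-8 byte-for-byte, while text already stored in the legacy CJK encodings does not. A minimal sketch using Python's standard codecs (the character U+6F22 and the codec choices here are illustrative, not drawn from the original discussion):

```python
# -*- coding: utf-8 -*-
# The same ideograph (U+6F22) yields different byte sequences under
# UTF-8 and each legacy multi-byte encoding.
text = "\u6f22"  # CJK ideograph, present in JIS, GBK, and Big5

utf8 = text.encode("utf-8")      # 3 bytes
sjis = text.encode("shift_jis")  # 2 bytes, JIS X 0208 layout
gbk  = text.encode("gbk")        # 2 bytes, GB layout
big5 = text.encode("big5")       # 2 bytes, Big5 layout

# ASCII is a strict subset of UTF-8, so pure-ASCII data needs no change...
assert "abc".encode("utf-8") == b"abc"

# ...but the legacy encodings disagree with UTF-8 (and with each other),
# so existing data and the software keyed to it must be converted.
print(utf8.hex(), sjis.hex(), gbk.hex(), big5.hex())
```

This is the sense in which "ASCII-plus-UTF8" is painless for ASCII-only users and a migration burden for the installed SJIS/GB/Big5 bases.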
... my comments/questions are an
attempt to explore how far this process can reasonably go.
The i18n problem isn't trivial, and isn't advanced by problematic essays,
good intentions, or American (actual and honorary) indulgences.
On the up-side, large user bases need not adapt to extraneous requirements
for participating in the "Internet community", and Universalist Credos may
fail in the markets (plural intended).
As for poking the ICANN mess in the eye with a sharpened brush on the IETF
list prior to a meeting, it is clumsy sleight-of-hand and a poor substitute
for work on writing system support. See also the W3C WAI for information
encoding and presentation systems which are not "writing".