Benjamin Franz <snowhare(_at_)nihongo(_dot_)org> writes:

> On 8 Jan 2001, Owen Taylor wrote:
>
> > Well, I'd go beyond this and say that it would be nice if Perl would use
> > the system iconv when available - iconv isn't the greatest interface,
> > but it is generally pretty workable, and if people use the system
> > capabilities, then you avoid an explosion of tables.
> I vehemently disagree. I can think of little that would make Unicode
> support in Perl more marginal than not being able to *rely* on a large
> standardized set of conversions (ESPECIALLY the large CJKV tables that
> seem to cause such 'size shock' among the non-aware).
>
> In that universe, not only could the plethora of existing Unicode support
> modules never be deprecated, they would become the de facto
> standard way of doing what Perl *itself* is supposed to be doing with
> Unicode support. It does very little good and considerable harm to have
> 'Unicode' support for only small random subsets of the world's encodings.
> If we are going to tread that path, it would be better to rip core support
> for encodings other than the core Unicode standards
> (UTF-8/UTF-16/UTF-32/UCS-4/UCS-2) completely out and ship Unicode::Map,
> Unicode::Map8, Jcode and Unicode::MapUTF8 in the base distribution
> (possibly after XSing them for performance). Better no direct
> support for national encodings than automatically and *system-dependently*
> broken support.
What I'm suggesting is:
- Use the system iconv when it exists and is sufficiently good
- Use support included with Perl, or an external library such as libiconv
otherwise.
"sufficiently good" could even mean "has the exact same mapping tables as
Perl". It would certainly mean including things like the CJK mapping tables.
This is the approach that Pango and GTK+ take, and it works fine. I
don't really see any reason why, when I'm running Perl/GTK+, it should
load up two copies of the Japanese mapping tables just because some
other operating system doesn't have Japanese mapping tables in its
libc.
Regards,
Owen