Why don't scripting languages output Unicode to the Windows console?
Posted by hippietrail on Stack Overflow
Published on 2011-02-09T07:23:36Z
The Windows console has been Unicode-aware for at least a decade, perhaps as far back as Windows NT. Yet for some reason the major cross-platform scripting languages, including Perl and Python, only ever output various 8-bit encodings, requiring much trouble to work around: Perl gives a "wide character in print" warning, Python gives a charmap error and quits. Why on earth, after all these years, do they not simply call the Win32 -W APIs that output UTF-16 Unicode, instead of forcing everything through the ANSI/code-page bottleneck?
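The charmap failure mentioned above is easy to reproduce without a Windows console: Python's legacy Windows code pages (cp1252, cp437, etc.) are implemented by the internal "charmap" codec, and any character outside the 8-bit repertoire fails to encode. A minimal sketch:

```python
# Reproduce the "charmap" error the question describes. When stdout is
# bound to a legacy Windows code page such as cp1252, characters outside
# that 8-bit repertoire cannot be encoded.
snowman = "\u2603"  # SNOWMAN, not representable in cp1252

try:
    snowman.encode("cp1252")
    error_codec = None
except UnicodeEncodeError as exc:
    # The exception names the underlying 'charmap' codec, which is what
    # surfaces in the console error message.
    error_codec = exc.encoding

# Characters inside the code page round-trip fine; the bottleneck only
# bites on everything else.
cafe = "caf\u00e9".encode("cp1252")  # é is 0xE9 in cp1252
```

This is why output "works" until the first character that happens to fall outside the active code page.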
Is it just that cross-platform performance is low priority? Is it that the languages use UTF-8 internally and find it too much bother to output UTF-16? Or are the -W APIs inherently broken to such a degree that they can't be used as-is?
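For what the -W route would look like in practice, here is a hedged sketch of calling the wide-character console API directly from Python via ctypes. `WriteConsoleW` and `GetStdHandle` are documented Win32 functions; the non-Windows fallback and the function name `write_unicode` are assumptions of this sketch, not anything the languages actually ship.

```python
# Sketch: bypass the ANSI/code-page layer on Windows by calling the
# wide-character console API (WriteConsoleW) via ctypes. On other
# platforms, fall back to ordinary stdout output.
import ctypes
import sys

def write_unicode(text: str) -> int:
    """Write text to the console; return the number of characters written."""
    if sys.platform == "win32":
        # -11 is STD_OUTPUT_HANDLE in the Win32 API.
        handle = ctypes.windll.kernel32.GetStdHandle(-11)
        written = ctypes.c_ulong(0)
        # Note: len(text) equals the UTF-16 code-unit count only for BMP
        # characters; a full implementation would account for surrogate pairs.
        ctypes.windll.kernel32.WriteConsoleW(
            handle, text, len(text), ctypes.byref(written), None
        )
        return written.value
    # Elsewhere, the terminal typically accepts UTF-8 directly.
    return sys.stdout.write(text)
```

One caveat the sketch glosses over: `WriteConsoleW` only works when the handle really is a console, so redirected output (pipes, files) still needs the encoding path, which is part of why interpreter authors have been reluctant to special-case it.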
© Stack Overflow or respective owner