> I think this is where the communication breakdown occurred, our use of
> the "ASCII" term. In PC lingo (and in all PC based literature since
> about 1986 on the subject), ASCII is assumed to be the PC character set
> for those of us from the "DOS/Windows world". To unix folks, "ASCII"
> means something different -- a richer and more specific definition of
> what it refers to (which is why I and some unix folks seem to talk past
> each other on this topic).
ASCII has meant, from the prehistory of computing on, the American Standard
Code for Information Interchange. It is a 7-bit code, represented in octets
with the leading bit a parity bit for error detection. Being a true,
dyed-in-the-wool standard, it was adopted by most computer, terminal,
printer, modem, ... manufacturers. As an operating system that runs on a
wild variety of hardware, and talks to an even more bewildering variety of
peripherals, Unix (and thus Linux) by and large uses true ASCII (but adds a
few quirks of its own, like using '\n' and not "\r\n" to end a line).
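
To make the parity-bit point concrete, here is a minimal C sketch (my own
illustration, nothing lifted from the standard): it fills the "spare"
eighth bit of the octet with even parity over the seven data bits, the way
serial links once used it for error detection.

#include <stdio.h>

/* Set even parity in bit 7 of a 7-bit ASCII character. */
static unsigned char add_even_parity(unsigned char c)
{
	unsigned char bits = c & 0x7f;	/* the seven ASCII data bits */
	int ones = 0;
	int i;

	for (i = 0; i < 7; i++)
		ones += (bits >> i) & 1;

	/* set bit 7 so the total count of 1 bits comes out even */
	return bits | ((ones & 1) ? 0x80 : 0x00);
}

int main(void)
{
	/* 'C' is 0x43: three 1 bits, so the parity bit gets set */
	printf("0x%02x\n", add_even_parity('C'));	/* prints 0xc3 */
	return 0;
}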
Then along came the PC, which had no use for the parity bit, and bingo:
yet another non-standard was born, reusing that bit to make room for a new
batch of characters. The copious PC-centered literature (which you
mention, and which is mostly clueless anyway) deluded a whole generation
into believing that this was ASCII. The same literature also assumed every
screen could handle its non-ASCII characters, which is _very_ far from the
truth. There are even terminals that can't handle the full ASCII set!
Thankfully those are now on the brink of extinction.
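
Since those PC characters live exactly in the repurposed eighth bit,
telling them apart from real ASCII takes one mask. Another minimal sketch
(again my own example, not anyone's canonical tool): scan a buffer and
flag every byte with the high bit set.

#include <stdio.h>

/* Report every byte that cannot be ASCII (high bit set). */
static void report_non_ascii(const unsigned char *buf, size_t len)
{
	size_t i;

	for (i = 0; i < len; i++)
		if (buf[i] & 0x80)
			printf("offset %lu: 0x%02x is not ASCII\n",
			       (unsigned long)i, buf[i]);
}

int main(void)
{
	/* 0xb3 is a box-drawing character in the PC set (CP437) */
	const unsigned char sample[] = { 'h', 'i', 0xb3, '\n' };

	report_non_ascii(sample, sizeof(sample));
	return 0;
}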
ascii(7) elaborates.
--
Horst von Brand                             vonbrand@sleipnir.valparaiso.cl
Casilla 9G, Viña del Mar, Chile                               +56 32 672616