
3.2 Text

A string is an immutable ordered sequence of 16-bit values, each of which typically
represents a Unicode character—strings are JavaScript’s type for representing text. The
length of a string is the number of 16-bit values it contains. JavaScript’s strings (and its
arrays) use zero-based indexing: the first 16-bit value is at position 0, the second at
position 1, and so on. The empty string is the string of length 0. JavaScript does not have
a special type that represents a single element of a string. To represent a single 16-bit
value, simply use a string that has a length of 1.
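For example, a minimal sketch of indexing and length (the string chosen here is purely illustrative):

var s = "hello";     // An ordinary string of five 16-bit values
s.length             // => 5
s.charAt(0)          // => "h": the first 16-bit value, at position 0
s.charAt(s.length-1) // => "o": the last value, at position length-1
"h".length           // => 1: a single 16-bit value is just a string of length 1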

Characters, Codepoints, and JavaScript Strings

JavaScript uses the UTF-16 encoding of the Unicode character set, and JavaScript
strings are sequences of unsigned 16-bit values. The most commonly used Unicode
characters (those from the “basic multilingual plane”) have codepoints that fit in
16 bits and can be represented by a single element of a string. Unicode characters whose
codepoints do not fit in 16 bits are encoded following the rules of UTF-16 as a sequence
(known as a “surrogate pair”) of two 16-bit values. This means that a JavaScript string
of length 2 (two 16-bit values) might represent only a single Unicode character:
var p = "π"; // π is 1 character with 16-bit codepoint 0x03c0
var e = "𝑒"; // 𝑒 is 1 character with 17-bit codepoint 0x1d452
p.length     // => 1: p consists of 1 16-bit element
e.length     // => 2: UTF-16 encoding of 𝑒 is 2 16-bit values: "\ud835\udc52"
The various string-manipulation methods defined by JavaScript operate on 16-bit
values, not on characters. They do not treat surrogate pairs specially, perform no
normalization of the string, and do not even ensure that a string is well-formed UTF-16.
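Continuing the example above, a brief sketch of how these methods index raw 16-bit values rather than characters:

var e = "𝑒";      // One character, but two 16-bit values
e.charCodeAt(0)    // => 55349 (0xd835): the first half of the surrogate pair
e.charCodeAt(1)    // => 56402 (0xdc52): the second half
e.charAt(0)        // => "\ud835": an unpaired surrogate, not well-formed UTF-16
e.substring(0,1)   // => "\ud835": substring() splits the pair without complaint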