arrays) use zero-based indexing: the first 16-bit value is at position 0, the second at
position 1, and so on. JavaScript does not have a special type that represents a single
element of a string. To represent a single 16-bit value, simply use a string that has a
length of 1.
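For example (the variable s and the values shown here are our own illustration, not
part of the original text):

var s = "hello";
s.charAt(0)         // => "h": the 16-bit value at position 0
s.charAt(4)         // => "o": the 16-bit value at position 4
s.charAt(0).length  // => 1: a "character" is just a string of length 1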
JavaScript uses the UTF-16 encoding of the Unicode character set, and JavaScript
strings are sequences of unsigned 16-bit values. The most commonly used Unicode
characters (those from the “basic multilingual plane”) have codepoints that fit in
16 bits and can be represented by a single element of a string. Unicode characters whose
codepoints do not fit in 16 bits are encoded following the rules of UTF-16 as a sequence
of two 16-bit values known as a “surrogate pair.” This means that a JavaScript string
of length 2 (two 16-bit values) might represent only a single Unicode character:
var p = "π"; // π is 1 character with 16-bit codepoint 0x03c0
var e = "𝑒"; // 𝑒 is 1 character with 17-bit codepoint 0x1d452
p.length // => 1: p consists of 1 16-bit element
e.length // => 2: UTF-16 encoding of 𝑒 is 2 16-bit values: "\ud835\udc52"
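You can inspect the two halves of the surrogate pair directly. In the sketch below,
charCodeAt() returns the 16-bit value at a given position, while codePointAt()
(available in ES6 and later) returns the full codepoint:

e.charCodeAt(0).toString(16)  // => "d835": the first 16-bit value (high surrogate)
e.charCodeAt(1).toString(16)  // => "dc52": the second 16-bit value (low surrogate)
e.codePointAt(0).toString(16) // => "1d452": the full 17-bit codepoint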
The various string-manipulation methods defined by JavaScript operate on 16-bit val-
ues, not on characters. They do not treat surrogate pairs specially, perform no normal-
ization of the string, and do not even ensure that a string is well-formed UTF-16.
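As a concrete sketch of these 16-bit semantics (continuing with the variable e from the
example above):

e.charAt(0)       // => "\ud835": half of a surrogate pair, not a full character
e.slice(0, 1)     // => "\ud835": slice() counts 16-bit values, not characters
"x" + e.charAt(1) // a string containing an unpaired low surrogate: malformed UTF-16, but allowed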