The codePointAt() method returns the Unicode code point of the character at a specified index in a string.
Syntax
codePointAt(index)

index: The index (position) of the string character to be returned.
Note that the index is still based on UTF-16 code units, not Unicode code points.
The first position is 0, the second is 1, and so on.
Negative numbers cannot be used to select characters from the end of the string.
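For example (the string literal below is just for illustration):

const text = "HELLO WORLD";
// The character at position 0 is "H", whose Unicode code point is 72
console.log(text.codePointAt(0));   // 72
// A negative index does not count from the end; it simply returns undefined
console.log(text.codePointAt(-1));  // undefined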
The codePointAt() and charCodeAt() methods are similar.
The charCodeAt() method returns a number between 0 and 65535 (0xFFFF), so it may return lone surrogates. Only codePointAt() can return the full Unicode value, a number between 0 and 1114111 (0x10FFFF). Code points above 0xFFFF are represented by a pair of 16-bit surrogate pseudo-characters, so the codePointAt() method may return a code point that spans two string indices.
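For example, an emoji such as "😍" (U+1F60D) is stored as two UTF-16 code units, and the two methods report it differently (the values below follow from the UTF-16 encoding):

const emoji = "😍";                 // U+1F60D, stored as the surrogate pair 0xD83D 0xDE0D
console.log(emoji.length);          // 2      (two UTF-16 code units)
console.log(emoji.charCodeAt(0));   // 55357  (0xD83D, a lone high surrogate)
console.log(emoji.codePointAt(0));  // 128525 (0x1F60D, the full code point)
console.log(emoji.codePointAt(1));  // 56845  (0xDE0D, the trailing surrogate)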
The codePointAt() method returns undefined if the given index is out of range (there is no code unit at that position).
myString.codePointAt(index)
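For example, using this call form with an index past the end of the string (myString here is just an illustrative name):

const myString = "AB";
console.log(myString.codePointAt(0));  // 65 ("A")
console.log(myString.codePointAt(5));  // undefined (no character at index 5)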