The charCodeAt() method returns an integer representing the UTF-16 code unit at the given index.
Syntax
charCodeAt(index)
index: The index (position) of the character whose code unit should be returned.
The first position is 0, the second is 1, and so on.
Negative numbers cannot be used to select characters from the end of the string (see the sketch below).
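For illustration, a minimal sketch of these indexing rules (the sample string is an arbitrary assumption):

const greeting = "ABC";
console.log(greeting.charCodeAt(0));  // 65, the code unit for "A"
console.log(greeting.charCodeAt(2));  // 67, the code unit for "C"
console.log(greeting.charCodeAt(-1)); // NaN, negative indices are not supported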
charCodeAt() returns a number between 0 and 65535 representing the UTF-16 code unit at the given index.
charCodeAt() always indexes the string as a sequence of UTF-16 code units, so it may return lone surrogates.
To get the full Unicode code point at the given index, use the codePointAt() method.
Both methods return an integer, but only codePointAt() can return the full value of a code point greater than 0xFFFF (65535), that is, a number in the range 0 to 1114111 (0x10FFFF).
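A short sketch of the difference, using a character outside the Basic Multilingual Plane (U+1F600); the sample value is chosen purely for illustration:

const face = "\u{1F600}";         // "😀", stored as the surrogate pair 0xD83D 0xDE00
console.log(face.charCodeAt(0));  // 55357 (0xD83D), a lone high surrogate
console.log(face.charCodeAt(1));  // 56832 (0xDE00), a lone low surrogate
console.log(face.codePointAt(0)); // 128512 (0x1F600), the full code point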
The charCodeAt() method returns NaN if the given index is out of range.
The method is called on a string value: myString.charCodeAt(index).
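A brief illustration of the out-of-range behaviour; the variable name myString mirrors the form above, and the string value is an arbitrary assumption:

const myString = "Hi";
console.log(myString.charCodeAt(0)); // 72, the code unit for "H"
console.log(myString.charCodeAt(5)); // NaN, index is past the end of the string
console.log("".charCodeAt(0));       // NaN, an empty string has no code units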