JavaScript String fromCharCode() Method
Returns a string created from a sequence of UTF-16 code units

The String.fromCharCode() static method returns a string created from a sequence of UTF-16 code unit values.

Syntax

String.fromCharCode(value1, value2, /* …, */ valueN)

value1, …, valueN
A number between 0 and 65535 (0xFFFF) inclusive, representing a UTF-16 code unit.
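For example, passing several code units produces one string, with each code unit converted to a character and the results concatenated:

```javascript
// Each argument is one UTF-16 code unit; the resulting
// characters are concatenated into a single string.
console.log(String.fromCharCode(72, 69, 76, 76, 79)); // "HELLO"

// Code units can also be written in hexadecimal.
console.log(String.fromCharCode(0x48, 0x69)); // "Hi"
```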

Numbers greater than 0xFFFF are truncated to the last 16 bits. No validity checks are performed.
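The truncation can be seen by passing a value above 0xFFFF, which yields the same character as its low 16 bits:

```javascript
// 65 is the code unit for "A".
console.log(String.fromCharCode(65)); // "A"

// 65601 is 0x10041; only the low 16 bits (0x0041 = 65) are kept,
// so the result is still "A". No error is thrown.
console.log(String.fromCharCode(65601)); // "A"
```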

The String.fromCharCode() method converts UTF-16 code unit values to characters and returns them as a single string.

Because fromCharCode() is a static method of String, it is always used as String.fromCharCode() rather than as a method of a string value you have created.

Unicode values are UTF-16 values (16-bit integers between 0 and 65535) that are converted to characters and can be concatenated together to form a string.
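One common round trip is to read a string's code units with charCodeAt() and then rebuild the string from them:

```javascript
// Read each character's code unit, then pass the codes back
// to fromCharCode() to reconstruct the original string.
const codes = Array.from("HELLO", (ch) => ch.charCodeAt(0));
console.log(codes); // [72, 69, 76, 76, 79]
console.log(String.fromCharCode(...codes)); // "HELLO"
```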

Values 0 through 31 are control characters (for example, 9 is tab and 10 is line feed).

Use the String.fromCodePoint() method if any of the Unicode values you wish to convert are not representable in a single UTF-16 code unit.
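Characters outside the Basic Multilingual Plane, such as most emoji, need two code units (a surrogate pair) with fromCharCode(), whereas fromCodePoint() accepts the code point directly:

```javascript
// "😀" (U+1F600) is above 0xFFFF, so fromCharCode() requires
// the surrogate pair 0xD83D 0xDE00:
console.log(String.fromCharCode(0xd83d, 0xde00)); // "😀"

// fromCodePoint() takes the full code point in one argument:
console.log(String.fromCodePoint(0x1f600)); // "😀"
```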
