How to Convert a Character to ASCII Code in JavaScript

To convert a character to its ASCII code in JavaScript, you can use the string charCodeAt() method or the codePointAt() method.

Method 1: Using charCodeAt()

The string charCodeAt() method returns an integer between 0 and 65535, representing the UTF-16 code unit at the given index.

let char = 'K';
let asciiCode = char.charCodeAt(0);
console.log(asciiCode);

Output

75
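If you need the code of every character in a string, you can call charCodeAt() on each index. Here is a minimal sketch; the string 'Koala' and the helper name toCharCodes are just illustrative choices, not part of the original example:

// Hypothetical helper: map every character (UTF-16 code unit) to its numeric code
function toCharCodes(str) {
  return str.split('').map((ch) => ch.charCodeAt(0));
}

console.log(toCharCodes('Koala')); // [75, 111, 97, 108, 97]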

Method 2: Using codePointAt()

The String codePointAt() method returns a non-negative integer, the Unicode code point value at the given position. It is similar to charCodeAt(), but it can handle the full range of Unicode code points (up to U+10FFFF), whereas charCodeAt() only deals with individual 16-bit UTF-16 code units.

For characters within the Basic Multilingual Plane (BMP) – those characters with code points between U+0000 and U+FFFF – codePointAt() and charCodeAt() return the same value.

However, for characters outside the BMP (those with code points between U+10000 and U+10FFFF), codePointAt() returns the correct full code point, whereas charCodeAt() returns only one half of the UTF-16 surrogate pair that encodes the character.

For a character inside the BMP:

let char = 'K';
let codePoint = char.codePointAt(0);
console.log(codePoint);

Output

75
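A quick comparison confirms that the two methods agree for a BMP character such as 'K' (this snippet is just an extra check, not part of the original example):

let char = 'K';
console.log(char.charCodeAt(0) === char.codePointAt(0)); // true, both are 75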

For a character outside the BMP:

let char = '𐍈';
let codePoint = char.codePointAt(0);
console.log(codePoint);

Output

66376

In the second example, if you were to use charCodeAt(0) instead, you would get the code unit of the leading surrogate, which wouldn’t represent the actual character’s full code point on its own.
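To illustrate, here is what charCodeAt() reports for the same character; the values assume '𐍈' (U+10348), whose UTF-16 encoding is the surrogate pair 0xD800 / 0xDF48:

let char = '𐍈';
console.log(char.length);         // 2 (two UTF-16 code units)
console.log(char.charCodeAt(0));  // 55296 (0xD800, the leading surrogate)
console.log(char.charCodeAt(1));  // 57160 (0xDF48, the trailing surrogate)
console.log(char.codePointAt(0)); // 66376 (0x10348, the full code point)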

That’s it!

