To convert a character to its ASCII code in JavaScript, you can use the string methods charCodeAt() or codePointAt().
Method 1: Using charCodeAt()
The string charCodeAt() method returns an integer between 0 and 65535, representing the UTF-16 code unit at the given index.
let char = 'K';
let asciiCode = char.charCodeAt(0);
console.log(asciiCode);
Output
75
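If you need the code of every character in a string rather than just one, a minimal sketch (the variable names text and codes are illustrative, and the input is assumed to be plain ASCII) could spread the string into characters and call charCodeAt() on each:

let text = 'Krunal';
// Spread the string into characters and get the UTF-16 code unit of each one
let codes = [...text].map(c => c.charCodeAt(0));
console.log(codes);
Output
[ 75, 114, 117, 110, 97, 108 ]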
Method 2: Using codePointAt()
The String.prototype.codePointAt() method returns a non-negative integer: the Unicode code point value of the character starting at the given index. It’s similar to charCodeAt(), but it can return full code points beyond U+FFFF, whereas charCodeAt() only deals with the 16-bit code units of the UTF-16 encoding.
For characters within the Basic Multilingual Plane (BMP) – those characters with code points between U+0000 and U+FFFF – codePointAt() and charCodeAt() return the same value.
However, for characters outside the BMP (those with code points between U+10000 and U+10FFFF), codePointAt() returns the full code point, whereas charCodeAt() returns only one code unit of the surrogate pair that encodes the character.
For a character inside the BMP:
let char = 'K';
let codePoint = char.codePointAt(0);
console.log(codePoint);
Output
75
For a character outside the BMP:
let char = '𐍈';
let codePoint = char.codePointAt(0);
console.log(codePoint);
Output
66376
In the second example, if you were to use charCodeAt(0) instead, you would get the code unit of the leading surrogate, which wouldn’t represent the actual character’s full code point on its own.
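To see that difference concretely, here is a small sketch (the inline values are shown for illustration) that logs both UTF-16 code units of '𐍈' alongside its full code point, and converts the code point back to the character with String.fromCodePoint():

let char = '𐍈';
// charCodeAt() exposes the two surrogate code units
console.log(char.charCodeAt(0)); // 55296 (leading surrogate)
console.log(char.charCodeAt(1)); // 57160 (trailing surrogate)
// codePointAt() returns the full code point
console.log(char.codePointAt(0)); // 66376
// String.fromCodePoint() converts a code point back to the character
console.log(String.fromCodePoint(66376)); // '𐍈'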
That’s it!