Tags: javascript, unicode, ecmascript-6, ecmascript-5

Unicode escape sequences for identifiers in JavaScript


The declaration below works:

var \u0061 = 2; // a = 2;

But the declaration below gives an error:

var \u00A5 = 2; // supposed to be ¥ = 2;

The code point 0xA5 is in the Basic Multilingual Plane (BMP), so why the error?


Solution

  • This has nothing to do with your escape sequence, which is fine. The problem is that ¥ is not a valid identifier, in contrast to a. An identifier must start with $, _, any Unicode code point with the Unicode property "ID_Start", or an escape sequence for one of these. ¥ (U+00A5), being a currency symbol (general category Sc), is not such a character, so the escape resolves to an illegal identifier start. See the sketch below.
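
A minimal sketch illustrating the rule (the variable names are arbitrary, and the \p{ID_Start} check assumes an ES2018+ engine with Unicode property escapes in regular expressions):

var $cash = 1;      // valid: identifiers may start with $
var _temp = 2;      // valid: identifiers may start with _
var \u0061 = 3;     // valid: the escape resolves to "a", which has ID_Start
var π = 4;          // valid: Greek letters have ID_Start
// var \u00A5 = 5;  // SyntaxError: the escape resolves to ¥, which lacks ID_Start

console.log(/\p{ID_Start}/u.test('a')); // true
console.log(/\p{ID_Start}/u.test('¥')); // false

The last two lines show one way to test whether any given character can begin an identifier, since the regex property \p{ID_Start} matches exactly the code points the identifier grammar accepts as a first character.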