Why doesn't an octal literal as a string cast to a number?

Posted by Andy E on Stack Overflow
Published on 2010-03-30T18:54:52Z

In JavaScript, why does a string containing an octal number cast to a decimal number? I can cast a hex literal string using Number() or the unary + operator, so why not an octal one?

For instance:

1000 === +"1000" // -> true
0xFF === +"0xFF" // -> true
0100 === +"0100" // -> false; +"0100" gives 100, not 64

I know I can parse it with parseInt("0100", 8), but I'd like to know why casting doesn't work like it does with hex and decimal numbers.
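For reference, a quick sketch of the difference between string-to-number coercion and parseInt with an explicit radix (runnable in any JS engine, e.g. Node):

```javascript
// Number() (and unary +) recognise the hex prefix but not a leading zero:
console.log(Number("0xFF"));      // 255 - hex prefix is honoured
console.log(Number("0100"));      // 100 - leading zero ignored, parsed as decimal

// parseInt with an explicit radix parses the digits as octal:
console.log(parseInt("0100", 8)); // 64
console.log(parseInt("FF", 16));  // 255
```

So coercion via Number()/+ follows the string numeric grammar (decimal or 0x-prefixed hex only), while parseInt lets you pick the base yourself.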

Also, does anyone know why octal literals were dropped from ECMAScript 5th Edition in strict mode?
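The strict-mode change can be observed directly. This sketch uses the Function constructor (whose body is non-strict unless it opts in) so both behaviours can be seen side by side:

```javascript
// A legacy octal literal inside strict code is a SyntaxError:
try {
  new Function('"use strict"; return 0100;');
} catch (e) {
  console.log(e.name); // "SyntaxError"
}

// The same literal in non-strict code still parses as octal:
console.log(new Function('return 0100;')()); // 64
```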

