Strange JavaScript Regular Expression Behavior
Posted by Kiwi on Stack Overflow
Published on 2010-06-14T04:12:03Z
Tags: javascript | regex
I'm getting different behavior from a regular expression in JavaScript depending on whether I declare it with literal syntax or the RegExp constructor. Here is an extremely simple test HTML file:
<html>
<head>
<script type="text/javascript">
var s = '3';
var regex1 = /\d/;
var regex2 = new RegExp('\d');
alert(s.search(regex1)); // 0 (matches)
alert(s.search(regex2)); // -1 (does not match)
</script>
</head>
<body></body>
</html>
The regular expression declared with literal syntax (/\d/) works correctly, while the other (new RegExp('\d')) does not. Why on earth is this happening?
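For what it's worth, the same difference shows up outside the HTML harness; this is a minimal console version of the same test (console.log in place of alert):

var s = '3';
var regex1 = /\d/;               // regex literal
var regex2 = new RegExp('\d');   // same pattern passed as a string
console.log(s.search(regex1));   // 0 (matches)
console.log(s.search(regex2));   // -1 (does not match)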
I'm using Google Chrome 5.0.375.70 on Windows Vista Home Premium, if that's at all helpful.