String length differs from JavaScript to Java code
Posted by François P. on Stack Overflow
Published on 2009-01-20T17:53:53Z
I've got a JSP page with a piece of JavaScript validation code which limits the input to 2000 characters on submit. Since I'm using a <textarea>, I can't simply use a maxlength attribute as I would on an <input type="text">.
I use document.getElementById("text").value.length to get the string length. I'm running Firefox 3.0 on Windows (but I've reproduced this behaviour with IE 6 as well). The form gets submitted to a J2EE servlet, and in my Java servlet the string length of the parameter is larger than 2000!
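For context, the check is roughly the following (a minimal sketch: the element id "text" and the 2000-character limit come from the question, the function name and the alert message are assumed):

    // Sketch of the on-submit check described above; the id "text" and the
    // 2000-character limit are from the question, everything else is assumed.
    function validateTextLength() {
        var length = document.getElementById("text").value.length;
        if (length > 2000) {
            alert("Please limit the text to 2000 characters.");
            return false; // cancel the submit
        }
        return true;
    }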
I've noticed that this can easily be reproduced by adding carriage returns in the <textarea>. I've used Firebug to assert the length of the <textarea> value and it really is 2000 characters long. On the Java side, though, the line breaks arrive in Windows style (\r\n instead of \n), so the string length differs!
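To make the mismatch concrete, here is a small sketch (assuming the behaviour described above: the DOM exposes line breaks as \n, while the submitted value uses \r\n):

    var value = document.getElementById("text").value;  // e.g. "foo\nbar"
    var clientLength = value.length;                     // 7 -- what the JavaScript check sees
    // On submit the browser sends the value with CRLF line breaks, so each
    // "\n" becomes "\r\n" and counts as two characters on the server.
    var submittedLength = value.replace(/\r?\n/g, "\r\n").length; // 8 -- what the servlet sees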
Am I missing something obvious here, or what? If not, how would you reliably (cross-platform / cross-browser) make sure that the <textarea> is limited?