Why do date manipulation in Java with milliseconds?
Posted by staticsan on Stack Overflow
Published on 2011-01-07T05:38:17Z
I was recently faced with the problem of calculating the number of days between two dates in Java (without using Joda-Time, I'm afraid). Searching the 'net shows that most answers to this question say to get the milliseconds of the two dates and convert the difference to days, which I found appalling. However, a scant few show a different approach: use a temporary variable to count how many times you have to add one day to the first date before you reach the second. This leaves the conversions to the code that does it best: the library.
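To make the second approach concrete, here's a minimal sketch of the counting idea using `java.util.Calendar` (the class names `DayCounter` and `daysBetween` are my own, not from any answer):

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class DayCounter {
    // Count days by repeatedly adding one day to a copy of the start
    // date until it reaches (or passes) the end date. The library's
    // add() handles month lengths, leap years and DST transitions.
    public static int daysBetween(Calendar start, Calendar end) {
        Calendar cursor = (Calendar) start.clone();
        int days = 0;
        while (cursor.before(end)) {
            cursor.add(Calendar.DAY_OF_MONTH, 1);
            days++;
        }
        return days;
    }

    public static void main(String[] args) {
        Calendar a = new GregorianCalendar(2011, Calendar.JANUARY, 1);
        Calendar b = new GregorianCalendar(2011, Calendar.MARCH, 1);
        System.out.println(DayCounter.daysBetween(a, b)); // 59
    }
}
```

It's O(n) in the number of days, but for typical date ranges that's negligible, and no conversion factors appear anywhere in the calling code.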
Why do so many people advocate the first?
In another project, I had previously encountered numerous subtle date-calculation problems involving time zones, daylight saving and, once, even leap years when using seconds to do date comparisons and calculations. All of these went away once the comparison and calculation code was rewritten to use the language libraries. (That was in PHP, though, where the libraries are structured quite differently to Java's.) So I'm understandably reluctant to accept this "common wisdom" in the world of Java about comparing dates.
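The daylight-saving pitfall is easy to demonstrate in Java too. This is a hypothetical illustration (the zone and date are my choice, not from the question): in a zone that springs forward on 2011-03-13, such as America/New_York, that calendar day is only 23 hours long, so dividing the millisecond difference by 86,400,000 truncates to zero days:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class DstPitfall {
    public static void main(String[] args) {
        TimeZone tz = TimeZone.getTimeZone("America/New_York");

        // Midnight on the day DST starts in this zone...
        Calendar start = new GregorianCalendar(tz);
        start.clear();
        start.set(2011, Calendar.MARCH, 13);

        // ...and midnight the following day.
        Calendar end = new GregorianCalendar(tz);
        end.clear();
        end.set(2011, Calendar.MARCH, 14);

        // Only 23 hours of real time elapse between these midnights.
        long diffMs = end.getTimeInMillis() - start.getTimeInMillis();
        long days = diffMs / (24L * 60 * 60 * 1000);

        System.out.println(diffMs); // 82800000 (23 hours)
        System.out.println(days);   // 0 -- the millisecond approach loses a day
    }
}
```

The counting approach sidesteps this entirely, because `Calendar.add` knows that adding one day across the transition only advances the clock by 23 hours.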
© Stack Overflow or respective owner