Java - How to avoid loss of precision during divide and cast to int?

Posted by David Buckley on Stack Overflow
Published on 2010-04-16T18:22:38Z

I have a situation where I need to find out how many times an int goes into a decimal, but in certain cases, I'm losing precision. Here is the method:

public int test(double decimalAmount, int divisor) {
  return (int) (decimalAmount / (1d / divisor));
}
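
Printing the intermediate values shows where the precision goes (the PrecisionDemo class below is just a throwaway harness for illustration, not part of my actual code):

public class PrecisionDemo {
  public static void main(String[] args) {
    double decimalAmount = 1.2;
    int divisor = 5;

    double step = 1d / divisor;              // 0.2 cannot be represented exactly in binary
    double quotient = decimalAmount / step;  // comes out as 5.999999999999999, not 6.0
    System.out.println(quotient);            // 5.999999999999999
    System.out.println((int) quotient);      // the cast truncates toward zero: 5
  }
}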

The problem with this is that if I pass in 1.2 as the decimal amount and 5 as the divisor, I get 5 instead of 6. How can I restructure this so I know how many times 5 goes into the decimal amount as an int?
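
One restructuring I'm considering (just a sketch, I'm not sure it's the right fix) is to multiply by the divisor instead of dividing by its reciprocal, and to round rather than truncate:

public int test(double decimalAmount, int divisor) {
  // Multiplying avoids computing the inexact reciprocal 1/divisor,
  // and Math.round absorbs the remaining floating-point error
  // instead of truncating it away.
  return (int) Math.round(decimalAmount * divisor);
}

With 1.2 and 5 this returns 6, but I don't know whether rounding is always safe here, or whether something like java.math.BigDecimal would be more appropriate.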
