Java - How to avoid loss of precision when dividing and casting to int?
- by David Buckley
I have a situation where I need to find out how many times the fraction 1/divisor goes into a decimal amount, but in certain cases I'm losing precision. Here is the method:
public int test(double decimalAmount, int divisor) {
    return (int) (decimalAmount / (1d / divisor));
}
The problem with this is that if I pass in 1.2 as the decimal amount and 5 as the divisor, I get 5 instead of 6. How can I restructure this so that I reliably get, as an int, how many times 1/divisor (here 0.2) goes into the decimal amount?
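For reference, here is a minimal reproduction of what I'm seeing (the class name PrecisionDemo is just for illustration). Neither 1.2 nor 0.2 can be represented exactly as a binary double, so the intermediate quotient comes out just under 6, and the cast then truncates toward zero:

public class PrecisionDemo {
    public static void main(String[] args) {
        double decimalAmount = 1.2;
        int divisor = 5;

        double step = 1d / divisor;                // 0.2 is not exactly representable as a double
        double quotient = decimalAmount / step;

        System.out.println(quotient);              // prints 5.999999999999999
        System.out.println((int) quotient);        // cast truncates toward zero: prints 5
        System.out.println(Math.round(quotient));  // rounding instead of truncating prints 6
    }
}

Rounding with Math.round before converting does give 6 in this case, but I'm not sure it's correct in general (for example, when the amount genuinely falls between multiples and I want the floor), which is why I'm asking how this should be restructured.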