When to trash hashmap contents to avoid performance degradation?

Posted by Jack on Stack Overflow
Published on 2010-03-11T16:30:20Z Indexed on 2010/03/11 17:44 UTC


Hello, I'm working in Java with a large HashMap (millions of entries) that is built with a capacity of 10,000,000 and a load factor of .75, and it's used to cache some values.

Since cached values become useless over time (they are no longer accessed), but I can't remove individual stale entries as I go, I would like to empty the cache entirely when its performance starts to degrade. How can I decide when it's the right time to do that?

For example, with a capacity of 10 million and a load factor of .75, should I empty it when it reaches 7.5 million elements? I have tried various threshold values, but I would like an analytical one.
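One way to express that idea in code is to wipe just before the map reaches capacity × load factor, which is the point where a HashMap would normally resize. The sketch below is only an illustration of that policy, not a tested answer; the class and field names (`WipeOnThresholdCache`, `wipeThreshold`) are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// A minimal sketch: a cache wrapper that clears itself once its entry
// count reaches capacity * loadFactor, i.e. just before the backing
// HashMap would hit its own resize threshold.
public class WipeOnThresholdCache<K, V> {
    private final int wipeThreshold;
    private final Map<K, V> map;

    public WipeOnThresholdCache(int capacity, float loadFactor) {
        // Wipe at the same point the HashMap would otherwise resize.
        this.wipeThreshold = (int) (capacity * loadFactor);
        this.map = new HashMap<>(capacity, loadFactor);
    }

    public V get(K key) {
        return map.get(key);
    }

    public void put(K key, V value) {
        if (map.size() >= wipeThreshold) {
            // clear() keeps the allocated bucket array, so refilling
            // after the wipe does not pay for reallocation.
            map.clear();
        }
        map.put(key, value);
    }

    public int size() {
        return map.size();
    }
}
```

With capacity 10,000,000 and load factor .75 this clears at 7.5 million entries, matching the threshold asked about above.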

I've already verified that emptying it when it's quite full gives a performance boost (the first 2-3 algorithm iterations after the wipe just refill it; after that, it runs faster than it did before the wipe).

Thanks

© Stack Overflow or respective owner
