How to guard against dividing by zero when doing symbolic regression? ECJ
Posted by Charlie on Stack Overflow, 2010-03-27
I'm writing a genetic program to perform symbolic regression on a formula. I'm using ECJ. See tutorial 4 for an example of what this is and the base that I started off of.
The problem comes when adding division to the genetic program's function set: how do you guard against dividing by zero?
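One common answer (not part of the original question, but the standard technique in the GP literature) is Koza-style "protected division": evaluate both child subtrees, and if the divisor is zero or very close to it, return a fixed constant (usually 1) rather than dividing. Below is a minimal sketch of such a node, modeled on the Add/Sub nodes in ECJ's tutorial 4. The DoubleData class (a GPData subclass with a single public double field x, assumed to be in the same package) and the expectedChildren() arity override follow that tutorial and may differ slightly across ECJ versions, where older releases declare arity via checkConstraints() instead.

```java
import ec.EvolutionState;
import ec.Problem;
import ec.gp.ADFStack;
import ec.gp.GPData;
import ec.gp.GPIndividual;
import ec.gp.GPNode;

// Protected division node. DoubleData is assumed to be the GPData subclass
// from ECJ's tutorial 4 (one public double field, x), in the same package.
public class Div extends GPNode {
    public String toString() { return "/"; }

    // Declares an arity of 2; older ECJ releases do this via checkConstraints() instead.
    public int expectedChildren() { return 2; }

    public void eval(final EvolutionState state,
                     final int thread,
                     final GPData input,
                     final ADFStack stack,
                     final GPIndividual individual,
                     final Problem problem) {
        DoubleData rd = (DoubleData) input;

        // Evaluate the numerator subtree.
        children[0].eval(state, thread, input, stack, individual, problem);
        double numerator = rd.x;

        // Evaluate the denominator subtree.
        children[1].eval(state, thread, input, stack, individual, problem);
        double denominator = rd.x;

        // Koza-style protected division: if the denominator is (near) zero,
        // return 1 instead of producing Infinity or NaN.
        if (Math.abs(denominator) < 1e-6)
            rd.x = 1.0;
        else
            rd.x = numerator / denominator;
    }
}
```

Some systems instead return the numerator or a large constant when the divisor is near zero, but returning 1 is the classic choice and keeps evaluation free of Infinity/NaN values that would otherwise poison the fitness calculation.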