Adding tolerance to a point-in-polygon test
Posted by David Gouveia on Game Development
Published on 2012-07-06T05:41:36Z
I've been using a method taken from Game Coding Complete to detect whether a point is inside a polygon. It works in almost every case but fails on a few edge cases, and I can't figure out why.
For example, given a polygon with vertices at (0,0), (0,100) and (100,100), the algorithm returns:
- True for any point strictly inside the polygon
- False for any of the vertices
- False for (0, 50), which lies on one of the edges of the polygon
- True (?) for (50,50), which is also on one of the edges of the polygon
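For context, a typical crossing-number (ray casting) test, which behaves much like what I'm describing, looks roughly like this. This is my own sketch rather than the book's code, with `Vector2` as a placeholder point type:

```cpp
#include <vector>

// Placeholder point type; any struct with two floats works here.
struct Vector2 { float x, y; };

// Typical crossing-number (ray casting) test: cast a horizontal ray from 'p'
// and count how many polygon edges it crosses; an odd count means inside.
// Points exactly on an edge or vertex sit right on the boundary of the
// comparisons below, which is why their results are inconsistent.
bool PointInPolygon(const Vector2& p, const std::vector<Vector2>& poly)
{
    bool inside = false;
    for (size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++)
    {
        const Vector2& a = poly[i];
        const Vector2& b = poly[j];

        // Edge (a,b) straddles the ray's height, and the crossing point
        // lies to the right of 'p'.
        if (((a.y > p.y) != (b.y > p.y)) &&
            (p.x < (b.x - a.x) * (p.y - a.y) / (b.y - a.y) + a.x))
        {
            inside = !inside;
        }
    }
    return inside;
}
```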
I'd actually like to relax the algorithm so that it returns true in all of these cases. In other words, it should return true for points that are strictly inside, for the vertices themselves, and for points on the edges of the polygon.
If possible, I'd also like to give it enough tolerance that it always tends towards "true" in the face of floating-point fluctuations. For example, I have another method that, given a line segment and a point, returns the closest location on the segment to that point.
Currently, when I take a point outside the polygon and snap it to the closest location on one of the polygon's edges, the method above classifies some of these results as inside and others as outside. I'd like to give it enough tolerance that it always returns true in this situation.
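That closest-point method is essentially the usual project-and-clamp computation. A rough sketch of it, together with the kind of epsilon check I have in mind, would be the following (the names and the epsilon are just placeholders, reusing the `Vector2` type from the sketch above):

```cpp
#include <algorithm>

// Closest point on segment [a,b] to 'p': project 'p' onto the segment's
// direction and clamp the projection parameter to [0,1].
Vector2 ClosestPointOnSegment(const Vector2& p, const Vector2& a, const Vector2& b)
{
    float abx = b.x - a.x, aby = b.y - a.y;
    float lengthSq = abx * abx + aby * aby;
    if (lengthSq == 0.0f)
        return a; // Degenerate segment: both endpoints coincide.

    float t = ((p.x - a.x) * abx + (p.y - a.y) * aby) / lengthSq;
    t = std::max(0.0f, std::min(1.0f, t));
    return Vector2{ a.x + t * abx, a.y + t * aby };
}

// True if 'p' lies within 'epsilon' of the segment [a,b].
bool NearSegment(const Vector2& p, const Vector2& a, const Vector2& b, float epsilon)
{
    Vector2 c = ClosestPointOnSegment(p, a, b);
    float dx = p.x - c.x, dy = p.y - c.y;
    return dx * dx + dy * dy <= epsilon * epsilon;
}
```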
My current workaround is a hack: I use an external library to inflate the polygon by a few pixels and perform the tests on the inflated polygon instead, but I'd really like to replace this with a proper solution.
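To make it concrete, what I'd like to end up with, instead of the inflation hack, is something that behaves roughly like this combination of the two sketches above (again, the epsilon is just an example value):

```cpp
// Tolerant containment: strictly inside, or within 'epsilon' of any edge
// (which also covers the vertices).
bool PointInPolygonTolerant(const Vector2& p, const std::vector<Vector2>& poly,
                            float epsilon = 0.001f)
{
    if (PointInPolygon(p, poly))
        return true;

    for (size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++)
    {
        if (NearSegment(p, poly[i], poly[j], epsilon))
            return true;
    }
    return false;
}
```

For the triangle above, this would return true for the vertices and for both (0, 50) and (50, 50).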