Calculating a GPS coordinate given a point, bearing and distance
Posted by user530509 on Stack Overflow, 2010-12-25
Hello, I have a problem that has been holding back one of my projects for some time now.
I'm basically trying to trace out a polygon using x,y points generated by a script I've written. lat1, lon1 are the GPS coordinates of the polygon's center, and I'm looking for the surrounding polygon.
Here is part of my code in Python:

from math import asin, atan2, sin, cos

def getcords(lat1, lon1, dr, bearing):
    # dr is the angular distance (distance divided by the Earth's radius)
    lat2 = asin(sin(lat1)*cos(dr) + cos(lat1)*sin(dr)*cos(bearing))
    lon2 = lon1 + atan2(sin(bearing)*sin(dr)*cos(lat1), cos(dr) - sin(lat1)*sin(lat2))
    return [lat2, lon2]
My input goes like this:
- lat1, lon1 are given in decimal degrees.
- dr is the angular distance, computed by dividing the distance in miles by the Earth's radius (3958.82 miles).
- bearing is between 0 and 360 degrees.
However, for the input getcords(42.189275, -76.85823, 0.5/3958.82, 30) I get [-1.3485899508698462, -76.8576637627568], whereas [42.2516666666667, -76.8097222222222] is the right answer.
As for the angular distance, I calculate it simply by dividing the distance in miles by the Earth's radius (3958.82 miles).
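In case it helps, here is a sketch of the same function with the inputs explicitly converted to radians and the result converted back to decimal degrees, since Python's math functions (sin, cos, asin, atan2) work in radians; getcords_deg is just an illustrative name, not the code I'm actually running:

from math import asin, atan2, sin, cos, radians, degrees

def getcords_deg(lat1, lon1, dr, bearing):
    # Convert the degree inputs to radians before applying the formula
    lat1, lon1, bearing = radians(lat1), radians(lon1), radians(bearing)
    lat2 = asin(sin(lat1)*cos(dr) + cos(lat1)*sin(dr)*cos(bearing))
    lon2 = lon1 + atan2(sin(bearing)*sin(dr)*cos(lat1), cos(dr) - sin(lat1)*sin(lat2))
    # Convert the result back to decimal degrees
    return [degrees(lat2), degrees(lon2)]

# Same example input: 0.5 miles at a bearing of 30 degrees
print(getcords_deg(42.189275, -76.85823, 0.5/3958.82, 30))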
anybody?