Okay, here goes.
I have a camera at (Xc, Yc, Zc). The Xc and Yc coordinates are latitude and longitude, and Zc is an altitude in metres. I also have a point at (Xp, Yp, Zp) and the camera's field of view (Th1, Th2), where Th1 is the horizontal FOV and Th2 is the vertical FOV.
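Since everything stays within RC-plane range of the camera, my plan is to flatten the geographic coordinates into a local metric frame first, with a simple equirectangular approximation (the 111320 metres-per-degree figure is approximate, which should be fine since accuracy isn't critical):

    #include <math.h>

    /* Flatten lat/lon/alt into local metres relative to the camera.
     * Equirectangular approximation: fine over the few hundred metres
     * an RC plane covers, not for long distances. */
    #define M_PER_DEG_LAT 111320.0
    #define DEG2RAD       0.017453292519943295

    typedef struct { double east, north, up; } Vec3;

    Vec3 to_local(double cam_lat, double cam_lon, double cam_alt,
                  double pt_lat,  double pt_lon,  double pt_alt)
    {
        Vec3 d;
        d.east  = (pt_lon - cam_lon) * M_PER_DEG_LAT * cos(cam_lat * DEG2RAD);
        d.north = (pt_lat - cam_lat) * M_PER_DEG_LAT;
        d.up    = pt_alt - cam_alt;
        return d;
    }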
Given this information, I'd like to:

1. test whether the point is visible (i.e. inside the camera's FOV), and
2. project the point as the camera would see it (rough sketches of how I'm picturing both steps are below).
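Both steps seem easier once the point is expressed in the camera's own frame (x = right, y = up, z = forward). One thing I've realised: my data above has no camera orientation, so I'm assuming I can also measure a heading and a pitch for the camera, since projection seems impossible without some orientation. The sin/cos here only need to run when the camera moves, not once per point:

    /* Rotate the local offset (Vec3 and DEG2RAD from the snippet above)
     * into the camera frame, given heading (yaw, degrees clockwise from
     * north) and pitch (degrees above the horizon). Both are assumed
     * inputs; they aren't part of my original data. */
    typedef struct { double x, y, z; } Cam3;

    Cam3 to_camera_frame(Vec3 d, double yaw_deg, double pitch_deg)
    {
        double sy = sin(yaw_deg * DEG2RAD),   cy = cos(yaw_deg * DEG2RAD);
        double sp = sin(pitch_deg * DEG2RAD), cp = cos(pitch_deg * DEG2RAD);

        double fwd   = d.east * sy + d.north * cy;  /* along the heading */
        double right = d.east * cy - d.north * sy;  /* 90 deg clockwise  */

        Cam3 c;
        c.x = right;
        c.y = d.up * cp - fwd * sp;  /* tilt by pitch about the right axis */
        c.z = fwd * cp + d.up * sp;
        return c;
    }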
I've figured out already that the camera's horizontal view width at a given distance d is 2 * d * tan(Th1/2), but I don't know how to turn that into a test of whether the point is visible.
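My best guess so far: dividing that width test through by the distance, the point (in the camera frame from above) is visible exactly when it is in front of the camera and within both half-angles. If tan(Th1/2) and tan(Th2/2) are computed once at startup, the test needs no trig at runtime:

    #include <math.h>  /* fabs */

    /* Visible iff in front of the camera and inside both half-angles.
     * tan_half_h = tan(Th1/2), tan_half_v = tan(Th2/2), precomputed. */
    int is_visible(Cam3 c, double tan_half_h, double tan_half_v)
    {
        return c.z > 0.0
            && fabs(c.x) <= c.z * tan_half_h
            && fabs(c.y) <= c.z * tan_half_v;
    }

Does that reasoning hold?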
Accuracy is not critical, and I'd prefer a simple solution over a complicated one if it works well enough. The computations will run on a small microcontroller that isn't very fast at things like trig functions.
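With that constraint in mind, the projection of a visible point should also reduce to one divide and a few multiplies: a perspective divide scaled to the display (SCREEN_W and SCREEN_H are placeholders for whatever display I end up using):

    #define SCREEN_W 320   /* placeholder display size */
    #define SCREEN_H 240

    /* Project a visible camera-frame point to pixel coordinates,
     * (0,0) at the top-left of the screen. */
    void project(Cam3 c, double tan_half_h, double tan_half_v,
                 int *px, int *py)
    {
        double nx = c.x / (c.z * tan_half_h);  /* -1..1 across the FOV */
        double ny = c.y / (c.z * tan_half_v);
        *px = (int)((nx + 1.0) * 0.5 * SCREEN_W);
        *py = (int)((1.0 - ny) * 0.5 * SCREEN_H);  /* screen y grows down */
    }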
P.S. This is not homework; I'm doing it for some game development. The game is integrated with the real world, hence the latitude/longitude/altitude. It involves flying real RC planes through virtual hoops (or chasing virtual targets), so I have to project the positions of those hoops onto a display.