How is data shared between fastCGI processes?
Posted by Josh the Goods on Stack Overflow
Published on 2010-05-12T02:58:42Z
I've written a simple Perl script that I'm running via fastCGI on Apache. The application loads a set of XML data files which are used to look up values based upon the query parameters of an incoming request. As I understand it, if I want to increase the number of concurrent requests my application can handle, I need to allow fastCGI to spawn multiple processes. Will each of these processes have to hold a duplicate copy of the XML data in memory? Is there a way to set things up so that I can keep one copy of the XML data in memory while increasing the capacity to handle concurrent requests?
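
For reference, here is a minimal sketch of the kind of script described above, assuming CGI::Fast and XML::Simple and a hypothetical lookup-data.xml (the actual modules and file names in the real script may differ). The XML is parsed once, before the request loop, so every request handled by a given process reuses the same in-memory structure, but each fastCGI process that Apache spawns builds its own copy:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI::Fast;     # FastCGI-aware CGI interface (assumed)
    use XML::Simple;   # assumed XML parser; any module works here

    # Parsed once per process, outside the accept loop. Each FastCGI
    # process holds its own copy of this structure in memory.
    my $data = XMLin('lookup-data.xml');   # hypothetical data file

    # Accept requests until the FastCGI manager shuts the process down.
    while (my $q = CGI::Fast->new) {
        my $key   = $q->param('key') // '';            # hypothetical query parameter
        my $value = $data->{$key}    // 'not found';   # lookup against the loaded XML

        print $q->header('text/plain');
        print "$value\n";
    }
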