Why isn't the Low Fragmentation Heap (LFH) enabled by default on Windows Server 2003?
Posted by James Wiseman on Server Fault, 2012-11-19.
I've been investigating an issue with a production Classic ASP website running on IIS6 which seems indicative of memory fragmentation.
One of the suggestions for how to ameliorate this came from a Stack Overflow question, How can I find why some classic asp pages randomly take a real long time to execute?, which suggested flipping a setting in the site's global.asa file to turn on the Low Fragmentation Heap (LFH).
The following code (with a registered version of the accompanying DLL) did the trick.
' Create the registered helper COM object from the accompanying DLL
Set LFHObj = CreateObject("TURNONLFH.ObjTurnOnLFH")
' Ask it to enable the Low Fragmentation Heap for the worker process
LFHObj.TurnOnLFH()
' Record the result in application state so it can be checked later
Application("TurnOnLFHResult") = CStr(LFHObj.TurnOnLFHResult)
(Really the code isn't that important to the question).
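For context, the documented way to request the LFH for a heap on Windows is the Win32 HeapSetInformation call with the HeapCompatibilityInformation class, so presumably the DLL does something along these lines internally. This is only a sketch under that assumption (and it targets just the default process heap; a real component would likely loop over GetProcessHeaps()):

/* Sketch: enabling the Low Fragmentation Heap via the documented Win32 API.
   Assumption: this is roughly what the TURNONLFH helper DLL wraps. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    ULONG lfh = 2;  /* heap compatibility value 2 = Low Fragmentation Heap */

    /* Apply to the default process heap only; production code would
       normally enumerate every heap returned by GetProcessHeaps(). */
    if (HeapSetInformation(GetProcessHeap(),
                           HeapCompatibilityInformation,
                           &lfh,
                           sizeof(lfh)))
    {
        printf("LFH enabled on the default process heap.\n");
    }
    else
    {
        /* On Server 2003 this can fail if the process was started under a
           debugger or with debug heap flags, which disable the LFH. */
        printf("HeapSetInformation failed: %lu\n", GetLastError());
    }
    return 0;
}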
The author of a linked post reported a seemingly magical resolution to this issue, and, reading around a little more, I discovered that this setting is enabled by default on Windows Server 2008.
So, naturally, this left me a little concerned:
- Why is this setting not enabled by default on 2003, or
- If it works in 2008, why have Microsoft not issued a patch to enable it by default on 2003?
I suspect the answer to the above is the same for both (if there is one).
Obviously, we're testing it in a non-production environment, and running an array of metrics and comparisons to determine whether it actually helps us. But aside from this, I'm really just trying to understand whether there's any technical reason why we should do this, or whether there are any gotchas we need to be aware of.
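As part of those checks it should be possible to confirm that the switch actually took effect, since the same heap compatibility value can be read back with HeapQueryInformation (a value of 2 means the LFH is active). A minimal sketch, assuming it runs inside the same process whose heaps were flipped:

/* Sketch: verifying whether the LFH is active on the default process heap. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    ULONG info = 0;
    SIZE_T returned = 0;

    if (HeapQueryInformation(GetProcessHeap(),
                             HeapCompatibilityInformation,
                             &info,
                             sizeof(info),
                             &returned))
    {
        /* 0 = standard heap, 1 = look-aside lists, 2 = low fragmentation heap */
        printf("Default heap compatibility: %lu\n", info);
    }
    else
    {
        printf("HeapQueryInformation failed: %lu\n", GetLastError());
    }
    return 0;
}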