I wrote a long-lived TCP connection socket server in C#. A memory spike occurs in my server, so I used dotNet Memory Profiler (a tool) to find where the memory leaks. The profiler indicates that the private heap is huge, and the breakdown looks something like the following (the numbers are not real; what I want to show is that the "Holes" in Generation #0 and Generation #2 are very, very large, while the data size is normal):
Managed heaps - 1,500,000KB
    Normal heap - 1,400,000KB
        Generation #0 - 600,000KB
            Data - 100,000KB
            "Holes" - 500,000KB
        Generation #1 - xxKB
            Data - 0KB
            "Holes" - xKB
        Generation #2 - xxxxxxxxxxxxxKB
            Data - 100,000KB
            "Holes" - 700,000KB
    Large heap - 131,072KB
        Large heap - 83KB
        Overhead/unused - 130,989KB
    Overhead - 0KB
However, what is a GC hole? I read an article about it:
http://kaushalp.blogspot.com/2007/04/what-is-gc-hole-and-how-to-create-gc.html
The author said:
The code snippet below is the simplest way to introduce a GC hole into the system.
//OBJECTREF is a typedef for Object*.
{
PointerTable *pTBL = o_pObjectClass->GetPointerTable();
OBJECTREF aObj = AllocateObjectMemory(pTBL);
OBJECTREF bObj = AllocateObjectMemory(pTBL);
//WRONG!!! “aObj” may point to garbage if the second
//“AllocateObjectMemory” triggered a GC.
DoSomething (aObj, bObj);
}
All it does is allocate two managed objects, and then does something with them both.
This code compiles fine, and if you run simple pre-checkin tests, it will probably “work.” But this code will crash eventually.
Why? If the second call to “AllocateObjectMemory” triggers a GC, that GC discards the object instance you just assigned to “aObj”. This code, like all C++ code inside the CLR, is compiled by a non-managed compiler and the GC cannot know that “aObj” holds a root reference to an object you want kept live.
========================================================================
I can't understand his explanation. Does the sample mean that aObj becomes a wild (dangling) pointer after the GC runs? Is it analogous to this C code:
{
    object *aObj = malloc(sizeof(object));   /* AllocateObjectMemory */
    free(aObj);                              /* the GC reclaims the memory aObj points to */
    function(aObj);                          /* DoSomething uses a dangling pointer */
}
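If that reading is correct, I guess the fix inside the CLR is to explicitly report such references to the GC before anything else can trigger a collection. Below is my guess at the corrected version; as far as I can tell, GCPROTECT_BEGIN/GCPROTECT_END are macros in the CLR sources for reporting a reference to the GC, and all the other names (PointerTable, AllocateObjectMemory, DoSomething, and so on) are just the made-up names from the article's snippet:

{
    PointerTable *pTBL = o_pObjectClass->GetPointerTable();

    OBJECTREF aObj = AllocateObjectMemory(pTBL);
    GCPROTECT_BEGIN(aObj);      // report aObj to the GC so it stays valid (and gets updated) across a collection

    OBJECTREF bObj = AllocateObjectMemory(pTBL);
    GCPROTECT_BEGIN(bObj);      // the second allocation can now safely trigger a GC

    DoSomething(aObj, bObj);    // both references are still live here

    GCPROTECT_END();            // un-report in reverse order
    GCPROTECT_END();
}

Is that roughly right, or am I misunderstanding the mechanism?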
I hope somebody can explain it.