what type of bug causes a program to slowly use more processor power and all of a sudden go to 100%?
- by reinier
Hi,
I was hoping to get some good ideas as to what might be causing a really nasty bug.
The program transmits data over a socket and also receives messages back.
I could explain lots more, but I don't think this will help here.
I'm just searching for hypothetical problems which can cause the following behaviour:
- the program runs
- processor usage slowly accumulates (up to around 60%)
- all of a sudden (could be after 30 but also after 60 seconds) the processor usage shoots to 100% and the program halts completely
In my syslog it always ends on the same line, at a memory allocation (something similar to: myArray = new byte[16384]) in the same thread.
Now here is the weird part: if I set a breakpoint anywhere, execution immediately stops on that allocation line. So just the act of setting a breakpoint made the thread continue (it wasn't running before that, since I saw no log output anymore).
I was thinking 'deadlock', but that would not cause 100% processor load; if anything, the opposite. Also, setting a breakpoint would not make a deadlock end.
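To illustrate the kind of construct I mean: here is a purely hypothetical sketch (in Java, since the allocation line in my log looks Java-like; all names and numbers are made up) of a busy-wait on a bounded send queue. A spinning producer would burn more and more CPU as the consumer falls behind, then peg at 100% once the queue stays full, and pausing it in a debugger would give the consumer time to drain the queue, which could explain why setting a breakpoint "unsticks" it:

```java
import java.util.ArrayDeque;

public class SpinSketch {
    public static void main(String[] args) {
        // Hypothetical bounded send queue (capacity 4) shared between a
        // producer and a slower consumer. Instead of blocking when the
        // queue is full, the producer spins: the classic construct that
        // burns CPU in proportion to how far behind the consumer is.
        final int capacity = 4;
        ArrayDeque<byte[]> queue = new ArrayDeque<>();
        int spins = 0, produced = 0, consumed = 0;
        // Producer offers 3 packets per tick; consumer drains 1 per tick.
        for (int tick = 0; tick < 10; tick++) {
            for (int p = 0; p < 3; p++) {
                if (queue.size() < capacity) {
                    queue.add(new byte[16]); // cf. "new byte[16384]" in the log
                    produced++;
                } else {
                    spins++; // busy-wait iteration: wasted CPU, no progress
                }
            }
            if (!queue.isEmpty()) { queue.poll(); consumed++; }
        }
        System.out.println("produced=" + produced
                + " consumed=" + consumed + " wastedSpins=" + spins);
    }
}
```

In a real program the spin would be a `while (queue full) {}` loop rather than a counter, so the wasted iterations translate directly into processor load that grows as the backlog grows.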
Does anyone have a theoretical suggestion as to what kind of 'construct' might cause this effect?
(apart from 'bad programming') ;^)
thanks
EDIT:
I just noticed: by setting the send speed lower, the problem shows itself much later than expected. I would have expected it to appear after roughly the same number of packets sent, but no, the number of packets sent before the same problem occurs is much higher this way.
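This observation would fit a rate-dependent backlog rather than a per-packet leak. A back-of-the-envelope sketch (all numbers hypothetical): if some consumer drains at a fixed rate, the backlog grows at (sendRate - drainRate), so time to overflow is capacity / (sendRate - drainRate), and a slower sender gets disproportionately more packets out before hitting the same limit:

```java
public class BacklogMath {
    public static void main(String[] args) {
        // Hypothetical numbers: a buffer of 1000 packets drained at a
        // fixed 50 packets/s. Backlog grows at (send - drain), so
        // timeToOverflow = capacity / (send - drain), and the packets
        // sent before overflow = send * timeToOverflow.
        double capacity = 1000, drain = 50;
        for (double send : new double[]{100, 75, 60}) {
            double time = capacity / (send - drain);
            System.out.printf("send=%.0f packetsBeforeOverflow=%.0f%n",
                    send, send * time);
        }
    }
}
```

Note how sending at 60 instead of 100 packets/s triples the number of packets that get out before overflow, which matches the "much more packets sent at lower speed" observation.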