CPU/JVM/JBoss 7 slows down over time
Asked by lukas on Server Fault, published 2012-12-03.
I'm experiencing a performance slowdown on JBoss 7.1.1 Final. I wrote a simple program that demonstrates this behavior: it generates an array of 100,000 random integers and runs a bubble sort on it.
import javax.enterprise.inject.Model;

@Model
public class PerformanceTest {

    public void proceed() {
        // Fill the array with 100,000 random ints in [0, 200000).
        long now = System.currentTimeMillis();
        int[] arr = new int[100000];
        for (int i = 0; i < arr.length; i++) {
            arr[i] = (int) (Math.random() * 200000);
        }
        long now2 = System.currentTimeMillis();
        System.out.println((now2 - now) + "ms took to generate array");

        // Time the sort itself.
        now = System.currentTimeMillis();
        bubbleSort(arr);
        now2 = System.currentTimeMillis();
        System.out.println((now2 - now) + "ms took to bubblesort array");
    }

    public void bubbleSort(int[] arr) {
        // Standard bubble sort; each pass shrinks the unsorted region by one.
        boolean swapped = true;
        int j = 0;
        int tmp;
        while (swapped) {
            swapped = false;
            j++;
            for (int i = 0; i < arr.length - j; i++) {
                if (arr[i] > arr[i + 1]) {
                    tmp = arr[i];
                    arr[i] = arr[i + 1];
                    arr[i + 1] = tmp;
                    swapped = true;
                }
            }
        }
    }
}
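For comparison, the same code could be run in a bare JVM outside the container. This harness is my own addition, not part of the deployed application; it assumes PerformanceTest is on the classpath (the CDI annotation is simply ignored outside a container):

// Standalone harness (my addition): runs the identical workload in a
// plain JVM to separate JVM/OS effects from container effects.
public class PerformanceTestMain {
    public static void main(String[] args) {
        new PerformanceTest().proceed();
    }
}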
Just after I start the server, it takes approximately 22 seconds to run this code. After JBoss 7.1.1 has been running for a few days, the same code takes about 330 seconds. In both cases, I launch the code when CPU utilization is very low (say, 1%). Any ideas why? I run the server with the following arguments:
-Xms1280m -Xmx2048m -XX:MaxPermSize=2048m -Djava.net.preferIPv4Stack=true -Dorg.jboss.resolver.warning=true -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000 -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true -Duser.timezone=UTC -Djboss.server.default.config=standalone-full.xml -Xrunjdwp:transport=dt_socket,address=8787,server=y,suspend=n
I'm running it on Linux 2.6.32-279.11.1.el6.x86_64 with java version "1.7.0_07".
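To check whether garbage collection or JIT activity differs between a fresh server and a long-running one, I could dump cumulative counters from inside the same request. This is a diagnostic sketch of my own, using only the standard java.lang.management beans available on Java 7:

import java.lang.management.CompilationMXBean;
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Sketch (my addition): print cumulative GC and JIT counters so runs
// on a fresh server can be compared with runs on a long-running one.
public class JvmStats {

    public static void dump() {
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + ": " + gc.getCollectionCount()
                    + " collections, " + gc.getCollectionTime() + "ms total");
        }
        CompilationMXBean jit = ManagementFactory.getCompilationMXBean();
        if (jit != null && jit.isCompilationTimeMonitoringSupported()) {
            System.out.println("JIT: " + jit.getTotalCompilationTime() + "ms total");
        }
    }
}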
It's within a Java EE application. I use CDI, so a button on a JSF page calls the proceed() method on the @RequestScoped component PerformanceTest. I deploy this as a separate WAR file, and even if I undeploy the other applications, the performance doesn't change.
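For clarity, @Model is the built-in CDI stereotype that bundles @Named and @RequestScoped, so the explicit equivalent declaration (shown only for reference, not what I deployed) would be:

import javax.enterprise.context.RequestScoped;
import javax.inject.Named;

// Explicit equivalent of @Model: request-scoped and reachable
// from the JSF page via EL as #{performanceTest.proceed()}.
@Named
@RequestScoped
public class PerformanceTest {
    // same proceed() and bubbleSort() as above
}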
The server is a virtual machine sharing CPUs with another machine, but that other machine isn't consuming anything.
Here's yet another observation: just after a fresh start, running the bubble sort pegs one processor core at 100%. The task never switches to another core, and utilization never drops below 95%. However, once the server has been running for some time and I'm experiencing the performance problems, the method still keeps a core at around 100%, but htop shows the task being switched between cores very often: it runs on core #1, after about 2 seconds on core #5, after another 2 seconds on core #8, and so on. Furthermore, per-core utilization is no longer held at 100% and sometimes drops to 80% or even lower.
After a fresh start, the task never switches to another core, even if I simulate additional load.
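To quantify whether the thread is actually being descheduled, one could compare wall-clock time with the thread's CPU time around the sort. This is a diagnostic sketch of my own (not part of the app), using only the standard java.lang.management API:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

// Sketch (my addition): if CPU time is much smaller than wall time,
// the thread is being descheduled rather than executing slowly.
public class CpuTimeProbe {

    public static void time(Runnable task) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        long wallStart = System.currentTimeMillis();
        long cpuStart = threads.isCurrentThreadCpuTimeSupported()
                ? threads.getCurrentThreadCpuTime() : -1; // nanoseconds
        task.run();
        long wallMs = System.currentTimeMillis() - wallStart;
        if (cpuStart >= 0) {
            long cpuMs = (threads.getCurrentThreadCpuTime() - cpuStart) / 1000000;
            System.out.println("wall=" + wallMs + "ms, cpu=" + cpuMs + "ms");
        } else {
            System.out.println("wall=" + wallMs + "ms (CPU time not supported)");
        }
    }
}

Wrapping the bubbleSort(arr) call in CpuTimeProbe.time(...) on both a fresh and a long-running server would show whether the extra 300 seconds are spent computing or waiting for the CPU.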