Linux Has Become Very Slow Dealing With Large Data

Posted by Kohjah Breese on Super User
Published on 2014-05-30T20:22:15Z

Last year I bought a computer for around $1,800, so it is relatively high-end. When I first got it, I was particularly pleased at how quickly it handled large MySQL queries, imports and exports. But somewhere along the way something has gone wrong, and I am not sure how to diagnose the problem.

Any job that involves processing large amounts of data, e.g. gzipping a file of c. 1 GB or more, or running UPDATEs on large MySQL tables, has become very slow. I just ran an intensive ALTER statement on a 240,000,000-row table on a remote server with a lower spec; it took about 10 minutes. Running the same query on a 167,000,000-row table on my computer went fine until it had written about 860 MB. Now it is writing only about 1 MB every 15 seconds.
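To rule out a failing or misbehaving disk, a first step could be a crude raw-write benchmark, something like the sketch below (the 256 MB size and the `ddtest.bin` filename are arbitrary choices of mine; `conv=fdatasync` makes dd flush to disk so the reported rate reflects real I/O rather than the page cache):

```shell
# Write 256 MB to the slow disk and note the MB/s figure dd prints.
# Run it from a directory on the drive you suspect.
dd if=/dev/zero of=ddtest.bin bs=1M count=256 conv=fdatasync
rm -f ddtest.bin
```

If this reports a rate far below what the drive managed when new (tens of MB/s for a healthy spinning disk), the problem is below MySQL, at the disk or filesystem layer.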

Does anyone have any advice as to debugging what the issue is?
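One way to narrow it down is to check whether the process is actually writing or just blocked. On Linux, per-process I/O counters live in `/proc/<pid>/io`; the sketch below assumes the process is named `mysqld` and simply samples its `write_bytes` a few times:

```shell
# Sample the cumulative bytes a process has written to disk.
# pgrep -o picks the oldest matching process; adjust the name as needed.
pid=$(pgrep -o mysqld)
if [ -n "$pid" ]; then
  for i in 1 2 3; do
    grep write_bytes "/proc/$pid/io"
    sleep 5
  done
fi
```

If `write_bytes` barely moves between samples while the query is "running", the process is stalled waiting on I/O rather than doing slow but steady work.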

I am using Linux Mint (based on Ubuntu 12.04). The home partition is encrypted, which really slows down gzip. I have noticed that swap is barely used, but I am not sure whether that is simply because there is more than enough RAM. The filesystem is ext4. The MySQL data is on a separate hard drive, and it was fine when I first installed it. Other than the issues above, there are no other problems with the machine.
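Since the ALTER was fast up to roughly 860 MB and then crawled, one guess worth checking is dirty-page writeback: writes land in RAM quickly until the kernel's dirty thresholds are hit, after which everything throttles to the disk's real speed. These settings and the swap state can be inspected directly (standard `/proc` paths, nothing system-specific assumed):

```shell
# Percent of RAM allowed to hold dirty pages before writers are throttled,
# and the background-writeback threshold.
cat /proc/sys/vm/dirty_ratio
cat /proc/sys/vm/dirty_background_ratio
# Confirm swap really is idle, and how much RAM is free vs. used as cache.
cat /proc/swaps
free -m
```

On a machine with a lot of RAM, a high `dirty_ratio` would explain exactly this pattern: a fast burst followed by a stall once the cache fills faster than the disk can drain it.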

I am going to install a fresh Ubuntu on the 4th hard drive to see if that is any different.

© Super User or respective owner
