MySQL performance - 100Mb ethernet vs 1Gb ethernet

Posted by Rob Penridge on Stack Overflow
Published on 2010-04-16T21:58:41Z Indexed on 2010/04/17 0:33 UTC
Hit count: 660

Hi All,

I've just started a new job and noticed that the analysts' computers are connected to the network at 100Mbps. The ODBC queries we run against the MySQL server can easily return 500MB+, and at times, when the servers are under high load, the DBAs kill low-priority jobs because they take too long to run.

My question is this: how much of the total time is spent executing the query on the server, and how much is spent returning the data to the client? Could query speeds be improved by upgrading the network connections to 1Gbps?
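For a sense of scale, a back-of-envelope calculation (assuming an otherwise idle, full-duplex link and ignoring TCP/ODBC protocol overhead, so real transfers will be somewhat slower than this ideal):

```python
def transfer_seconds(size_mb, link_mbps):
    """Ideal seconds to move size_mb megabytes over a link_mbps link."""
    return size_mb * 8 / link_mbps

# A 500 MB result set, wire-speed only:
print(transfer_seconds(500, 100))   # 40.0 -> ~40 s at 100 Mbps
print(transfer_seconds(500, 1000))  # 4.0  -> ~4 s at 1 Gbps
```

If the transfer alone plausibly accounts for tens of seconds of the job's wall time, the link is a real bottleneck; if the jobs run for many minutes, server-side execution dominates and a faster NIC won't help much.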

(Updated for the why): The database in question was built to accommodate reporting needs and contains massive amounts of data. We usually work with subsets of this data at a granular level in external applications such as SAS or Excel, hence the large result sets being transmitted. The queries are not poorly structured - they are very simple, and the appropriate joins, indexes, etc. are being used.

I've removed 'query' from the title of the post as I realised this question is more about general MySQL performance than query-related performance. I was rather hoping that someone with a Gigabit connection might be able to quantify some results for me here: run a query that returns a decent amount of data, then limit their connection speed to 100Mb and rerun the same query. Ideally this would be done in an environment where loads are reasonably stable, so as not to skew the results.
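The split between server execution and network transfer can also be estimated without a second machine. A minimal sketch, assuming an unbuffered DB-API cursor (e.g. from the mysql-connector-python package, which streams rows as they are fetched); the split is approximate because MySQL pipelines row production with transfer, so the two phases overlap:

```python
import time

def split_query_time(cursor, sql):
    """Run sql and return (time_to_first_row, time_to_fetch_rest).

    With an unbuffered cursor, time-to-first-row is dominated by server
    execution, and the remaining fetch time by network transfer.
    """
    t0 = time.perf_counter()
    cursor.execute(sql)
    cursor.fetchone()          # wait for the first row to arrive
    t1 = time.perf_counter()
    cursor.fetchall()          # stream the remaining rows over the wire
    t2 = time.perf_counter()
    return t1 - t0, t2 - t1
```

Pass it a live cursor and one of the large reporting queries: if the second number dwarfs the first, the result set transfer (and hence the link speed) is the bottleneck.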

If Ethernet speed can improve the situation, I'd like some quantifiable results to help argue my case for upgrading the network connections.

Thanks, Rob

