What's the best way to work out if a virtual server is overloaded?
- by zemaj
I have a series of virtual servers. I'm running a command to log in to each one and look at the load averages using uptime.
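Roughly what I'm doing at the moment is something like this (the hostnames are just placeholders, and it assumes key-based ssh access is already set up):

```python
import subprocess

# Placeholder hostnames -- substitute the real instances.
HOSTS = ["web1.example.com", "web2.example.com", "worker1.example.com"]

for host in HOSTS:
    # Run `uptime` remotely and print the load averages it reports.
    result = subprocess.run(
        ["ssh", host, "uptime"],
        capture_output=True, text=True, timeout=15,
    )
    print(f"{host}: {result.stdout.strip()}")
```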
What's the best way to work out whether those load values actually represent overloading? I'm running on Rackspace Cloud, so the servers have burst capability and can all be different sizes.
I'm a little stumped on how to come up with a consistent way of deciding when I need to spin up new servers. I can do things like estimate the jobs running on each one, but I'd like a system that tracks the real resource use available on each instance more closely, since that obviously varies quite a bit! The kind of per-instance check I'm picturing is sketched below.
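Something along these lines, run on each box itself, is what I have in mind: divide the load average by the core count and flag the instance if the per-core load crosses a threshold. The 0.7 figure is just a number I pulled out of the air, and whether a fixed threshold like that even makes sense on burstable instances is exactly what I'm unsure about.

```python
import os

OVERLOAD_PER_CORE = 0.7  # arbitrary guess -- this is the part I'm unsure about

def is_overloaded() -> bool:
    # os.getloadavg() returns the 1-, 5- and 15-minute load averages.
    _, load5, _ = os.getloadavg()
    cores = os.cpu_count() or 1
    # Treat the box as overloaded if the 5-minute load per core
    # exceeds the threshold.
    return (load5 / cores) > OVERLOAD_PER_CORE

if __name__ == "__main__":
    print("overloaded" if is_overloaded() else "ok")
```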
Help greatly appreciated!