Happy New Year, everyone! Let me kick off the year by talking about virtualization density.
What is it?
The number of virtual servers a physical server can support, and the percentage increase in servers supported compared to the prior physical infrastructure.
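To make that concrete, here is a minimal sketch of how those two figures are typically computed. The function names and the example numbers are my own, purely for illustration.

```python
def vms_per_host(virtual_servers: int, physical_hosts: int) -> float:
    """Virtualization density: virtual servers per physical host."""
    return virtual_servers / physical_hosts

def density_increase_pct(servers_before: int, servers_after: int) -> float:
    """Percentage increase in servers supported vs. the prior physical estate."""
    return (servers_after - servers_before) / servers_before * 100

# Hypothetical example: 200 VMs on 25 hosts, replacing a farm of 120 physical servers.
print(vms_per_host(200, 25))           # 8.0 VMs per host
print(density_increase_pct(120, 200))  # ~66.7% more servers supported
```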
Why is it important?
This is important because density should be indicative of how well the server capacity is actually being used.
So what is wrong?
Virtualization density fails to convey the real usage of a server. Most hypervisor-based virtualization evangelists take pride in the fact that they are now running a virtual server farm of X machines compared to a physical server farm of Y (with Y less than X, obviously). The real question is whether your utilization of the servers has actually increased. In an internal study conducted by one of the top financial institutions, server utilization only went up by 15 percentage points, from 30% to 45%.
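To illustrate the gap, here is a rough sketch with hypothetical farm sizes; only the 30% and 45% utilization figures come from the study above. A 5:1 consolidation ratio sounds impressive, yet the underlying hosts are still more than half idle.

```python
# Hypothetical farm sizes; only the 30% and 45% utilization figures come from
# the study quoted above.
virtual_servers  = 100   # same workloads, now running as VMs
hypervisor_hosts = 20    # consolidated onto 20 physical hosts

consolidation_ratio = virtual_servers / hypervisor_hosts   # 5 VMs per host
utilization_before  = 0.30   # average utilization of the old physical farm
utilization_after   = 0.45   # average utilization after virtualization

print(f"Consolidation ratio: {consolidation_ratio:.0f}:1")
print(f"Utilization gain: {(utilization_after - utilization_before) * 100:.0f} percentage points")
# The density number looks impressive (5:1), yet the hosts remain more than half idle.
```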
So increasing virtualization density alone will not achieve the goal of making better use of the servers in your farm.
In the next entry, I will write about possible approaches to increasing virtualization density.