Performance variation
- by Ree
During my time working with multiple machines, I have noticed that the performance of the same machine doing the same tasks in the same order varies, and sometimes the difference is large enough to be noticeable. This applies to all the machines I've owned and/or maintained, old and modern. Some examples of tasks (many of which you may have noticed yourself) that sometimes complete in different time frames:
- POST
- OS installation
- Hardware tests and operations (usually executed within a customized OS such as one of the many DOS variants), e.g. HDD tests and "low-level" formats
- Software installation or other tasks (such as benchmarks) within a general-purpose OS (Windows, Linux, etc.)
I can imagine this is caused by the fact that a machine is built from many components that have to communicate as a whole, and since the mechanical and electronic parts aren't perfect, overhead occurs. In the last example, I assume the OS's complexity and the multiple concurrently running processes have some additional effect as well. However, I'm wondering whether this hardware imperfection and overhead is really high enough to be humanly noticeable. Maybe there are other factors that are just as influential, or even more so? So, in short: why?
To emphasize: the difference is noticeable on the same machine performing the same tasks, and in my experience this applies to ANY machine. I'm not comparing machine-to-machine performance.
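For anyone who wants to reproduce the effect, here is a minimal sketch of how the run-to-run spread could be measured (Python is an arbitrary choice here, and the workload and repeat count are placeholders, not anything from my actual tests). It times the identical CPU-bound task repeatedly on one machine and reports the variation:

```python
import statistics
import time

def workload():
    # Arbitrary fixed CPU-bound task; any deterministic computation works.
    total = 0
    for i in range(2_000_000):
        total += int(i ** 0.5)
    return total

# Run the identical task many times and record each wall-clock duration.
durations = []
for _ in range(20):
    start = time.perf_counter()
    workload()
    durations.append(time.perf_counter() - start)

print(f"min:   {min(durations):.4f} s")
print(f"max:   {max(durations):.4f} s")
print(f"mean:  {statistics.mean(durations):.4f} s")
print(f"stdev: {statistics.stdev(durations):.4f} s")
```

Even with nothing else visibly running, min and max typically differ from run to run, which is the kind of variation I'm asking about.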