Analysing and measuring the performance of a .NET application (survey results)

Posted by Laila on Simple Talk
Published on Mon, 08 Mar 2010 13:52:15 GMT

Back in December last year, I asked myself: could it be that .NET developers think they need three days and a PhD to profile the performance of their code?

What if developers are shunning profilers because they perceive them as too complex to use? If so, then what method do they use to measure and analyse the performance of their .NET applications? Do they even care about performance?

So, a few weeks ago, I decided to get a 1-minute survey up and running in the hopes that some good, hard data would clear the matter up once and for all. I posted the survey on Simple Talk and got help from a few people to promote it. The survey consisted of 3 simple questions:

[Images of the three survey questions]

Amazingly, 533 developers took the time to respond - which means I had enough data to get representative results! So before I go any further, I would like to thank all of you who contributed, because I now have some pretty good answers to the troubling questions I was asking myself. To thank you properly, I thought I would share some of the results with you.

First of all, application performance is indeed important to most of you. In fact, performance is an intrinsic part of the development cycle for a good 40% of you, which is much higher than I had anticipated, I have to admit. (I know, "Have a little faith Laila!")

[Graph: how much respondents care about application performance]

When I asked what tools you use to measure and analyse application performance, I found that nearly half of the respondents use logging statements, a third use performance counters, and 70% of respondents use a profiler of some sort (a third-party performance profiler, the CLR Profiler, or the Visual Studio profiler).

[Graph: methods used to measure and analyse application performance]

The importance attributed to logging statements did surprise me a little. I am still not sure why somebody would go to the trouble of manually instrumenting code in order to measure its performance, instead of just using a profiler. I personally find the process of annotating code, calculating times from log files, and relating it all back to the source terrifyingly laborious. Not to mention that you then need to remember to turn it all off later! Even when you already have logging in place throughout your code, you still have a fair amount of potentially error-prone calculation to do in order to make sense of the results; in addition, you'll only get method-level rather than line-level timings, and you won't get timings from any framework or library methods you don't have source for. To top it all, we all know that bottlenecks are rarely where you would expect them to be, so you could be wasting time looking for a performance problem in the wrong place.
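
For a concrete picture of what that manual approach involves, here is a minimal C# sketch of hand-rolled timing (the class and method names are hypothetical, purely for illustration): a Stopwatch wrapped around the code of interest and a trace line that someone later has to dig out of the logs, aggregate, and remember to remove.

using System.Diagnostics;

class OrderProcessor
{
    public void ProcessOrders()
    {
        // Hand-rolled instrumentation: one Stopwatch per region you remember to measure.
        Stopwatch stopwatch = Stopwatch.StartNew();

        DoTheActualWork();

        stopwatch.Stop();

        // The timing ends up in a log; parsing it, relating it back to the source
        // and switching it off for production are all manual steps.
        Trace.WriteLine("ProcessOrders took " + stopwatch.ElapsedMilliseconds + " ms");
    }

    private void DoTheActualWork()
    {
        // Placeholder for the code whose performance is actually being measured.
        System.Threading.Thread.Sleep(50);
    }
}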

On the other hand, profilers do all the work for you: they automatically collect CPU and wall-clock timings, and present the results from method-level timings all the way down to individual lines of code. Maybe I'm missing a trick. I would love to know about the types of scenarios where you actively prefer to use logging statements.

Finally, while a third of the respondents didn't have a strong opinion about code performance profilers, those who did have an opinion mainly thought that they were complex to use and time-consuming. Three respondents in particular summarised this perfectly:

"sometimes, they are rather complex to use, adding an additional time-sink to the process of trying to resolve the existing problem".

"they are simple to use, but the results are hard to understand"

"Complex to find the more advanced things, easy to find some low hanging fruit".

These results confirmed my suspicions: profilers are perceived as tools designed for more advanced users who can use them effectively and make sense of the results.

[Graph: respondents' perception of performance profilers]

I found yet more interesting information when I started comparing samples of "developers for whom performance is an important part of the dev cycle" with those "for whom performance is important only in times of crisis", and those "for whom performance is not important, as long as the app works". See the three graphs below.

Sample of developers for whom performance is an important part of the dev cycle:

[Graph: measurement methods used by this sample]

Sample of developers for whom performance is important only in times of crisis:

[Graph: measurement methods used by this sample]

Sample of developers for whom performance is not important, as long as the app works:

[Graph: measurement methods used by this sample]

As you can see, there is a strong correlation between profiler usage and the importance attributed to performance: the more important performance is to a development team, the more likely they are to use a profiler. In addition, developers for whom performance is an important part of the dev cycle tend to use a much wider range of methods for performance measurement and analysis. And, unsurprisingly, the less important performance is, the less varied the methods of measurement are.

So, all in all, to come back to my original questions:

.NET developers do care about performance. Those who care the most use a wider range of performance measurement methods than those who care less. But overall, logging statements, performance counters, and third-party performance profilers are the performance measurement methods of choice for most developers.

Finally, although most of you find code profilers complex to use, those of you who care the most about performance tend to use profilers more than those of you to whom performance is not so important.
